WorldWideScience

Sample records for universal quantitative models

  1. On the universality of the attribution-affect model of helping.

    Science.gov (United States)

    Reisenzein, Rainer

    2015-08-01

    Although Pilati et al.'s (2014) findings question the strong quantitative universality of the attribution-affect model of helping, they are consistent with a weak form of quantitative universality, as well as with the qualitative universality of the theory. However, universality is put into question by previous studies revealing significant and sizeable between-study differences in the strength of the causal paths postulated by the theory. These differences may in part reflect differences in the type of helping situations studied. © 2015 International Union of Psychological Science.

  2. Cosmic strings in an open universe: Quantitative evolution and observational consequences

    International Nuclear Information System (INIS)

    Avelino, P.P.; Caldwell, R.R.; Martins, C.J.

    1997-01-01

The cosmic string scenario in an open universe is developed, including the equations of motion, a model of network evolution, the large angular scale cosmic microwave background (CMB) anisotropy, and the power spectrum of density fluctuations produced by cosmic strings with dark matter. We first derive the equations of motion for a cosmic string in an open Friedmann-Robertson-Walker (FRW) space-time. With these equations and the cosmic string stress-energy conservation law, we construct a quantitative model of the evolution of the gross features of a cosmic string network in a dust-dominated, Ω < 1 FRW space-time. [...] In a low density universe the string+CDM scenario is a better model for structure formation. We find that for cosmological parameters Γ = Ωh ∼ 0.1-0.2 in an open universe the string+CDM power spectrum fits the shape of the linear power spectrum inferred from various galaxy surveys. For Ω ∼ 0.2-0.4, the model requires a bias b ≳ 2 in the variance of the mass fluctuation on scales of 8h⁻¹ Mpc. In the presence of a cosmological constant, the spatially flat string+CDM power spectrum requires a slightly lower bias than for an open universe of the same matter density. © 1997 The American Physical Society

  3. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), the University of Defence in Brno (Czech Republic), and “Pablo de Olavide” University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections. The first section deals with recent trends in social decisions, aiming to identify the driving forces behind them. The second section focuses on the social and public sphere, covering recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  4. Quantitative Literacy at Michigan State University, 2: Connection to Financial Literacy

    Directory of Open Access Journals (Sweden)

    Dennis Gilliland

    2011-07-01

Full Text Available The lack of capability in making financial decisions among the adult United States population has recently been documented. A concerted effort to increase awareness of this crisis, to improve education in quantitative and financial literacy, and to simplify financial decision-making processes is critical to the solution. This paper describes a study undertaken to explore the relationship between quantitative literacy and financial literacy among entering college freshmen. In summer 2010, incoming freshmen to Michigan State University were assessed: well-tested financial literacy items and validated quantitative literacy assessment instruments were administered to 531 subjects. Logistic regression models were used to assess the relationship between level of financial literacy and independent variables including quantitative literacy score, ACT mathematics score, and demographic variables such as gender. The study establishes a strong positive association between quantitative literacy and financial literacy over and above the effects of the other independent variables: adding one percent to the performance on a quantitative literacy assessment changes the odds of being at the highest level of financial literacy by a factor estimated at 1.05. Gender also has a large, statistically significant effect, with being female changing the odds by a factor estimated at 0.49.
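
The reported effect sizes can be read directly as odds ratios from the logistic model. A minimal sketch of how such ratios compound (the numbers are taken from the abstract; the study's actual fitted model and data are not reproduced here):

```python
import math

# Odds ratio reported per +1 point of quantitative-literacy performance
# (from the abstract; the fitted coefficients themselves are not published here).
or_ql = 1.05
beta_ql = math.log(or_ql)          # corresponding coefficient on the logit scale

# Compounding: a +10 point gain multiplies the odds of top-level
# financial literacy by or_ql ** 10.
or_ql_10 = math.exp(10 * beta_ql)

# Reported gender effect (female vs. male), also as an odds ratio:
or_female = 0.49

print(round(or_ql_10, 2), or_female)
```

Multiplicativity on the odds scale is why a seemingly small per-point factor of 1.05 becomes a sizeable effect over a realistic score range.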

  5. Variable selection in near infrared spectroscopy for quantitative models of homologous analogs of cephalosporins

    Directory of Open Access Journals (Sweden)

    Yan-Chun Feng

    2014-07-01

Full Text Available Two universal spectral ranges (4550–4100 cm⁻¹ and 6190–5510 cm⁻¹) for the construction of quantitative models of homologous analogs of cephalosporins were proposed by evaluating the performance of five spectral ranges and their combinations, using three data sets of cephalosporins for injection, i.e., cefuroxime sodium, ceftriaxone sodium and cefoperazone sodium. Subsequently, the proposed ranges were validated using eight calibration sets of other homologous analogs of cephalosporins for injection, namely cefmenoxime hydrochloride, ceftezole sodium, cefmetazole, cefoxitin sodium, cefotaxime sodium, cefradine, cephazolin sodium and ceftizoxime sodium. All the quantitative models constructed for the eight kinds of cephalosporins using these universal ranges fulfilled the requirements for quick quantification. The competitive adaptive reweighted sampling (CARS) algorithm and infrared (IR)–near infrared (NIR) two-dimensional (2D) correlation spectral analysis were then used to establish the scientific basis of these two spectral ranges as the universal regions for constructing quantitative models of cephalosporins. The CARS algorithm demonstrated that the ranges of 4550–4100 cm⁻¹ and 6190–5510 cm⁻¹ include key wavenumbers that can be attributed to content changes of the cephalosporins. The IR–NIR 2D spectral analysis showed that certain wavenumbers in these two regions correlate strongly with the structures of those cephalosporins that degrade easily.
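
The idea of calibrating only on a fixed universal wavenumber window can be sketched as follows, with synthetic spectra standing in for the cephalosporin data and plain least squares standing in for the PLS models typically used in NIR work (the grid, sample counts, and coefficients below are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.arange(4000, 7000, 10)          # assumed cm^-1 grid
X = rng.normal(size=(120, wavenumbers.size))     # 120 synthetic spectra

# Simulate content values that depend only on the 4550-4100 cm^-1 window.
informative = (wavenumbers >= 4100) & (wavenumbers <= 4550)
true_w = np.where(informative, 0.3, 0.0)
y = X @ true_w + rng.normal(scale=0.05, size=120)

# Calibrate on the universal window only, discarding all other variables.
coef, *_ = np.linalg.lstsq(X[:, informative], y, rcond=None)
rmse = float(np.sqrt(np.mean((X[:, informative] @ coef - y) ** 2)))
print(rmse < 0.1)
```

When the informative wavenumbers really do lie inside the chosen window, restricting the calibration loses nothing and removes many noise variables, which is the rationale CARS is used to verify.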

  6. Introducing a model of organizational envy management among university faculty members: A mixed research approach

    Directory of Open Access Journals (Sweden)

    Maris Zarin Daneshvar

    2016-01-01

Full Text Available The present study aimed to offer a model of organizational envy management among faculty members of the Islamic Azad Universities of East Azerbaijan Province. A mixed-methods design was used, collecting qualitative and then quantitative data, with the emphasis on quantitative analysis. The population was the entire faculty holding an associate or higher degree in the 2014-2015 academic year. In the qualitative stage, 20 experts were selected to design the initial model and questionnaire; to fit the model, 316 faculty members were selected. The qualitative phase identified the variables influencing envy management among faculty members as healthy organizational climate, spiritual leadership, effective communication, job satisfaction, and professional development of professors. The quantitative findings showed significant relationships among these variables: in the analysis of the indirect effect of organizational climate on envy management, spiritual leadership, acting through effective communication, had a smaller effect on envy management than professional development and job satisfaction. It is concluded that university managers should create the conditions for envy management at their universities, enabling professors to play more effective roles, free of envy, in the university's scientific climate and thereby improve educational, research, and service efficiency.

  7. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  8. Developing a Model for Assessing Public Culture Indicators at Universities

    Directory of Open Access Journals (Sweden)

    Meisam Latifi

    2015-06-01

Full Text Available The present study aims to develop a model for assessing public culture at universities and to evaluate its indicators at public universities in Mashhad. The research follows an exploratory mixed approach; the research strategies in the qualitative and quantitative sections are thematic network analysis and the descriptive-survey method, respectively. In the qualitative section, document analysis and semi-structured interviews with cultural experts were used as research tools, with targeted sampling. In the quantitative section, a questionnaire developed from the findings of the qualitative section was used; its population consists of all students admitted to public universities in Mashhad between 2009 and 2012. The sample size was calculated according to Cochran’s formula, and stratified sampling was used to select the sample. The qualitative section led to the identification of 44 basic themes, referred to as micro indicators. These themes were clustered into similar groups, from which 10 organizing themes were identified and treated as macro indicators. Next, the importance factor of each indicator was determined using the AHP method. The quantitative assessment of indicators at public universities in Mashhad shows that the overall cultural index declines over the years a student attends university. Additionally, the highest correlation exists between national identity and revolutionary identity; the only negative correlations are observed between family and two indicators, social capital and cultural consumption. The results of the present study can be used to assess the state of public culture among university students and as a basis for cultural planning.
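
The AHP weighting step can be sketched with a small, hypothetical pairwise comparison matrix (the study's actual judgments are not reproduced): the importance factors are the normalised principal eigenvector of the matrix.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three indicators,
# where A[i, j] is the judged importance of indicator i relative to j.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = np.abs(principal) / np.abs(principal).sum()  # importance factors
print(weights.round(3))
```

For a nearly consistent matrix like this one, the weights are close to the normalised geometric means of the rows, with the first indicator dominating.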

  9. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  10. Rotating universe models

    International Nuclear Information System (INIS)

    Tozini, A.V.

    1984-01-01

A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models contain Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to how Friedmann's model generalizes Einstein's. (L.C.) [pt

  11. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  12. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  13. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    Science.gov (United States)

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  14. Sustainable Education: Analyzing the Determinants of University Student Dropout by Nonlinear Panel Data Models

    Directory of Open Access Journals (Sweden)

    Donggeun Kim

    2018-03-01

    Full Text Available University dropout is a serious problem. It affects not only the individual who drops out but also the university and society. However, most previous studies have focused only on the subjective/individual level. University dropout is a very important issue in South Korea, but it has not received much research attention so far. This study examined the possible causes of university dropout in South Korea at the aggregate level, focusing on four fundamental categories: students, resources, faculty, and university characteristics. Three-year balanced panel data from 2013 to 2015 were constructed and estimated by using nonlinear panel data models. The findings show that cost and burden for students, financial resources, qualitative and quantitative features of faculty, and type/size of the university have significant effects on university dropout.

  15. Polymorphic ethyl alcohol as a model system for the quantitative study of glassy behaviour

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H E; Schober, H; Gonzalez, M A [Institut Max von Laue - Paul Langevin (ILL), 38 - Grenoble (France); Bermejo, F J; Fayos, R; Dawidowski, J [Consejo Superior de Investigaciones Cientificas, Madrid (Spain); Ramos, M A; Vieira, S [Universidad Autonoma de Madrid (Spain)

    1997-04-01

The nearly universal transport and dynamical properties of amorphous materials, or glasses, are investigated. Reasonably successful phenomenological models have been developed to account for these properties as well as for the behaviour near the glass transition, but quantitative microscopic models have had limited success. One hindrance to these investigations has been the lack of a material which exhibits glass-like properties in more than one phase at a given temperature. This report presents results of neutron-scattering experiments on one such material, ordinary ethyl alcohol, which promises to be a model system for future investigations of glassy behaviour. (author). 8 refs.

  16. University staff adoption of iPads: An empirical study using an extended TAM model

    Directory of Open Access Journals (Sweden)

    Michael Steven Lane

    2014-11-01

Full Text Available This research examined key factors influencing the adoption of iPads by university staff. An online survey collected quantitative data to test hypothesised relationships in an extended TAM model. The findings show that university staff consider iPads easy to use and useful, with a high level of compatibility with their work. Social status had no influence on their attitude towards using an iPad. However, older university staff, and staff with no previous experience of a similar technology such as an iPhone or smartphone, found iPads less easy to use. Furthermore, a lack of formal end-user ICT support impacted negatively on the use of iPads.

  17. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  18. The Rise and Fall of Modern Greek in Australia's Universities: What Can a Quantitative Analysis Tell Us?

    Science.gov (United States)

    Hajek, John; Nicholas, Nick

    2004-01-01

    In this article, we look at the state of Modern Greek in Australian universities, focusing on quantitative analysis of its rise and fall in the relatively short period of 35 years since it was first taught as a university subject in Australia. We consider the possible reasons behind this trajectory, in particular correlations with changing…

  19. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram (EEG) is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome, as were pathological EEG patterns such as generalized periodic discharges. Quantitative model-based EEG analysis (state space analysis) thus provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
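
A simplified version of the "state space velocity" idea: treat each EEG epoch's normalised power spectrum as a point in state space and measure the mean step size between consecutive epochs. This is a sketch with synthetic data; the paper's actual state space model is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)
n_epochs, epoch_len = 30, 250                 # assumed epoching of one channel
eeg = rng.normal(size=(n_epochs, epoch_len))  # synthetic EEG epochs

spectra = np.abs(np.fft.rfft(eeg, axis=1)) ** 2   # power spectrum per epoch
spectra /= spectra.sum(axis=1, keepdims=True)     # normalise each epoch
steps = np.linalg.norm(np.diff(spectra, axis=0), axis=1)
velocity = float(steps.mean())                    # mean spectral step size
print(velocity > 0.0)
```

Under this reading, a lower velocity corresponds to a more monotonous EEG background over time, which the abstract associates with poor outcome.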

  20. Chaotic universe model.

    Science.gov (United States)

    Aydiner, Ekrem

    2018-01-15

In this study, we consider nonlinear interactions between components such as dark energy, dark matter, matter and radiation in the framework of the Friedmann-Robertson-Walker space-time and propose a simple interaction model based on the time evolution of the densities of these components. Using this model we show that these interactions can be described by Lotka-Volterra type equations. We numerically solve these coupled equations and show that the interaction dynamics between dark energy-dark matter-matter or dark energy-dark matter-matter-radiation has a strange attractor for 0 > w_de > -1, w_dm ≥ 0, w_m ≥ 0 and w_r ≥ 0. These strange attractors, with positive Lyapunov exponent, clearly show that chaotic dynamics appears in the time evolution of the densities; the time evolution of the universe is thus chaotic. The present model may have the potential to address cosmological problems such as the singularity, cosmic coincidence, big crunch, big rip, horizon, oscillation, the emergence of the galaxies, matter distribution and large-scale organization of the universe. The model also connects the dynamics of competing species in biological systems with the dynamics of the time evolution of the universe, offering a new perspective and a different scenario for the evolution of the universe.
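
The Lotka-Volterra form of the coupling can be illustrated with a two-component toy system (the rates, initial densities, and Euler step below are illustrative choices, not values from the paper):

```python
# Toy Lotka-Volterra coupling between two interacting density components,
# integrated with a simple forward-Euler scheme.
def derivatives(x, y, a=1.0, b=0.5, c=0.5, d=1.0):
    return a * x - b * x * y, c * x * y - d * y

x, y, dt = 1.0, 0.5, 0.001
for _ in range(10_000):                # integrate up to t = 10
    dx, dy = derivatives(x, y)
    x, y = x + dx * dt, y + dy * dt

print(x > 0 and y > 0)                 # both densities stay positive
```

The same predator-prey structure, extended to three or four components, is what produces the oscillatory and chaotic density evolution described in the abstract.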

  1. Sustaining Community-University Collaborations: The Durham University Model

    Directory of Open Access Journals (Sweden)

    Andrew Russell

    2011-11-01

Full Text Available Durham University has initiated a community outreach and engagement program based on an evolving multifaceted model. This article analyses the components of the model and looks at how our work at Durham has become increasingly embedded in the structures and processes of the university as it has developed. The strengths and weaknesses in what has been achieved are highlighted, as is the future vision for the further development of this innovative community-university program. Keywords: public engagement; community partnerships; employer supported volunteering; corporate social responsibility

  2. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modelled PSF kernels. We focused on quantitation of both SUV_mean and SUV_max, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUV_mean bias in small tumours. Overall, the results indicate that exactly matched PSF
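
The resolution-degradation effect that PSF modelling compensates for can be sketched in one dimension: blur a tumour profile with an assumed Gaussian kernel and inspect the resulting loss of peak contrast (illustrative only; the paper's OS-EM reconstruction with modelled PSF kernels is far more involved):

```python
import numpy as np

def gaussian_kernel(sigma, radius=10):
    """Normalised, truncated Gaussian PSF on a pixel grid."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

profile = np.zeros(101)
profile[45:56] = 1.0            # 11-pixel "tumour" on a zero background

blurred = np.convolve(profile, gaussian_kernel(3.0), mode="same")
recovery = float(blurred.max() / profile.max())   # peak contrast recovery
print(0.0 < recovery < 1.0)
```

The smaller the tumour relative to the PSF width, the further this recovery falls below 1, which is why contrast recovery coefficients are tracked per tumour size in the abstract.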

  3. Quantifying Zika: Advancing the Epidemiology of Zika With Quantitative Models.

    Science.gov (United States)

    Keegan, Lindsay T; Lessler, Justin; Johansson, Michael A

    2017-12-16

    When Zika virus (ZIKV) emerged in the Americas, little was known about its biology, pathogenesis, and transmission potential, and the scope of the epidemic was largely hidden, owing to generally mild infections and no established surveillance systems. Surges in congenital defects and Guillain-Barré syndrome alerted the world to the danger of ZIKV. In the context of limited data, quantitative models were critical in reducing uncertainties and guiding the global ZIKV response. Here, we review some of the models used to assess the risk of ZIKV-associated severe outcomes, the potential speed and size of ZIKV epidemics, and the geographic distribution of ZIKV risk. These models provide important insights and highlight significant unresolved questions related to ZIKV and other emerging pathogens. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  4. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes. More details...

  5. The Toy model: Understanding the early universe

    Science.gov (United States)

    Fisher, Peter H.; Price, Richard H.

    2018-04-01

    In many branches of science, progress is being made by taking advantage of insights from other branches of science. Cosmology, the structure and evolution of the universe, is certainly an area that is currently beset by problems in understanding. We show here that the scientific insights from the studies of early childhood development, in particular, those of Piaget, give a new way of looking at the early universe. This new approach can not only be invaluable in undergraduate teaching, but can even be the basis of semi-quantitative predictions.

  6. Quantitative Literacy at Michigan State University, 3: Designing General Education Mathematics Courses

    Directory of Open Access Journals (Sweden)

    Samuel L. Tunstall

    2016-07-01

    Full Text Available In this paper, we describe the process at Michigan State University whereby we have created two courses, Math 101 and 102, designed to foster numeracy and alleviate mathematics anxiety. The courses--which are not sequential--provide a means of satisfying the University's general education requirement without taking college algebra or calculus, among other options. They are context-driven and broken into modules such as "The World and Its People" and "Health and Risk." They have been highly successful thus far, with students providing positive feedback on their interest in the material and the utility they see of it in their daily lives. We include background on the courses' history, their current status, and present and future challenges, ending with suggestions for others as they attempt to implement quantitative literacy courses at their own institution.

  7. Baryon asymmetry of the Universe in the standard model

    International Nuclear Information System (INIS)

    Farrar, G.R.; Shaposhnikov, M.E.

    1994-01-01

    We study the interactions of quarks and antiquarks with the changing Higgs field during the electroweak phase transition, including quantum mechanical and some thermal effects, with the only source of CP violation being the known CKM phase. We show that the GIM cancellation, which has been commonly thought to imply a prediction which is at least 10 orders of magnitude too small, can be evaded in certain kinematic regimes, for instance, when the strange quark is totally reflected but the down quark is not. We report on a quantitative calculation of the asymmetry in a one-dimensional approximation based on the present understanding of the physics of the high-temperature environment, but with some aspects of the problem oversimplified. The resulting prediction for the magnitude and sign of the present baryonic asymmetry of the Universe agrees with the observed value, with moderately optimistic assumptions about the dynamics of the phase transition. Both magnitude and sign of the asymmetry have an intricate dependence on quark masses and mixings, so that quantitative agreement between prediction and observation would be highly nontrivial. At present uncertainties related to the dynamics of the EW phase transition and the oversimplifications of our treatment are too great to decide whether or not this is the correct explanation for the presence of remnant matter in our Universe; however, the present work makes it clear that the minimal standard model cannot be discounted as a contender for explaining this phenomenon

  8. Emergent universe model with dissipative effects

    Science.gov (United States)

    Debnath, P. S.; Paul, B. C.

    2017-12-01

An emergent universe model is presented in the general theory of relativity with an isotropic fluid in addition to viscosity. We obtain cosmological solutions that permit an emergent universe scenario in the presence of bulk viscosity, described by either the Eckart theory or the truncated Israel-Stewart (TIS) theory. The stability of the solutions is also studied. The emergent universe (EU) model is then analyzed against observational data. In the presence of viscosity one obtains an emergent universe scenario, which is not permitted in the absence of viscosity. The EU model is compatible with cosmological observations.

  9. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  10. Models of the universe

    International Nuclear Information System (INIS)

    Dirac, P.A.M.

    1981-01-01

Most models of the universe depend on the assumption of a uniform distribution of matter, and are thus rather crude, given the nonlinear nature of Einstein's field equations. Here, a model is proposed which avoids this smoothing-out process. A metric is obtained which is consistent with the assumption that the matter of the universe is concentrated mainly in stars, moving with the velocity of recession implied by Hubble's law. The solution obtained gives results comparable to those obtained with the Schwarzschild metric, suitably adjusted to agree with the Einstein-de Sitter model at large distances.

  11. University Students' Meta-Modelling Knowledge

    Science.gov (United States)

    Krell, Moritz; Krüger, Dirk

    2017-01-01

    Background: As one part of scientific meta-knowledge, students' meta-modelling knowledge should be promoted on different educational levels such as primary school, secondary school and university. This study focuses on the assessment of university students' meta-modelling knowledge using a paper-pencil questionnaire. Purpose: The general purpose…

  12. Hopping models and ac universality

    DEFF Research Database (Denmark)

    Dyre, Jeppe; Schrøder, Thomas

    2002-01-01

Some general relations for hopping models are established. We proceed to discuss the universality of the ac conductivity which arises in the extreme disorder limit of the random barrier model. It is shown that the relevant dimension entering into the diffusion cluster approximation (DCA) is the harmonic (fracton) dimension of the diffusion cluster. The temperature scaling of the dimensionless frequency entering into the DCA is discussed. Finally, some open problems regarding ac universality are listed.

  13. Defect evolution in cosmology and condensed matter quantitative analysis with the velocity-dependent one-scale model

    CERN Document Server

    Martins, C J A P

    2016-01-01

This book sheds new light on topological defects in widely differing systems, using the Velocity-Dependent One-Scale Model to better understand their evolution. Topological defects (cosmic strings, monopoles, domain walls, or others) necessarily form at cosmological (and condensed matter) phase transitions. If they are stable and long-lived they will be fossil relics of higher-energy physics. Understanding their behaviour and consequences is a key part of any serious attempt to understand the universe, and this requires modelling their evolution. The velocity-dependent one-scale model is the only fully quantitative model of defect network evolution, and the canonical model in the field. This book provides a review of the model, explaining its physical content and describing its broad range of applicability.

  14. Universal free school breakfast: a qualitative model for breakfast behaviors

    Directory of Open Access Journals (Sweden)

    Louise eHarvey-Golding

    2015-06-01

In recent years the provision of school breakfast has increased significantly in the UK. However, research examining the effectiveness of school breakfast is still in its relative infancy, and findings to date have been rather mixed. Moreover, previous evaluations of school breakfast schemes have been predominantly quantitative in their methodologies. Presently there are few qualitative studies examining the subjective perceptions and experiences of stakeholders, and thereby an absence of knowledge regarding the sociocultural impacts of school breakfast. The purpose of this study was to investigate the beliefs, views, attitudes, and breakfast consumption behaviors among key stakeholders served by a council-wide universal free school breakfast initiative in the North West of England, UK. A sample of children, parents, and school staff was recruited from three primary schools participating in the universal free school breakfast scheme to take part in semi-structured interviews and small focus groups. A Grounded Theory analysis of the data collected identified a theoretical model of breakfast behaviors, underpinned by the subjective perceptions and experiences of these key stakeholders. The model comprises three domains relating to breakfast behaviors and the internal and external factors that are perceived to influence breakfast behaviors among children, parents, and school staff. Findings were validated using triangulation methods, member checks, and inter-rater reliability measures. In presenting this theoretically grounded model of breakfast behaviors, this paper provides a unique qualitative insight into breakfast consumption behaviors, and barriers to breakfast consumption, within a socioeconomically deprived community participating in a universal free school breakfast intervention program.

  15. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

The simulation and optimization of an actual physical system are usually constructed from stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification is proposed based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure comprising the meta-meta model, the meta-model, and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe a complex system more comprehensively, and that the survival probability of the target is higher when qualitative models are introduced into the quantitative simulation.

  16. Modeling with Young Students--Quantitative and Qualitative.

    Science.gov (United States)

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  17. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

With the growth of e-commerce, websites play an essential role in business success, and many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of these evaluation models make it difficult to integrate them into a single comprehensive model. In this paper a quantitative method is used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment plays no role in the integration of the models, and the new model takes its validity from 93 previous models and a systematic quantitative approach.

  18. A new economic model for resource industries-implications for universities

    International Nuclear Information System (INIS)

    Romig, P.R.

    1993-01-01

The upheaval in the US petroleum industry has had repercussions in the university community. Geoscience enrollments have plummeted, financial support has declined, and there are rumors that some programs have reduced mathematical rigor to maintain enrollment. While the adverse effects have been widespread, there is disagreement about implications and expectations for the future. Some argue that emphasis on short-term profitability produces ill-conceived, precipitous reactions which perpetuate the turmoil. Others respond that the resource and environmental needs of a burgeoning global population will ensure long-term growth. Both arguments miss the point. The fundamental economic structure of the industry is changing from revenue-driven to marginal-return. In marginal-return industries, investments depend on quantitative assessments of risk and return, and the use of interdisciplinary teams is the norm. University programs must educate students in engineering design and structured decision-making, develop integrated numeric models, and create infrastructures that support multidisciplinary collaboration, with increased emphasis on outreach to the experienced employee. Meeting those needs will require closer collaboration between industry and the universities. Universities that are successful will reap a fringe benefit: their graduates will be better qualified to be leaders in the environmental geoscience field, which one day may be bigger than the oil industry.

  19. Black Hole Universe Model and Dark Energy

    Science.gov (United States)

    Zhang, Tianxi

    2011-01-01

Considering the black hole as spacetime and slightly modifying the big bang theory, the author has recently developed a new cosmological model called the black hole universe, which is consistent with the Mach principle and Einsteinian general relativity and self-consistently explains various observations of the universe without difficulties. According to this model, the universe originated from a hot star-like black hole and gradually grew through a supermassive black hole to the present universe by accreting ambient material and merging with other black holes. The entire space is infinitely and hierarchically layered and evolves iteratively. The innermost three layers are the universe in which we live, the outside space called the mother universe, and the inside star-like and supermassive black holes called child universes. The outermost layer has an infinite radius and limiting values of zero for both mass density and absolute temperature. All layers or universes are governed by the same physics, the Einstein general relativity with the Robertson-Walker metric of spacetime, and tend to expand outward physically. When one universe expands out, a new similar universe grows up from its inside black holes. The origin, structure, evolution, expansion, and cosmic microwave background radiation of the black hole universe have been presented in a recent sequence of American Astronomical Society (AAS) meetings and published in peer-reviewed journals. This study will show how this new model explains the acceleration of the universe and why dark energy is not required. We will also compare the black hole universe model with the big bang cosmology.

  20. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  1. Universally sloppy parameter sensitivities in systems biology models.

    Directory of Open Access Journals (Sweden)

    Ryan N Gutenkunst

    2007-10-01

Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.

  2. Universally sloppy parameter sensitivities in systems biology models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Waterfall, Joshua J; Casey, Fergal P; Brown, Kevin S; Myers, Christopher R; Sethna, James P

    2007-10-01

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.
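The "sloppy" spectrum described in the record above can be made concrete with a small numerical sketch. The toy sum-of-exponentials model below is a standard sloppy-model example; the rate constants and sampling grid are purely illustrative, not taken from the paper:

```python
import numpy as np

# Toy "sloppy" model: a sum of exponential decays y(t) = sum_i exp(-k_i t).
def model(ts, ks):
    return np.exp(-np.outer(ts, ks)).sum(axis=1)

ts = np.linspace(0.1, 5.0, 50)          # hypothetical sampling times
ks = np.array([0.5, 1.0, 2.0, 4.0])     # hypothetical rate constants

# Finite-difference Jacobian of the model output w.r.t. log-parameters.
eps = 1e-6
J = np.empty((ts.size, ks.size))
base = model(ts, ks)
for i in range(ks.size):
    dk = np.zeros_like(ks)
    dk[i] = eps
    J[:, i] = (model(ts, ks * np.exp(dk)) - base) / eps

# Eigenvalues of the approximate least-squares Hessian J^T J; a sloppy
# model shows eigenvalues spread over a wide range of scales.
eigs = np.linalg.eigvalsh(J.T @ J)[::-1]   # descending order
print(eigs / eigs[0])                      # normalized spectrum
```

Even this four-parameter toy yields a widely spread spectrum, illustrating why a collective fit constrains some parameter combinations far more tightly than others.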

  3. A Quantitative Optimization Framework for Market-Driven Academic Program Portfolios

    NARCIS (Netherlands)

    Burgher, Joshua; Hamers, Herbert

    2017-01-01

    We introduce a quantitative model that can be used for decision support for planning and optimizing the composition of portfolios of market-driven academic programs within the context of higher education. This model is intended to enable leaders in colleges and universities to maximize financial

  4. Transition from AdS universe to DS universe in the BPP model

    International Nuclear Information System (INIS)

    Kim, Wontae; Yoon, Myungseok

    2007-01-01

It can be shown that in the BPP model a smooth phase transition from the asymptotically decelerated AdS universe to the asymptotically accelerated DS universe is possible by solving the modified semiclassical equations of motion. This transition comes from a noncommutative Poisson algebra, which gives constant curvature scalars asymptotically. The decelerated expansion of the early universe is due to the negative energy density with negative pressure induced by quantum back reaction, and the accelerated late-time universe comes from the positive energy and negative pressure, which behave like a dark energy source in recent cosmological models.

  5. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance to re-using, sharing, and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models, and principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration.

  6. The thermal evolution of universe: standard model

    International Nuclear Information System (INIS)

    Nascimento, L.C.S. do.

    1975-08-01

A description of the dynamical evolution of the Universe is made following a model based on the theory of General Relativity. The model admits the Cosmological Principle, the Principle of Equivalence, and the Robertson-Walker metric (of which an original derivation is presented). In this model, the universe is considered a perfect fluid, ideal and symmetric with respect to the number of particles and antiparticles. The thermodynamic relations deriving from these hypotheses are derived, and from them the several eras of the thermal evolution of the universe are established. Finally, the problems arising from certain specific predictions of the model are studied, and the predicted abundances of the elements from nucleosynthesis and the present behavior of the universe are analysed in detail. (author) [pt
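The eras of thermal evolution in such models rest on a standard textbook scaling for the radiation-dominated epoch (general background, not specific to this record): entropy conservation ties the temperature to the scale factor, and the Friedmann equation with a radiation density fixes the scale factor's growth, so

```latex
% Radiation-era scalings (approximate, order-of-magnitude normalization):
T \propto a^{-1}, \qquad a(t) \propto t^{1/2}
\quad\Longrightarrow\quad T(t) \propto t^{-1/2},
\qquad
\frac{T}{1\,\mathrm{MeV}} \;\sim\; \left(\frac{t}{1\,\mathrm{s}}\right)^{-1/2}
```

The last relation (valid to order unity, since the effective number of relativistic species varies) is what places, e.g., nucleosynthesis at temperatures of order an MeV and times of order seconds.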

  7. Reuleaux models at St. Petersburg State University

    Science.gov (United States)

    Kuteeva, G. A.; Sinilshchikova, G. A.; Trifonenko, B. V.

    2018-05-01

Franz Reuleaux (1829 - 1905) was a famous mechanical engineer and a Professor of the Berlin Royal Technical Academy. He became widely known as an engineer-scientist, a professor and industrial consultant, an education reformer, and a leader of the technical elite of Germany. He directed the design and manufacture of over 300 models of simple mechanisms. They were sold to many famous universities for pedagogical and scientific purposes. Today, the most complete set is at Cornell University, College of Engineering. In this article we discuss the history and present state of the Reuleaux models that survive at St. Petersburg State University, and our use of them for educational purposes. We present descriptions of certain models and our electronic resource featuring these models, and provide information on similar electronic resources from other universities.

  8. Is the island universe model consistent with observations?

    OpenAIRE

    Piao, Yun-Song

    2005-01-01

We study the island universe model, in which the universe is initially in a cosmological constant sea; local quantum fluctuations violating the null energy condition then create islands of matter, some of which might correspond to our observable universe. We examine the possibility that the island universe model can be regarded as an alternative scenario for the origin of the observable universe.

  9. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  10. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.
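For readers unfamiliar with the priority heuristic, its lexicographic steps can be sketched for simple two-outcome gain gambles. The reason order and the one-tenth aspiration levels follow Brandstätter et al. (2006); the tuple representation of a gamble here is a simplifying assumption for illustration:

```python
# A gamble over gains is represented as (min_gain, p_min, max_gain),
# where p_min is the probability of the minimum outcome.
def priority_heuristic(a, b):
    """Choose between two gain gambles by examining reasons in fixed order."""
    min_a, p_min_a, max_a = a
    min_b, p_min_b, max_b = b
    aspiration = 0.1 * max(max_a, max_b)  # 1/10 of the maximum gain
    # Reason 1: minimum gains (stop if the difference meets the aspiration level).
    if abs(min_a - min_b) >= aspiration:
        return a if min_a > min_b else b
    # Reason 2: probabilities of the minimum gains (stop if they differ by >= .1).
    if abs(p_min_a - p_min_b) >= 0.1:
        return a if p_min_a < p_min_b else b
    # Reason 3: maximum gains decide.
    return a if max_a > max_b else b

sure = (100, 1.0, 100)   # a certain 100 (minimum = maximum = 100)
risky = (0, 0.2, 130)    # 130 with p = .8, otherwise 0
print(priority_heuristic(sure, risky))  # minimum gains differ enough: sure thing
```

Note the heuristic forgoes trade-offs entirely: search stops at the first reason that discriminates, which is what generates the reason-wise search and acquisition predictions tested in the article.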

  11. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  12. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...
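As a flavor of the stochastic models of social interaction the two records above describe, here is an illustrative sketch (not taken from the book): a voter-model-style pairwise imitation process, one of the simplest stochastic descriptions of behavioral change through social interaction. All parameters are hypothetical:

```python
import random

# Two behaviours, 'A' and 'B'. At each step two random agents meet and the
# first imitates the second with a fixed probability -- a minimal stochastic
# model of behavioural change driven by social interaction.
def simulate(n_agents=200, p_imitate=0.5, steps=20000, seed=1):
    rng = random.Random(seed)
    pop = ['A'] * (n_agents // 2) + ['B'] * (n_agents - n_agents // 2)
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i != j and rng.random() < p_imitate:
            pop[i] = pop[j]  # agent i adopts agent j's behaviour
    return pop.count('A') / n_agents

print(simulate())  # fraction of the population adopting 'A'
```

Repeated runs drift toward consensus on one behaviour; richer models of the kind the book treats add utilities, decision-theoretic transition rates, and nonlinear feedbacks on top of this basic interaction scheme.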

  13. Model of the static universe within GR

    International Nuclear Information System (INIS)

    Karbanovski, V. V.; Tarasova, A. S.; Salimova, A. S.; Bilinskaya, G. V.; Sumbulov, A. N.

    2011-01-01

Within GR, the problems of the Robertson-Walker universe are discussed. An approach based on a transition to a nondiagonal line element is suggested, and within this approach the static universe model is investigated. The possibility of constructing scenarios without an initial singularity and without “exotic” matter is discussed, as is the accordance of the given model with the properties of the observable universe.

  14. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and they can be categorized as qualitative or quantitative. Since the effects of input factors on situation awareness can be investigated through quantitative models, quantitative models are more useful than qualitative ones for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators.

  15. A universe model confronted to observations

    International Nuclear Information System (INIS)

    Souriau, J.M.

    1982-09-01

The present work is a detailed study of a universe model elaborated in several steps, and of some of its consequences. An absence zone in the spatial distribution of quasars is first described, and it is shown that this zone is sufficient to determine a cosmological model. Each subsequent section is concerned with a type of observation, which is confronted with the model: the age and density of the universe, the redshift-luminosity relation for galaxies and quasars, the diameter-redshift relation for radio sources, the isotropy of the 3 K radiation, and the physics of the matter-antimatter contact zone. A possible stratification of the universe parallel to this zone is studied in more detail; absorption lines in quasar spectra are interpreted in this way, as are the local supercluster and the Local Group of galaxies, the orientation of galactic HI regions, and finally the kinematics of neighbouring galaxies [fr

  16. University Start-ups: A Better Business Model

    Science.gov (United States)

    Dehn, J.; Webley, P. W.

    2015-12-01

Many universities look to start-up companies as a way to attract faculty and to support research and students as traditional federal sources become harder to come by. University-affiliated start-up companies can apply for a broader suite of grants, as well as market their services to a broad customer base. University administrators often see this as a potential panacea, but national statistics show this is not the case. Rarely do universities profit significantly from their start-ups. With a success rate of around 20%, most start-ups end up costing the university money as well as faculty time. For the faculty, assuming they want to continue in academia, a start-up is often unattractive because it commonly leads out of academia. Running a successful business while maintaining a strong teaching and research load is almost impossible. Most business models and business professionals work outside of academia, and the models taught in business schools do not merge well with a university environment. To mitigate this, a new business model is proposed in which university start-ups are aligned with the academic and research missions of the university. A university start-up must work within the university and directly support research and students, and the work done maintaining the business must be recognized as part of the faculty member's university obligations. This requires a complex conflict-of-interest management plan and requires the companies to be non-profit in order not to jeopardize the university's status. This approach may not work well for all universities, but it would be ideal for many as a way to conserve resources and ensure a harmonious relationship with their start-ups and faculty.

  17. A New Cosmological Model: Black Hole Universe

    Directory of Open Access Journals (Sweden)

    Zhang T. X.

    2009-07-01

    Full Text Available A new cosmological model called the black hole universe is proposed. According to this model, the universe originated from a hot, star-like black hole of several solar masses and gradually grew, through a supermassive black hole of a billion solar masses, to the present state of hundred billion-trillion solar masses by accreting ambient materials and merging with other black holes. The entire space is structured hierarchically with infinite layers. The innermost three layers are the universe in which we live, the outside layer called the mother universe, and the inside star-like and supermassive black holes called child universes. The outermost layer has infinite radius, and its mass density and absolute temperature both tend to zero. The relationships among all layers or universes can be connected by a universe family tree. Mathematically, the entire space can be represented as the set of all universes; a black hole universe is a subset of the entire space, or a subspace, and the child universes are null sets or empty spaces. All layers or universes are governed by the same physics - Einstein's general theory of relativity with the Robertson-Walker metric of spacetime - and tend to expand outward physically. The evolution of the space structure is iterative: when one universe expands out, a new similar universe grows up from its inside. The entire life of a universe begins with its birth as a hot star-like or supermassive black hole, passes through growth and cooling, and expands toward death with infinitely large radius and zero mass density and absolute temperature. The black hole universe model is consistent with the Mach principle, with observations of the universe, and with Einstein's general theory of relativity. Its various aspects can be understood with well-developed physics without difficulty. Dark energy is not required for the universe to accelerate its expansion, and inflation is not necessary because the black hole universe

  18. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  19. Generation of a bubbly universe - a quantitative assessment of the CfA slice

    International Nuclear Information System (INIS)

    Ostriker, J.P.; Strassler, M.J.

    1989-01-01

    A first attempt is made to calculate the properties of the matter distribution in a universe filled with overlapping bubbles produced by multiple explosions. Each spherical shell follows the cosmological Sedov-Taylor solution until it encounters another shell; thereafter, mergers are allowed to occur in pairs on the basis of N-body results. At the final epoch, the matrix of overlapping shells is populated with 'galaxies', and the properties of slices through the numerically constructed cube compare well with CfA survey results for specified initial conditions. A statistic is found which measures the distribution of distances from uniformly distributed points to the nearest galaxies in the projected plane; it appears to provide a good measure of the bubbly character of the galaxy distribution. In a quantitative analysis of the CfA 'slice of the universe', a very good match is found between simulation and the real data for final average bubble radii of (13.5 ± 1.5) h⁻¹ Mpc with formal filling factor 1.0-1.5, or actual filling factor of 65-80 percent. 25 references
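The distance-to-nearest-galaxy statistic described above is easy to prototype. The sketch below is a toy 2-D version, not the authors' actual estimator: galaxy positions are confined to a shell, and probe points in the empty interior return large nearest-neighbour distances, flagging the bubble.

```python
import math

def nearest_galaxy_distances(probes, galaxies):
    """Distance from each uniformly placed probe point to its nearest
    galaxy in the projected plane (brute force, 2-D toy version)."""
    return [min(math.hypot(px - gx, py - gy) for gx, gy in galaxies)
            for px, py in probes]

# Toy 'bubble': galaxies confined to a shell of radius 10 (length units
# are arbitrary here; think h^-1 Mpc).
shell = [(10 * math.cos(2 * math.pi * i / 50),
          10 * math.sin(2 * math.pi * i / 50)) for i in range(50)]
probes = [(0.0, 0.0), (2.0, 0.0), (0.0, 5.0)]
dists = nearest_galaxy_distances(probes, shell)
# Large probe-to-galaxy distances mark the empty bubble interior.
```

In a uniform (non-bubbly) galaxy distribution these distances would cluster near the mean galaxy separation; a heavy tail of large distances is the signature of voids.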

  20. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ ' precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  1. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...
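The decoding step such an HMM relies on can be sketched with the Viterbi algorithm. Everything below is invented for illustration (two hypothetical weather states, three observation symbols, made-up probabilities); the actual model uses nine meteorological variables.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most-likely hidden-state sequence for a discrete-emission HMM."""
    probs = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            # Best predecessor for state s at this time step.
            p_best, prev = max((probs[-1][q] * trans_p[q][s], q)
                               for q in states)
            row[s], ptr[s] = p_best * emit_p[s][o], prev
        probs.append(row)
        back.append(ptr)
    state = max(states, key=lambda s: probs[-1][s])
    path = [state]
    for ptr in reversed(back):        # backtrack through the pointers
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Hypothetical two-state snowfall HMM with made-up probabilities.
states = ("dry", "snow")
start = {"dry": 0.7, "snow": 0.3}
trans = {"dry":  {"dry": 0.8, "snow": 0.2},
         "snow": {"dry": 0.3, "snow": 0.7}}
emit = {"dry":  {"clear": 0.6, "overcast": 0.3, "precip": 0.1},
        "snow": {"clear": 0.1, "overcast": 0.4, "precip": 0.5}}
obs = ["clear", "overcast", "precip", "precip"]
path = viterbi(obs, states, start, trans, emit)
```

For quantitative snowfall the emission alphabet would be replaced by discretized snowfall classes estimated from the nine observed variables.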

  2. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools for the practice of future professionals. What, then, should be the minimum training in the quantitative sciences common to all future public health professionals? By comparing the teaching models developed at Columbia University with those at the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  3. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for predictive insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and offers recommendations for future attempts at modeling conflict.

  4. University Administration on a Political Model.

    Science.gov (United States)

    Walker, Donald E.

    1979-01-01

    It is suggested that recognizing the university as a political community may lead to better management and organization. The patriarchal role, the president as hero, dispersed power, how the university really functions, and a political model are described. (MLW)

  5. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  6. Virtual Universities: Current Models and Future Trends.

    Science.gov (United States)

    Guri-Rosenblit, Sarah

    2001-01-01

    Describes current models of distance education (single-mode distance teaching universities, dual- and mixed-mode universities, extension services, consortia-type ventures, and new technology-based universities), including their merits and problems. Discusses future trends in potential student constituencies, faculty roles, forms of knowledge…

  7. Southwest University's No-Fee Teacher-Training Model

    Science.gov (United States)

    Chen, Shijian; Yang, Shuhan; Li, Linyuan

    2013-01-01

    The training model for Southwest University's no-fee teacher education program has taken shape over several years. Based on a review of the documentation and interviews with administrators and no-fee preservice students from different specialties, this article analyzes Southwest University's no-fee teacher-training model in terms of three main…

  8. Modeling Factors with Influence on Sustainable University Management

    Directory of Open Access Journals (Sweden)

    Oana Dumitrascu

    2015-01-01

    Full Text Available The main objective of this paper is to present the factors with influence on sustainable university management and the relationships between them. The scientific approach begins from a graphical model, according to which extracurricular activities together with internal environmental factors influence students' involvement in such activities, the university's attractiveness, students' academic performance and their integration into the socio-economic and natural environment (components related to sustainable development). The model emphasizes that individual performances, related to students' participation in extracurricular activities, have a positive influence on the sustainability of university management. The results of the study have shown that university sustainability may be influenced by a number of factors, such as students' performance, students' involvement in extracurricular activities or the university's attractiveness, and can in turn also influence the sustainability of university management. The originality of the paper consists in studying these relationships using modeling methods in general and informatics modeling tools in particular, as well as through graphical visualization of some of the influences on sustainable university management.

  9. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, is not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
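Optimal scaling, mentioned above, has a simple closed form when the measurements differ from the model prediction only by an unknown multiplicative unit. A minimal sketch with hypothetical data:

```python
def optimal_scale(model, data):
    """Closed-form least-squares scale s* = sum(d*m)/sum(m*m) that maps
    model predictions onto measurements reported in arbitrary units."""
    return (sum(d * m for d, m in zip(data, model))
            / sum(m * m for m in model))

# Hypothetical relative-intensity measurements vs. model predictions:
# same shape, unknown units.
model_pred = [1.0, 2.0, 3.0]
measured = [2.1, 3.9, 6.0]
s = optimal_scale(model_pred, measured)
# Residual after scaling is what a fitness function would score.
residual = sum((d - s * m) ** 2 for d, m in zip(measured, model_pred))
```

After scaling, the residual can be fed to any standard parameter-estimation routine, which is exactly the bridge from non-absolute data to quantitative model comparison that the review discusses.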

  10. The Effect of Entrepreneurship Education on Entrepreneurial Intention of University Students By Adopting Linan Model

    Directory of Open Access Journals (Sweden)

    Yud Buana

    2017-05-01

    Full Text Available The success of entrepreneurship education programs remains an open question when it is measured by the number of students who actually decide to launch and pursue a business venture. Knowing the intentions of nascent entrepreneurs to persistently start up business ventures is important if the attention of experts and policy makers is drawn to how to arouse interest in starting a business. A quantitative approach was used in this research to examine the influence of entrepreneurship education, social norms and self-efficacy on intentions to pursue business ventures, adopting the Linan intention-behavior model. The model was applied to students who participated in the entrepreneurship education program midway through their studies at Bina Nusantara University. The results are in line with the Linan model.

  11. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    Science.gov (United States)

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

    Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. A rational estimation of the vegetation cover and management factor, the most important parameter in USLE and RUSLE, is particularly important for accurate prediction of soil erosion. Traditional estimation based on field survey and measurement is time-consuming, laborious and costly, and cannot rapidly extract the vegetation cover and management factor at the macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for estimating the vegetation cover and management factor over broad geographic areas. This paper summarizes research findings on the quantitative estimation of the vegetation cover and management factor from remote sensing data, and analyzes the advantages and disadvantages of the various methods, with the aim of providing a reference for further research on, and quantitative estimation of, the vegetation cover and management factor at large scales.
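As an illustration of the remote-sensing approach, one commonly used empirical transform maps NDVI to the cover-management (C) factor through an exponential form; the `alpha`/`beta` constants below are illustrative defaults, not values taken from the studies reviewed here, and would need site calibration.

```python
import math

def c_factor(ndvi, alpha=2.0, beta=1.0):
    """Map NDVI to the cover-management (C) factor using the empirical
    exponential form C = exp(-alpha * NDVI / (beta - NDVI)).
    alpha and beta are calibration constants (illustrative values)."""
    return math.exp(-alpha * ndvi / (beta - ndvi))

bare_soil = c_factor(0.0)   # no vegetation -> maximum erosion risk (C = 1)
dense_veg = c_factor(0.8)   # dense canopy -> C approaches zero
```

Applied pixel-by-pixel to an NDVI raster, this yields the C-factor map that USLE/RUSLE multiplies with the rainfall, soil, and topographic factors.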

  12. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    Science.gov (United States)

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
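Since the two calculation models differ only in which residues are assumed to bind the anionic dye, the comparison reduces to residue counting over a protein sequence. A minimal sketch, using a toy sequence fragment:

```python
def dye_binding_residues(seq, model="M2"):
    """Count Coomassie-binding residues in a one-letter sequence:
    M1 counts Arg (R) + Lys (K); M2 adds His (H)."""
    residues = "RK" if model == "M1" else "RKH"
    return sum(seq.count(r) for r in residues)

# Toy sequence fragment for illustration only.
fragment = "KVFGRCELAAAMKRHGLDNYRGYSLGNWVCAAK"
m1 = dye_binding_residues(fragment, "M1")
m2 = dye_binding_residues(fragment, "M2")
# M2 >= M1 for any sequence containing His, which is why normalizing
# by the M2 count changes the per-protein response estimate.
```

The study's finding is that scaling the assay response by the M2 count gives more consistent quantitation across proteins than scaling by the M1 count.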

  13. Evaluation of recent quantitative magnetospheric magnetic field models

    International Nuclear Information System (INIS)

    Walker, R.J.

    1976-01-01

    Recent quantitative magnetospheric field models contain many features not found in earlier models. Magnetopause models which include the effects of the dipole tilt were presented. More realistic models of the tail field include tail currents which close on the magnetopause, cross-tail currents of finite thickness, and cross-tail current models which model the position of the neutral sheet as a function of tilt. Finally, models have attempted to calculate the field of currents distributed in the inner magnetosphere. As the purpose of a magnetospheric model is to provide a mathematical description of the field that reasonably reproduces the observed magnetospheric field, several recent models were compared with the observed ΔB (B_observed − B_main field) contours. Models containing only contributions from magnetopause and tail current systems are able to reproduce the observed quiet time field only in an extremely qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. However, the distributed current models are valid only for zero tilt. Even the models which reproduce the average observed field reasonably well may not give physically reasonable field gradients. Three of the models evaluated contain regions in the near tail in which the field gradient reverses direction. One region in which all the models fall short is that around the polar cusp, though most can be used to calculate the position of the last closed field line reasonably well

  14. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting pa...

  15. The Triad Research University or a Post 20th Century Research University Model

    Science.gov (United States)

    Tadmor, Zehev

    2006-01-01

    In this paper, a model for the future research university is proposed, which answers some of the key challenges facing universities. It consists of three independent yet closely knitted entities: a research institute, a university teaching college and a business unit creating a "triad" structure. The possible inevitability, the advantages and…

  16. A Global Change in Higher Education: Entrepreneurial University Model

    Directory of Open Access Journals (Sweden)

    Süreyya SAKINÇ

    2012-01-01

    Full Text Available Universities are affected by the social and economic diversity stemming from globalization and internationalization, and their functions, areas of responsibility, organizational structure and funding capability respond to this diversity. In today's knowledge society, new concepts regarding the university education system, such as the entrepreneurial university, the corporate university and the virtual university, have emerged with the wave of globalization. Rising competition in academic education and mass demand for education push universities to seek new funds to shore up their financial situation, driving them toward an entrepreneurial identity. The reflections of the neoliberal approach in education have transformed universities into corporations that are much more entrepreneurial, student-oriented and aimed at providing appropriate education and producing creative human resources for global development. In this study, a comprehensive evaluation of the entrepreneurial university model is carried out through a literature review, in order to investigate the causes and factors that shape and improve it. The aim of the paper is to generate a framework that identifies the dynamic processes of the entrepreneurial university model, based on a synthesis of the literature. The contribution of the paper lies in its argument that the entrepreneurial university model is viable for Turkey. The entrepreneurial university model is analyzed through the Triple Helix framework using a comparative approach.

  17. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
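Bootstrap simulation, the core methodology of the report, can be sketched in a few lines: resample the data with replacement, recompute the statistic of interest, and read confidence limits off the empirical quantiles. The data values below are made up for illustration.

```python
import random

def bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for any statistic:
    resample with replacement, recompute, take empirical quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(sample) for _ in sample])
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Made-up measurements (e.g., repeated emission-rate readings).
emissions = [9.8, 10.1, 10.4, 9.6, 10.0, 10.3, 9.9, 10.2]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(emissions, mean)
```

Because `stat` is an arbitrary function, the same routine quantifies uncertainty in means, percentiles, or fitted-distribution parameters, which is what makes the method "applicable to any type of model."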

  18. The Loyalty Model of Private University Student

    Directory of Open Access Journals (Sweden)

    Leonnard

    2014-04-01

    Full Text Available This study investigates a loyalty model of private university students, using STIKOM London School of Public Relations as a case study. The study examines the model from the perspectives of service quality, college image, price, trust and satisfaction. Thus, the objective of this study is to examine and analyze the effect of service quality, college image, tuition fees, trust and satisfaction on students' loyalty; the effect of service quality, college image, price and satisfaction on trust; and the effect of service quality, college image and price on satisfaction. The study used survey methodology with a causal design. The sample consists of 320 college students. Data were gathered by questionnaire on a Likert scale and analyzed with a Structural Equation Model (SEM) approach. The implication of this study is a full contextual description of a loyalty model for private universities, giving an integrated and innovative contribution to the student loyalty model in private universities.

  20. Creating a Universe, a Conceptual Model

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2016-10-01

    Full Text Available Space is something. Space inherently contains laws of nature: universal rules (mathematics, space dimensions, types of forces, types of fields, and particle species), laws (relativity, quantum mechanics, thermodynamics, and electromagnetism) and symmetries (Lorentz, Gauge, and symmetry breaking). We have significant knowledge about these laws of nature because all our scientific theories assume their presence. Their existence is critical for developing either a unique theory of our universe or more speculative multiverse theories. Scientists generally ignore the laws of nature because they “are what they are” and because visualizing different laws of nature challenges the imagination. This article defines a conceptual model separating space (laws of nature) from the universe’s energy source (initial conditions) and expansion (big bang). By considering the ramifications of changing the laws of nature, initial condition parameters, and two variables in the big bang theory, the model demonstrates that traditional fine tuning is not the whole story when creating a universe. Supporting the model, space and “nothing” are related to the laws of nature, mathematics and multiverse possibilities. Speculation on the beginning of time completes the model.

  1. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment is an advance of modern biotechnology that has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be administered continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally available in Bulgaria, but at times it has been interrupted for 1-2 months, and patient doses have not been optimal. The aim of our work is to find a mathematical model for quantitative evaluation of ERT in Gaucher disease. The model applies the software "Statistika 6" to the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model allows quantitative evaluation of the individual trends in the development of each child's disease and their correlations. On the basis of these results, we can recommend suitable changes in ERT.

  2. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Full Text Available Over the last two decades, food logistics systems have seen a transition from a focus on traditional supply chain management to food supply chain management, and subsequently to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and to analyse currently available quantitative models in order to point out modelling challenges in sustainable food logistics management (SFLM). A literature review of quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system-scope issues. Results show that research on SFLM has been progressively developing according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies. Apart from a few recent studies, the majority of the works reviewed do not consider sustainability problems. The study therefore concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration in order to support business decisions and capture food supply chain dynamics.

  3. A quantitative phase field model for hydride precipitation in zirconium alloys: Part I. Development of quantitative free energy functional

    International Nuclear Information System (INIS)

    Shi, San-Qiang; Xiao, Zhihua

    2015-01-01

    A temperature dependent, quantitative free energy functional was developed for the modeling of hydride precipitation in zirconium alloys within a phase field scheme. The model takes into account crystallographic variants of hydrides, interfacial energy between hydride and matrix, interfacial energy between hydrides, elastoplastic hydride precipitation and interaction with externally applied stress. The model is fully quantitative in real time and real length scale, and simulation results were compared with limited experimental data available in the literature with a reasonable agreement. The work calls for experimental and/or theoretical investigations of some of the key material properties that are not yet available in the literature
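A generic phase-field free energy functional of the kind described above combines a chemical free energy, gradient-energy terms for the hydride-variant order parameters, and an elastic contribution coupling to applied stress. The form below is illustrative only, not the authors' exact functional:

```latex
F = \int_V \Big[ f_{\mathrm{chem}}(c, \eta_1, \dots, \eta_n, T)
  + \sum_{p=1}^{n} \frac{\kappa_p}{2}\, \lvert \nabla \eta_p \rvert^2
  + f_{\mathrm{el}}(\boldsymbol{\varepsilon}, \eta_1, \dots, \eta_n) \Big] \, dV
```

Here $c$ is the hydrogen concentration, the $\eta_p$ are order parameters for the $n$ crystallographic hydride variants, the $\kappa_p$ are gradient-energy coefficients that set the hydride-matrix and hydride-hydride interfacial energies, and $f_{\mathrm{el}}$ is the elastic (or elastoplastic) strain energy. Making the model quantitative amounts to calibrating each of these terms, in real units, against thermodynamic databases and measured interfacial and elastic properties.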

  4. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  5. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization
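
    As a minimal illustration of the kind of quantitative process-performance technique the study points to, the sketch below runs a Monte Carlo simulation of a three-step process cycle time. The step durations and the 14-day commitment are illustrative assumptions, not data from the study.

    ```python
    import random
    import statistics

    def simulate_cycle_time(n_trials=10_000, seed=42):
        """Monte Carlo estimate of end-to-end cycle time for a three-step
        process. Step durations (in days) are drawn from triangular
        (min, mode, max) distributions; the numbers are illustrative."""
        steps = [(2, 3, 6), (1, 2, 4), (3, 5, 10)]
        rng = random.Random(seed)
        totals = [sum(rng.triangular(lo, hi, mode) for lo, mode, hi in steps)
                  for _ in range(n_trials)]
        on_time = sum(t <= 14 for t in totals) / n_trials  # P(meet 14-day commitment)
        return statistics.mean(totals), on_time

    mean_days, p_on_time = simulate_cycle_time()   # mean near 12 days
    ```

    A process-performance model of this shape lets a team ask "what fraction of work items will meet the commitment?" rather than reasoning from single-point estimates.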

  7. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  8. Universe in the theoretical model «Evolving matter»

    Directory of Open Access Journals (Sweden)

    Bazaluk Oleg

    2013-04-01

    The article critically examines a modern model of the evolution of the Universe constructed by a group of scientists (mathematicians, physicists, and cosmologists) from the world's leading universities (Oxford and Cambridge Universities, Yale, Columbia, New York University, Rutgers, and UC Santa Cruz). The author notes the model's strengths but also points to its shortcomings: in his view, it does not take into account the most important achievements in biochemistry and biology (molecular, physical, developmental, etc.), or in neuroscience and psychology. The author argues that in constructing a model of the Universe's evolution, scientists must take into account (with great reservations) the impact of living and intelligent matter on cosmic processes. As an example, the author presents his theoretical model "Evolving matter". In this model he shows not only the general interdependence of cosmic processes with inert, living, and intelligent matter, but also attempts to show a direct influence of systems of living and intelligent matter on the acceleration of the Universe's expansion.

  9. The Arizona Universities Library Consortium patron-driven e-book model

    Directory of Open Access Journals (Sweden)

    Jeanne Richardson

    2013-03-01

    Building on Arizona State University's patron-driven acquisitions (PDA) initiative in 2009, the Arizona Universities Library Consortium, in partnership with the Ingram Content Group, created a cooperative patron-driven model to acquire electronic books (e-books). The model provides the opportunity for faculty and students at the universities governed by the Arizona Board of Regents (ABOR) to access a core of e-books made accessible through resource discovery services and online catalogs. These books are available for significantly less than a single ABOR university would expend for the same materials. The patron-driven model described is one of many evolving models in digital scholarship, and, although the Arizona Universities Library Consortium reports a successful experience, patron-driven models pose questions to stakeholders in the academic publishing industry.

  10. Methodologies for quantitative systems pharmacology (QSP) models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  11. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  12. On the generation of a bubbly universe - A quantitative assessment of the CfA slice

    Science.gov (United States)

    Ostriker, J. P.; Strassler, M. J.

    1989-01-01

    A first attempt is made to calculate the properties of the matter distribution in a universe filled with overlapping bubbles produced by multiple explosions. Each spherical shell follows the cosmological Sedov-Taylor solution until it encounters another shell. Thereafter, mergers are allowed to occur in pairs on the basis of N-body results. At the final epoch, the matrix of overlapping shells is populated with 'galaxies', and the properties of slices through the numerically constructed cube compare well with CfA survey results for specified initial conditions. A statistic is found, measuring the distribution of distances from uniformly distributed points to the nearest galaxies in the projected plane, which appears to provide a good measure of the bubbly character of the galaxy distribution. In a quantitative analysis of the CfA 'slice of the universe', a very good match is found between simulation and the real data for final average bubble radii of (13.5 ± 1.5) h⁻¹ Mpc with a formal filling factor of 1.0-1.5, or an actual filling factor of 65-80 percent.
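
    The nearest-distance statistic described above can be sketched in a few lines. The probe counts, galaxy counts, and clump parameters below are illustrative stand-ins, not the paper's simulation data: clumping galaxies leaves large empty regions, so the typical probe-to-nearest-galaxy distance grows.

    ```python
    import math
    import random

    def nearest_distances(probes, galaxies):
        """Distance from each uniformly placed probe point to its nearest galaxy."""
        return [min(math.dist(p, g) for g in galaxies) for p in probes]

    rng = random.Random(1)
    probes = [(rng.random(), rng.random()) for _ in range(200)]

    # 300 uniformly scattered "galaxies" versus the same number concentrated
    # in a few tight clumps (a crude stand-in for galaxies on shell walls).
    uniform = [(rng.random(), rng.random()) for _ in range(300)]
    centers = [(rng.random(), rng.random()) for _ in range(6)]
    clumped = [(cx + rng.gauss(0, 0.02), cy + rng.gauss(0, 0.02))
               for cx, cy in centers for _ in range(50)]

    mean_uniform = sum(nearest_distances(probes, uniform)) / len(probes)
    mean_clumped = sum(nearest_distances(probes, clumped)) / len(probes)
    # mean_clumped exceeds mean_uniform: the statistic separates a bubbly
    # (clumped) distribution from a uniform one.
    ```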

  13. Interacting agegraphic dark energy models in non-flat universe

    International Nuclear Information System (INIS)

    Sheykhi, Ahmad

    2009-01-01

    A so-called 'agegraphic dark energy' was recently proposed to explain the dark energy-dominated universe. In this Letter, we generalize the agegraphic dark energy models to the universe with spatial curvature in the presence of interaction between dark matter and dark energy. We show that these models can accommodate w D =-1 crossing for the equation of state of dark energy. In the limiting case of a flat universe, i.e. k=0, all previous results of agegraphic dark energy in flat universe are restored.

  14. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. The participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test based on the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy with a test according to Marzano, and a questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  15. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, environmental magnetic fields, and measurement noise make MMM data dispersive and difficult to evaluate quantitatively. To promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Welded specimens of steel Q235 were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument during tensile fatigue experiments, with X-ray testing carried out synchronously to verify the MMM results. It was found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated and shown to follow a Gaussian distribution, so K_vs is a suitable MMM parameter on which to build a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and that the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
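
    The paper's improved stress-strength interference theory is not reproduced here, but the classical Gaussian form it builds on is compact: with Gaussian stress and strength, reliability is the probability that strength exceeds stress. The strength and stress values below are illustrative, not the paper's data.

    ```python
    from math import erf, sqrt

    def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
        """Classical stress-strength interference with Gaussian stress and
        strength: reliability R = P(strength > stress) = Phi(z), where z is
        the standardized margin."""
        z = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # Illustrative values: strength 400 +/- 25 MPa, working stress 300 +/- 30 MPa.
    R = interference_reliability(400.0, 25.0, 300.0, 30.0)   # approx. 0.995
    ```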

  16. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies at the interface of biology, chemistry, and informatics. Most currently used drugs are small molecules that interact with proteins, so understanding protein-ligand interaction is central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand-interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship modeling (QMSPR) approach are to exploit sparse/incomplete information sources and to obtain more general models that cover larger parts of protein-ligand space than traditional approaches, which focus mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new, unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand-interaction modeling. An overview of the modeling process is presented, including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning-model induction, and validation. Concerns and issues specific to each step in this kind of data-driven modeling are discussed. © 2011 Bentham Science Publishers

  17. A model for the development of university curricula in nanoelectronics

    DEFF Research Database (Denmark)

    Bruun, Erik; Nielsen, I

    2010-01-01

    Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major influencers affecting developments in university programmes related to nanoelectronics are discussed and a model for university programme development is described. The model takes...... engineering. Examples of European curricula following this framework are identified and described. These examples may serve as sources of inspiration for future developments and the model...

  18. A Model for the Development of University Curricula in Nanoelectronics

    Science.gov (United States)

    Bruun, E.; Nielsen, I.

    2010-01-01

    Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major influencers affecting developments in university programmes related to nanoelectronics are discussed and a model for university programme development is described. The model takes into account that nanotechnology affects not only…

  19. Modelling of web-based virtual university administration for Nigerian ...

    African Journals Online (AJOL)

    This research work focused on the development of a model of web-based virtual university administration for Nigerian universities. This is necessary as there is still a noticeable administrative constraint in our universities, the establishment of many university web portals notwithstanding. More efforts are therefore needed to ...

  20. The universal function in color dipole model

    Science.gov (United States)

    Jalilian, Z.; Boroun, G. R.

    2017-10-01

    In this work we review the color dipole model and recall the properties of saturation and geometrical scaling in this model. Our primary aim is to determine the exact universal function in terms of the introduced scaling variable at distances different from the saturation radius. By including the quark mass in the calculation, we compute numerically the contribution of heavy production at small x to the total structure function through the fraction of universal functions, and show that geometrical scaling is established for our scaling variable in this study.

  1. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
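
    The paper's model is more elaborate than any single metric, but a common economic metric of this kind is the return on security investment (ROSI) built from annualised loss expectancies. The sketch below uses illustrative numbers, not data from the paper.

    ```python
    def ale(single_loss, annual_rate):
        """Annualised loss expectancy: loss per incident times incidents per year."""
        return single_loss * annual_rate

    def rosi(ale_before, ale_after, annual_cost):
        """Return on security investment: risk reduction net of the measure's
        cost, relative to that cost."""
        return (ale_before - ale_after - annual_cost) / annual_cost

    # Illustrative numbers: a control costing 10,000/yr cuts the expected
    # incident rate from 0.8 to 0.2 per year, at 50,000 loss per incident.
    before = ale(50_000, 0.8)            # 40,000 per year without the control
    after = ale(50_000, 0.2)             # 10,000 per year with it
    roi = rosi(before, after, 10_000)    # (40000 - 10000 - 10000) / 10000 = 2.0
    ```

    A positive ROSI means the measure removes more expected loss than it costs, which is the comparison the model's economic metrics formalise across alternative security technologies.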

  2. On the usability of quantitative modelling in operations strategy decision making

    NARCIS (Netherlands)

    Akkermans, H.A.; Bertrand, J.W.M.

    1997-01-01

    Quantitative modelling seems admirably suited to help managers in their strategic decision making on operations management issues, but in practice models are rarely used for this purpose. Investigates the reasons why, based on a detailed cross-case analysis of six cases of modelling-supported

  3. Universality of projectile fragmentation model

    International Nuclear Information System (INIS)

    Chaudhuri, G.; Mallik, S.; Das Gupta, S.

    2012-01-01

    Presently, projectile fragmentation is an important area of research, as it is used for the production of radioactive ion beams. In this work, the recently developed projectile fragmentation model with a universal temperature profile is used to study the charge distributions of different projectile fragmentation reactions, with different projectile-target combinations, at different incident energies. The model for projectile fragmentation consists of three stages: (i) abrasion, (ii) multifragmentation, and (iii) evaporation.

  4. Universal correlators for multi-arc complex matrix models

    International Nuclear Information System (INIS)

    Akemann, G.

    1997-01-01

    The correlation functions of the multi-arc complex matrix model are shown to be universal for any finite number of arcs. The universality classes are characterized by the support of the eigenvalue density and are conjectured to fall into the same classes as the ones recently found for the Hermitian model. This is explicitly shown to be true for the case of two arcs, apart from the known result for one arc. The basic tool is the iterative solution of the loop equation for the complex matrix model with multiple arcs, which provides all multi-loop correlators up to an arbitrary genus. Explicit results for genus one are given for any number of arcs. The two-arc solution is investigated in detail, including the double-scaling limit. In addition universal expressions for the string susceptibility are given for both the complex and Hermitian model. (orig.)

  5. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
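
    The time-step sensitivity noted above is easy to reproduce in miniature. The sketch below is a deliberately simplified stand-in for a vertex-model force law: a single overdamped "vertex" relaxing to its rest position under explicit Euler integration, which is stable only for sufficiently small time steps.

    ```python
    def relax(dt, steps=200, k=1.0):
        """Overdamped relaxation dx/dt = -k * (x - 1) integrated with explicit
        Euler. The scheme is stable only for dt < 2/k; with a larger time step
        the iterate oscillates and diverges instead of settling at x = 1."""
        x = 0.0
        for _ in range(steps):
            x += dt * (-k * (x - 1.0))
        return x

    x_small = relax(0.1)   # converges to the rest position 1.0
    x_large = relax(2.5)   # |1 - dt*k| > 1: the iteration blows up
    ```

    In a full vertex model the same effect is subtler (rearrangement thresholds interact with the step size), but the underlying issue is the one shown here: the numerical parameter changes the predicted state, not just its accuracy.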

  6. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. The output power of the generator is measured in a wind tunnel at air velocities of up to 15 m/s. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model quantitatively illustrates several technically important features of industrial wind turbines. (paper)
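
    The reported numbers are internally consistent with the standard turbine power relation P = c_p · ½ρAv³. A quick check, assuming an air density of about 1.2 kg/m³ (the abstract does not state it):

    ```python
    from math import pi

    def wind_power(v, radius, cp, rho=1.2):
        """Electric power from the turbine model: P = cp * (1/2) * rho * A * v**3,
        where A is the swept rotor area."""
        area = pi * radius**2
        return cp * 0.5 * rho * area * v**3

    # 12 cm rotor diameter -> 0.06 m radius, cp = 0.15, v = 15 m/s
    P = wind_power(15.0, 0.06, 0.15)   # approx. 3.4 W, matching the reported maximum
    ```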

  7. Quantitative Systems Pharmacology: A Case for Disease Models.

    Science.gov (United States)

    Musante, C J; Ramanujan, S; Schmidt, B J; Ghobrial, O G; Lu, J; Heatherington, A C

    2017-01-01

    Quantitative systems pharmacology (QSP) has emerged as an innovative approach in model-informed drug discovery and development, supporting program decisions from exploratory research through late-stage clinical trials. In this commentary, we discuss the unique value of disease-scale "platform" QSP models that are amenable to reuse and repurposing to support diverse clinical decisions in ways distinct from other pharmacometrics strategies. © 2016 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.

  8. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
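
    A minimal sketch of the sub-model idea, using ordinary least squares in place of PLS and synthetic two-regime "spectra": a full-range model makes a first-pass prediction that selects the composition regime, and the matching specialised sub-model makes the final prediction. All channel responses, regimes, and values below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    COEFFS = np.array([1.0, 0.5, 0.2, 0.1, 0.05])

    def make_data(n, low_regime):
        """Synthetic 5-channel 'spectra' whose response per unit concentration
        differs between two composition regimes (a stand-in for matrix effects)."""
        conc = rng.uniform(0, 50, n) if low_regime else rng.uniform(50, 100, n)
        scale = 1.0 if low_regime else 0.7
        X = np.outer(conc, COEFFS * scale) + rng.normal(0, 0.5, (n, 5))
        return X, conc

    def fit(X, y):
        # ordinary least squares with an intercept column
        w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
        return w

    def predict(w, X):
        return np.c_[X, np.ones(len(X))] @ w

    X_lo, y_lo = make_data(200, True)
    X_hi, y_hi = make_data(200, False)
    w_lo, w_hi = fit(X_lo, y_lo), fit(X_hi, y_hi)
    w_full = fit(np.vstack([X_lo, X_hi]), np.concatenate([y_lo, y_hi]))

    def blended(X):
        """First pass with the full-range model picks the regime; the matching
        sub-model then makes the final prediction."""
        first_pass = predict(w_full, X)
        return np.where(first_pass < 50, predict(w_lo, X), predict(w_hi, X))

    def spectrum(conc, low_regime):
        scale = 1.0 if low_regime else 0.7
        return conc * COEFFS * scale + rng.normal(0, 0.5, 5)

    pred_low = blended(spectrum(20.0, True)[None, :])[0]    # true value 20
    pred_high = blended(spectrum(80.0, False)[None, :])[0]  # true value 80
    ```

    The ChemCam implementation blends sub-model outputs more carefully near range boundaries; the hard switch here is only to make the two-stage structure explicit.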

  9. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by applying a qualitative model learning approach with an evolution strategy. The kinetic rates of the models generated from qualitative model learning are then further optimised by a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
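
    A toy version of the quantitative stage: simulated annealing over a single kinetic rate of a first-order decay model. The paper's models, proposal scheme, and cooling schedule are more elaborate; all values here are illustrative.

    ```python
    import math
    import random

    def simulate(k, times, y0=1.0):
        """First-order decay A -> B with rate k: a stand-in for a kinetic model."""
        return [y0 * math.exp(-k * t) for t in times]

    def cost(k, times, observed):
        """Sum of squared deviations between model output and observations."""
        return sum((m - o) ** 2 for m, o in zip(simulate(k, times), observed))

    def anneal(times, observed, k0=5.0, temp=1.0, cooling=0.995, steps=4000, seed=7):
        """Simulated annealing over the (single) kinetic rate: always accept
        improvements, accept worse rates with probability exp(-delta/temp)."""
        rng = random.Random(seed)
        k, e = k0, cost(k0, times, observed)
        for _ in range(steps):
            cand = abs(k + rng.gauss(0, 0.1))       # perturb the rate
            e_cand = cost(cand, times, observed)
            if e_cand < e or rng.random() < math.exp(-(e_cand - e) / temp):
                k, e = cand, e_cand
            temp *= cooling                         # cool the system
        return k

    times = [0.5 * i for i in range(10)]
    observed = simulate(0.8, times)   # synthetic "data" generated with k = 0.8
    k_fit = anneal(times, observed)   # should recover a rate close to 0.8
    ```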

  10. QUANTITATIVE ESTIMATION OF SOIL EROSION IN THE DRĂGAN RIVER WATERSHED WITH THE U.S.L.E. TYPE ROMSEM MODEL

    Directory of Open Access Journals (Sweden)

    Csaba HORVÁTH

    2008-05-01

    Sediment delivered from water erosion causes substantial waterway damage and water-quality degradation. A number of factors, such as drainage area size, basin slope, climate, and land use/land cover, may affect sediment delivery processes. The goal of this study is to define a computationally effective and suitable soil erosion model for the Drăgan river watershed, for future sedimentation studies. A Geographic Information System (GIS) is used to determine the Universal Soil Loss Equation (U.S.L.E.) model values for the studied water basin. The methods and approaches used in this study are expected to be applicable in future research and to watersheds in other regions.
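
    The USLE itself is a product of five factors, A = R·K·LS·C·P. A sketch with illustrative factor values (not measurements from the Drăgan watershed):

    ```python
    def usle_soil_loss(R, K, LS, C, P):
        """Universal Soil Loss Equation: A = R * K * LS * C * P, the average
        annual soil loss (t/ha/yr when the factors use metric units)."""
        return R * K * LS * C * P

    # Illustrative factor values: R rainfall erosivity, K soil erodibility,
    # LS slope length/steepness, C cover management, P support practice.
    A = usle_soil_loss(R=80.0, K=0.3, LS=2.5, C=0.2, P=1.0)   # approx. 12 t/ha/yr
    ```

    In a GIS workflow each factor is a raster layer, and the multiplication is carried out cell by cell over the watershed.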

  11. University-Industry Research Collaboration: A Model to Assess University Capability

    Science.gov (United States)

    Abramo, Giovanni; D'Angelo, Ciriaco Andrea; Di Costa, Flavia

    2011-01-01

    Scholars and policy makers recognize that collaboration between industry and the public research institutions is a necessity for innovation and national economic development. This work presents an econometric model which expresses the university capability for collaboration with industry as a function of size, location and research quality. The…

  12. Faculties of Education in Traditional Universities and Universities of the Third Age: A Partnership Model in Gerontagogy

    Science.gov (United States)

    Lemieux, Andre; Boutin, Gerald; Riendeau, Jean

    2007-01-01

    This article discusses "Universities of the Third Age", whose function is quite distinct from established universities' traditional role in teaching, research, and community services. Consequently, there is an urgent need to develop a model of partnership between traditional universities and Universities of the Third Age, ensuring better…

  13. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
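
    The paper resolves components without reference substances via ICA; the linearity it relies on, Beer-Lambert-Bouguer additivity, can be illustrated with simple least-squares unmixing when the component spectra are known. The two component spectra and concentrations below are hypothetical.

    ```python
    import numpy as np

    # Beer-Lambert-Bouguer linearity: a mixture spectrum is the concentration-
    # weighted sum of component spectra. Two hypothetical components, 6 wavelengths.
    S = np.array([[0.90, 0.70, 0.40, 0.20, 0.10, 0.05],   # component 1 at unit conc.
                  [0.10, 0.30, 0.60, 0.80, 0.50, 0.20]])  # component 2 at unit conc.

    true_conc = np.array([0.4, 1.5])
    mixture = true_conc @ S               # noise-free mixture spectrum

    # Recover the concentrations by least squares: solve S.T @ c = mixture.
    est, *_ = np.linalg.lstsq(S.T, mixture, rcond=None)
    ```

    ICA goes a step further than this sketch: it estimates the component spectra themselves from a set of mixtures, which is what removes the need for reference solutions.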

  14. THE MODEL OF UNIVERSAL BANKING SUPERMARKET IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Tatiana Manolievna GORDITSA

    2017-06-01

    The article presents the author's conceptual approach to the various scientific concepts of both traditional and universal banking services, and shows how the latter has been transformed into the model of the financial supermarket: the top of modern retail banking, a structure formed by the globalization of the finance and credit industry. The article analyses the category of "financial supermarket" and draws out a common idea based on the main features of this organizational model of banking services. The main features include: 1. complex banking services satisfying customers' needs; 2. bundling of banking and financial products (services); 3. product-line extension, standardization, and large-scale sales; 4. remote banking. The bundling of products (services) introduced in this model allows maximal integration of financial services, operations, and products, including banking, consulting, insurance, and investment services, at the same office. Analysis of the scientific literature shows that the organizational structure of service in a Ukrainian universal bank mostly corresponds to the financial supermarket model. However, current restrictions of the Ukrainian legal system, and the existence of a transition stage caused by the gradual adoption of innovations of both financial and technological origin (evolutionary-innovative development), are not taken into account. From this angle, the author describes a transition model from a universal bank to a financial supermarket: the universal banking supermarket. The model's distinctive feature is the application of improved service technology, which has driven the transformation of modern banking operations, services, and products in Ukraine from the simplest to the complex.

  15. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo...... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  16. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Science.gov (United States)

    2012-07-17

    Announcement of a technical workshop, "Use of Influenza Disease Models to Quantitatively Evaluate the Benefits and Risks of Vaccines", on the use of influenza disease models to generate quantitative estimates of the benefits and risks of influenza vaccination.

  17. Measuring organizational learning. Model testing in two Romanian universities

    OpenAIRE

    Alexandra Luciana Guţă

    2014-01-01

    The scientific literature associates organizational learning with superior organizational performance. With reference to the academic environment, we consider that universities can develop and reach better levels of performance through changes driven from the inside. Thus, through this paper we elaborate a conceptual model of organizational learning and test the model on a sample of employees (university teachers and researchers) from two Romanian universities. The model comprises the process of org...

  18. DEVELOPING A SEVEN METAPHORS MODEL OF MARKETING FOR UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    COITA Dorin-Cristian

    2014-12-01

    The concept of marketing applied in education offers many possibilities for social innovation. It is a tool that helps educational organizations acquire resources and provide value. This article presents a model of seven metaphors that a university can use to acquire resources and provide value to its stakeholders, and applies it to the case of a Romanian university, here called The University. The aim of the paper is to identify sources of social innovation by using this model in the field of educational marketing.

  19. Our universe as an attractor in a superstring model

    International Nuclear Information System (INIS)

    Maeda, Keiichi.

    1986-11-01

    One preferred scenario for the evolution of the universe is discussed in a superstring model. The universe can reach the present state as an attractor in the dynamical system. The kinetic terms of the "axions" play an important role, so that our present universe is realized almost uniquely. (author)

  20. Carolina Care at University of North Carolina Health Care: Implementing a Theory-Driven Care Delivery Model Across a Healthcare System.

    Science.gov (United States)

    Tonges, Mary; Ray, Joel D; Herman, Suzanne; McCann, Meghan

    2018-04-01

    Patient satisfaction is a key component of healthcare organizations' performance. Providing a consistent, positive patient experience across a system can be challenging. This article describes an organization's approach to achieving this goal by implementing a successful model developed at the flagship academic healthcare center across an 8-hospital system. The Carolina Care at University of North Carolina Health Care initiative has resulted in substantive qualitative and quantitative benefits including higher patient experience scores for both overall rating and nurse communication.

  1. A universal model of giftedness - adaptation of the Munich Model

    NARCIS (Netherlands)

    Jessurun, J.H.; Shearer, C.B.; Weggeman, M.C.D.P.

    2016-01-01

    The Munich Model of Giftedness (MMG) by Heller and his colleagues, developed for the identification of gifted children, is adapted and expanded, with the aim of making it more universally usable as a model for the pathway from talents to performance. On the side of the talent-factors, the concept of

  2. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies heavily on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and must be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of both models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions over the unknown parameters of a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task, and the differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed. The role of a degree-of-belief type probability in risk decision making is also discussed.
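
    The log-normal treatment of degree-of-belief distributions described in this abstract can be sketched with a simple classical pooling scheme. The estimator below (median plus 95% error factor per expert, linear pooling in log space) is an illustrative convention, not the report's specific classical or Bayesian model; the expert medians, error factors and weights are invented numbers.

```python
import math

def pool_lognormal(medians, error_factors, weights=None):
    """Pool expert judgements of an uncertain rate, each expressed as a
    log-normal distribution via a median and a 95% error factor
    EF = p95 / median = exp(1.645 * sigma).  Pooling is done in log space,
    which keeps the combined distribution log-normal."""
    n = len(medians)
    if weights is None:
        weights = [1.0 / n] * n
    # log-normal parameters per expert: mu = ln(median), sigma = ln(EF)/1.645
    mus = [math.log(m) for m in medians]
    sigmas = [math.log(ef) / 1.645 for ef in error_factors]
    # weighted linear pooling of log-space means and variances
    mu = sum(w * m for w, m in zip(weights, mus))
    var = sum(w * s ** 2 for w, s in zip(weights, sigmas))
    return math.exp(mu), math.exp(1.645 * math.sqrt(var))

# Two experts judging an initiator event frequency (per year)
median, ef = pool_lognormal([1e-4, 4e-4], [10.0, 3.0])
```

    With equal weights the pooled median is the geometric mean of the expert medians (here 2e-4), while the pooled error factor reflects both experts' dispersion.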

  3. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies heavily on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and must be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of both models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions over the unknown parameters of a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task, and the differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed. The role of a degree-of-belief type probability in risk decision making is also discussed.

  4. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  5. Quantitative model of New Zealand's energy supply industry

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B. R. [Victoria Univ., Wellington, (New Zealand); Lucas, P. D. [Ministry of Energy Resources (New Zealand)

    1977-10-15

    A mathematical model is presented to assist in the analysis of available energy policy options. The model is based on an engineering-oriented description of New Zealand's energy supply and distribution system. The system is cast as a linear program in which energy demand is satisfied at least cost. The capacities and operating modes of process plant (such as power stations, oil refinery units, and LP-gas extraction plants) are determined by the model, as well as the optimal mix of fuels supplied to final consumers. Policy analysis with the model enables a wide-ranging assessment of alternatives and uncertainties within a consistent quantitative framework. The model is intended to be used as a tool to investigate the relative effects of various policy options, rather than to present a definitive plan for satisfying the nation's energy requirements.
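
    The least-cost linear-programming formulation described in the abstract can be sketched in a few lines. The three supply processes, their costs and capacities below are invented toy numbers, not the New Zealand data.

```python
from scipy.optimize import linprog

# Toy least-cost supply model: three supply processes meet a single energy
# demand at minimum total cost, subject to plant capacity limits.
costs = [5.0, 8.0, 12.0]          # $/PJ delivered, per process
capacities = [40.0, 30.0, 50.0]   # PJ/yr capacity of each process
demand = 60.0                     # PJ/yr to be satisfied

res = linprog(
    c=costs,                      # minimise total supply cost
    A_ub=[[-1.0, -1.0, -1.0]],    # -(x1+x2+x3) <= -demand, i.e. supply >= demand
    b_ub=[-demand],
    bounds=[(0.0, cap) for cap in capacities],
)
supply = res.x                    # optimal mix: cheapest plant first -> [40, 20, 0]
```

    The optimum fills the cheapest plant to capacity and meets the remainder with the next-cheapest, which is exactly the "demand satisfied at least cost" behaviour the model formalises.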

  6. A general mixture model for mapping quantitative trait loci by using molecular markers

    NARCIS (Netherlands)

    Jansen, R.C.

    1992-01-01

    In a segregating population a quantitative trait may be considered to follow a mixture of (normal) distributions, the mixing proportions being based on Mendelian segregation rules. A general and flexible mixture model is proposed for mapping quantitative trait loci (QTLs) by using molecular markers.
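
    The core idea of the abstract, a quantitative trait following a mixture of normals with mixing proportions fixed by Mendelian segregation, can be sketched with a minimal EM fit. This is a bare two-component illustration with a 1:1 segregation ratio (as in a backcross) and known common variance, not Jansen's full marker-based model; the simulated trait means are invented.

```python
import math
import random

def em_two_normal_mixture(y, p1=0.5, sigma=1.0, iters=200):
    """EM for a two-component normal mixture with KNOWN mixing proportion
    (e.g. 1:1 Mendelian segregation in a backcross) and known common sigma;
    only the two genotype means are estimated."""
    mu1, mu2 = min(y), max(y)                    # crude initial values
    for _ in range(iters):
        # E-step: posterior probability each observation came from component 1
        w = []
        for yi in y:
            d1 = p1 * math.exp(-0.5 * ((yi - mu1) / sigma) ** 2)
            d2 = (1 - p1) * math.exp(-0.5 * ((yi - mu2) / sigma) ** 2)
            w.append(d1 / (d1 + d2))
        # M-step: responsibility-weighted means
        mu1 = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        mu2 = sum((1 - wi) * yi for wi, yi in zip(w, y)) / sum(1 - wi for wi in w)
    return mu1, mu2

random.seed(1)
y = [random.gauss(0.0, 1.0) for _ in range(300)] + \
    [random.gauss(4.0, 1.0) for _ in range(300)]   # two genotype classes
mu1, mu2 = em_two_normal_mixture(y)
```

    Fixing the mixing proportion at the Mendelian value is what distinguishes this genetic setting from generic mixture fitting, where the proportion would also be estimated.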

  7. Sloppy-model universality class and the Vandermonde matrix.

    Science.gov (United States)

    Waterfall, Joshua J; Casey, Fergal P; Gutenkunst, Ryan N; Brown, Kevin S; Myers, Christopher R; Brouwer, Piet W; Elser, Veit; Sethna, James P

    2006-10-13

    In a variety of contexts, physicists study complex, nonlinear models with many unknown or tunable parameters to explain experimental data. We explain why such systems so often are sloppy: the system behavior depends only on a few "stiff" combinations of the parameters and is unchanged as other "sloppy" parameter combinations vary by orders of magnitude. We observe that the eigenvalue spectra for the sensitivity of sloppy models have a striking, characteristic form with a density of logarithms of eigenvalues which is roughly constant over a large range. We suggest that the common features of sloppy models indicate that they may belong to a common universality class. In particular, we motivate focusing on a Vandermonde ensemble of multiparameter nonlinear models and show in one limit that they exhibit the universal features of sloppy models.
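
    The "sloppy" eigenvalue spectrum described in the abstract is easy to reproduce numerically: fit a multi-exponential model and examine the eigenvalues of the sensitivity matrix J^T J. The model and rate constants below are an illustrative example, not the paper's Vandermonde ensemble itself.

```python
import numpy as np

# Sloppy-model sensitivity spectrum for y(t, theta) = sum_k exp(-theta_k * t):
# eigenvalues of J^T J typically spread over many decades, roughly uniform in log.
t = np.linspace(0.0, 5.0, 100)
thetas = np.array([0.3, 1.0, 3.0, 10.0, 30.0])   # illustrative decay rates
# Jacobian of the model wrt each parameter: d y / d theta_k = -t * exp(-theta_k t)
J = np.stack([-t * np.exp(-th * t) for th in thetas], axis=1)
eigs = np.linalg.eigvalsh(J.T @ J)[::-1]         # descending order
spread = eigs[0] / eigs[-1]                       # spans many orders of magnitude
```

    The huge ratio between the stiffest and sloppiest eigenvalues is the characteristic signature the authors attribute to the sloppy universality class.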

  8. A Review of Research on Universal Design Educational Models

    Science.gov (United States)

    Rao, Kavita; Ok, Min Wook; Bryant, Brian R.

    2014-01-01

    Universal design for learning (UDL) has gained considerable attention in the field of special education, acclaimed for its promise to promote inclusion by supporting access to the general curriculum. In addition to UDL, there are two other universal design (UD) educational models referenced in the literature, universal design of instruction (UDI)…

  9. Corporatized Higher Education: A Quantitative Study Examining Faculty Motivation Using Self-Determination Theory

    Science.gov (United States)

    Brown, Aaron D.

    2016-01-01

    The intent of this research is to offer a quantitative analysis of self-determined faculty motivation within the current corporate model of higher education across public and private research universities. With such a heightened integration of accountability structures, external reward systems, and the ongoing drive for more money and…

  10. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Full Text Available Quantitative modeling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data describing the system's dynamics must be known in order to obtain relevant results with conventional modeling techniques, and such data are frequently difficult or even impossible to obtain. Here, we present a quantitative fuzzy logic modeling approach that can cope with unknown kinetic data and thus produce relevant results even though the dynamic data are incomplete or only vaguely defined. Moreover, the approach can be used in combination with existing state-of-the-art quantitative modeling methods in those parts of the system where data are missing. The case study of the proposed approach is performed on a nine-gene model. We propose a type of FPN model based on fuzzy sets to handle the quantitative modeling of biological systems. Tests of our model show that it is practical and quite powerful for data imitation and reasoning in fuzzy expert systems.
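
    The fuzzy Petri net mechanism can be sketched minimally: places carry truth degrees in [0, 1], and a transition fires by combining its input degrees with the rule's certainty factor. The min-product convention and the tiny gene-regulation rule below are common illustrative choices, not the operators or network of this particular paper.

```python
def fire_transition(input_degrees, certainty):
    """Fire one fuzzy Petri net transition: the degree deposited in the
    output place is the weakest input degree scaled by the rule's
    certainty factor (min-product inference, one common convention)."""
    return min(input_degrees) * certainty

# Tiny rule for a gene G activated by A and repressed by B:
# R1: IF A_high AND B_low THEN G_expressed  (certainty 0.9)
a_high, b_low = 0.8, 0.6       # fuzzified expression levels (hypothetical)
g_expressed = fire_transition([a_high, b_low], 0.9)
```

    Because firing needs only membership degrees and a certainty factor, the scheme works even where precise kinetic rate constants are unavailable, which is the point the abstract emphasises.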

  11. Self-bridging of vertical silicon nanowires and a universal capacitive force model for spontaneous attraction in nanostructures.

    Science.gov (United States)

    Sun, Zhelin; Wang, Deli; Xiang, Jie

    2014-11-25

    Spontaneous attractions between free-standing nanostructures have often caused adhesion or stiction that affects a wide range of nanoscale devices, particularly nano/microelectromechanical systems. Previous understandings of the attraction mechanisms have included capillary force, van der Waals/Casimir forces, and surface polar charges. However, none of these mechanisms universally applies to simple semiconductor structures such as silicon nanowire arrays that often exhibit bunching or adhesions. Here we propose a simple capacitive force model to quantitatively study the universal spontaneous attraction that often causes stiction among semiconductor or metallic nanostructures such as vertical nanowire arrays with inevitably nonuniform size variations due to fabrication. When nanostructures are uniform in size, they share the same substrate potential. The presence of slight size differences will break the symmetry in the capacitive network formed between the nanowires, substrate, and their environment, giving rise to electrostatic attraction forces due to the relative potential difference between neighboring wires. Our model is experimentally verified using arrays of vertical silicon nanowire pairs with varied spacing, diameter, and size differences. Threshold nanowire spacing, diameter, or size difference between the nearest neighbors has been identified beyond which the nanowires start to exhibit spontaneous attraction that leads to bridging when electrostatic forces overcome elastic restoration forces. This work illustrates a universal understanding of spontaneous attraction that will impact the design, fabrication, and reliable operation of nanoscale devices and systems.
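
    The capacitive attraction idea can be illustrated with the textbook two-parallel-wire capacitance and the electrostatic force F' = (1/2) V^2 dC'/dx. This sketch treats only an isolated wire pair at a fixed potential difference; the paper's model additionally includes the capacitive network to the substrate and environment, and the geometry and voltage below are invented.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def wire_pair_capacitance(d, a):
    """Capacitance per unit length (F/m) of two parallel cylindrical wires of
    radius a at centre-to-centre separation d (textbook two-wire formula)."""
    return math.pi * EPS0 / math.acosh(d / (2 * a))

def attraction_per_length(d, a, V, h=1e-12):
    """Electrostatic attraction per unit length from the energy gradient,
    F' = (1/2) V^2 * (-dC'/dd), via a central-difference derivative."""
    dC = (wire_pair_capacitance(d - h, a) - wire_pair_capacitance(d + h, a)) / (2 * h)
    return 0.5 * V ** 2 * dC

# Two 50 nm radius nanowires, 300 nm apart, 10 mV potential difference
F = attraction_per_length(300e-9, 50e-9, 0.010)   # N per metre of wire length
```

    Even a few millivolts of relative potential between neighbouring wires yields a finite attractive force per unit length, which for long, slender wires can compete with the elastic restoring force, consistent with the bridging threshold behaviour reported in the abstract.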

  12. Faster universal modeling for two source classes

    NARCIS (Netherlands)

    Nowbakht, A.; Willems, F.M.J.; Macq, B.; Quisquater, J.-J.

    2002-01-01

    The Universal Modeling algorithms proposed in [2] for two general classes of finite-context sources are reviewed. The above methods were constructed by viewing a model structure as a partition of the context space and realizing that a partition can be reached through successive splits. Here we start

  13. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy from conventional conjugate view methods is limited by overlap of projections from different organs and background activity, and attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millenium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of 111 In ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. 
We conclude that the QPlanar method, based

  14. Baryogenesis model predicting antimatter in the Universe

    International Nuclear Information System (INIS)

    Kirilova, D.

    2003-01-01

    Cosmic ray and gamma-ray data do not rule out antimatter domains in the Universe, separated from us by distances greater than 10 Mpc. Hence, it is interesting to analyze the possible generation of vast antimatter structures during the early Universe's evolution. We discuss a SUSY-condensate baryogenesis model predicting large separated regions of matter and antimatter. The model generates the small locally observed baryon asymmetry for natural initial conditions and predicts vast antimatter domains, separated from the matter ones by baryonically empty voids. The characteristic scale of the antimatter regions and their distance from the matter ones are in accordance with observational constraints from cosmic ray, gamma-ray and cosmic microwave background anisotropy data

  15. Measuring effectiveness of a university by a parallel network DEA model

    Science.gov (United States)

    Kashim, Rosmaini; Kasim, Maznah Mat; Rahman, Rosshairy Abd

    2017-11-01

    Universities contribute significantly to the development of human capital and to the socio-economic improvement of a country. Accordingly, Malaysian universities have carried out various initiatives to improve their performance. Most studies have used the Data Envelopment Analysis (DEA) model to measure efficiency rather than effectiveness, even though measuring effectiveness is important for understanding how effective a university is in achieving its ultimate goals. A university system has two major functions, namely teaching and research, and each function has different resources based on its emphasis. A university is therefore structured as a parallel production system whose overall effectiveness is the aggregated effectiveness of teaching and research. Hence, this paper proposes a parallel network DEA model to measure the effectiveness of a university. The model takes the internal operations of both the teaching and research functions into account in computing the effectiveness of a university system. In the literature, graduates and the number of programmes offered are defined as the outputs, while employed graduates and the number of programmes accredited by professional bodies are considered the outcomes for measuring teaching effectiveness. The amount of grant funding is regarded as the output of research, while publications of different quality levels are considered the outcomes of research. A system is considered effective only if all of its functions are effective. The model has been tested on a hypothetical data set of 14 faculties at a public university in Malaysia. The results show that none of the faculties is effective in overall performance; three faculties are effective in teaching and two are effective in research.
    The parallel network DEA model allows the top management of a university to identify weaknesses in any of its functions and to take rational steps for improvement.
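
    The single-stage DEA building block underlying such models can be sketched as an input-oriented CCR envelopment program. The parallel network extension in the paper splits each university into teaching and research stages; the plain one-stage version below, with three invented faculties, one input and one output, shows only the basic efficiency computation.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR (constant returns to scale) DEA efficiency of unit k.
    inputs: (n_units, n_inputs); outputs: (n_units, n_outputs).
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta x_k,
                             sum_j lam_j y_j >= y_k,  lam >= 0."""
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # decision vars: [theta, lam_1..lam_n]
    A_ub, b_ub = [], []
    for i in range(m):                           # input constraints
        A_ub.append(np.concatenate(([-inputs[k, i]], inputs[:, i])))
        b_ub.append(0.0)
    for r in range(s):                           # output constraints
        A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
        b_ub.append(-outputs[k, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

# Three hypothetical faculties: one input (staff), one output (graduates)
x = np.array([[10.0], [20.0], [10.0]])
y = np.array([[100.0], [200.0], [50.0]])
effs = [ccr_efficiency(x, y, k) for k in range(3)]
```

    Faculties on the efficient frontier score 1.0; the third faculty, producing half the graduates per staff member, scores 0.5. The network variant applies the same machinery to each internal stage and aggregates the results.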

  16. Are Universities Role Models for Communities? A Gender Perspective

    OpenAIRE

    Felicia Cornelia MACARIE; Octavian MOLDOVAN

    2012-01-01

    The present paper explores the degree to which universities could/should serve as role models for communities from the perspective of gender integration. Although the theoretical/moral answer would be affirmative (universities should be in such a position that would allow local communities to regard them as role models of gender integration), the primary empirical analysis leads to another conclusion. A brief theoretical review (that connects gender discrimination, sustainable development, u...

  17. Statistical effect of interactions on particle creation in expanding universe

    International Nuclear Information System (INIS)

    Kodama, Hideo

    1982-01-01

    The statistical effect of interactions, which drives many-particle systems toward equilibrium, is expected to change the qualitative and quantitative features of particle creation in an expanding universe. To investigate this problem, a simplified model called the finite-time reduction model is formulated and applied to scalar particle creation in the radiation-dominated Friedmann universe. The number density of created particles and the entropy production due to particle creation are estimated. The result for the number density is compared with that of the conventional free field theory. It is shown that the statistical effect increases particle creation and lengthens the active creation period. As for the entropy production, it is shown to be negligible for scalar particles in the Friedmann universe. (author)

  18. University Advertising and Universality in Messaging

    Science.gov (United States)

    Diel, Stan R.; Katsinas, Stephen

    2018-01-01

    University and college institutional advertisements, which typically are broadcast as public service announcements during the halftime of football games, were the subject of a quantitative analysis focused on commonality in messaging and employment of the semiotic theory of brand advertising. Findings indicate advertisements focus on students'…

  19. Are Universities Role Models for Communities? A Gender Perspective

    Directory of Open Access Journals (Sweden)

    Felicia Cornelia MACARIE

    2012-12-01

    Full Text Available The present paper explores the degree to which universities could/should serve as role models for communities from the perspective of gender integration. Although the theoretical/moral answer would be affirmative (universities should be in such a position that would allow local communities to regard them as role models of gender integration), the primary empirical analysis leads to another conclusion. A brief theoretical review (that connects gender discrimination, sustainable development, universities and local communities) is followed by an empirical analysis that compares the management structures of 12 Romanian Universities of Advanced Research and Education (the best Romanian universities according to a national ranking) with those of four local communities where they are located (as geographic proximity would lead to a better diffusion of best practices). Contrary to initial expectations, even in higher education institutions women are underrepresented both in executive and legislative positions. Since universities are subject to the same major patterns of gender discrimination (such as role theory, the glass ceiling and the glass escalator) as private and public organizations, they lose the moral high ground that theory would suggest. However, medicine and pharmacy universities, which can be connected with the traditional roles attributed to women, provide better gender integration, though glass escalator phenomena remain present even in these limited fields.

  20. Quantitative Analysis of Criteria in University Building Maintenance in Malaysia

    Directory of Open Access Journals (Sweden)

    Olanrewaju Ashola Abdul-Lateef

    2010-10-01

    Full Text Available University buildings are a significant part of university assets, and considerable resources are committed to their design, construction and maintenance. The core of maintenance management is to optimize productivity and user satisfaction with optimum resources. An important segment of a maintenance management system is the analysis of the criteria that influence building maintenance. This paper therefore aims to identify, quantify, rank and discuss the criteria that influence maintenance costs, maintenance backlogs, productivity and user satisfaction in Malaysian university buildings. The paper reviews the related literature and presents the outcomes of a questionnaire survey administered to 50 university maintenance organizations. Thirty-one criteria were presented to the organizations, which evaluated the degree to which each criterion influences building maintenance management. With a 66% response rate, it was concluded that consideration of these criteria is critical to the university building maintenance management system. The quality of components and materials, budget constraints and the age of the building were found to be the most influential criteria, whereas information on user performance satisfaction, problems associated with the in-house workforce, and shortages of materials and components were the least influential. The paper also concludes that maintenance management is a strategic function in university administration.

  1. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The combined qualitative and quantitative nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modelling: it allows the qualitative and quantitative models to complement each other, each using its strengths to make up for the other's deficiencies. The combined model overcomes the qualitative model's inability to be applied and verified quantitatively, as well as the high cost and long time required to repeatedly construct and verify a quantitative model, making it more practical and efficient, which is of great significance for nonlinear dynamics. The combined modelling and model-analysis method proposed here is not limited to nonlinear dynamics; it can also be adopted in the modelling and analysis of other fields. Additionally, it satisfactorily resolves the problems with the existing analytical methods for the price system's nonlinear dynamics model. The three-dimensional dynamics model of price, supply–demand ratio and selling rate established in this paper estimates optimal commodity prices from the model results, thereby providing a theoretical basis for the government's macro-control of prices; it also offers theoretical guidance on how to enhance people's purchasing power and consumption levels through price regulation, and hence improve living standards.
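
    A three-variable nonlinear dynamics model of this kind can be sketched by coupling price p, supply–demand ratio r and selling rate q and integrating forward in time. The functional forms and coefficients below are hypothetical illustrations of such a coupling, not the equations of the paper.

```python
# Hypothetical three-dimensional price dynamics, integrated with Euler steps.
a, b, d, c, p0 = 0.5, 0.2, 0.1, 0.3, 1.0   # illustrative coefficients
p, r, q = 1.5, 0.8, 0.5                    # initial price, ratio, selling rate
dt, steps = 0.01, 20000                    # t in [0, 200]
for _ in range(steps):
    dp = a * p * (1.0 - r)                 # price rises while demand outstrips supply
    dr = b * (p - p0) - d * (r - 1.0)      # high price attracts supply (damped)
    dq = c * (1.0 / p - q)                 # selling rate relaxes toward demand at price p
    p, r, q = p + dt * dp, r + dt * dr, q + dt * dq
```

    With these (assumed) forms the system spirals into the equilibrium p = p0, r = 1, q = 1/p0, and the equilibrium price is the kind of "best commodity price" estimate the abstract refers to.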

  2. Probing Models of Dark Matter and the Early Universe

    Science.gov (United States)

    Orlofsky, Nicholas David

    This thesis discusses models for dark matter (DM) and their behavior in the early universe. An important question is how phenomenological probes can directly search for signals of DM today. Another topic of investigation is how the DM and other processes in the early universe must evolve. Then, astrophysical bounds on early universe dynamics can constrain DM. We will consider these questions in the context of three classes of DM models--weakly interacting massive particles (WIMPs), axions, and primordial black holes (PBHs). Starting with WIMPs, we consider models where the DM is charged under the electroweak gauge group of the Standard Model. Such WIMPs, if generated by a thermal cosmological history, are constrained by direct detection experiments. To avoid present or near-future bounds, the WIMP model or cosmological history must be altered in some way. This may be accomplished by the inclusion of new states that coannihilate with the WIMP or a period of non-thermal evolution in the early universe. Future experiments are likely to probe some of these altered scenarios, and a non-observation would require a high degree of tuning in some of the model parameters in these scenarios. Next, axions, as light pseudo-Nambu-Goldstone bosons, are susceptible to quantum fluctuations in the early universe that lead to isocurvature perturbations, which are constrained by observations of the cosmic microwave background (CMB). We ask what it would take to allow axion models in the face of these strong CMB bounds. We revisit models where inflationary dynamics modify the axion potential and discuss how isocurvature bounds can be relaxed, elucidating the difficulties in these constructions. Avoiding disruption of inflationary dynamics provides important limits on the parameter space. Finally, PBHs have received interest in part due to observations by LIGO of merging black hole binaries. We ask how these PBHs could arise through inflationary models and investigate the opportunity

  3. INTRAVAL Phase 2: Modeling testing at the Las Cruces Trench Site

    International Nuclear Information System (INIS)

    Hills, R.G.; Rockhold, M.; Xiang, J.; Scanlon, B.; Wittmeyer, G.

    1994-01-01

    Several field experiments have been performed by scientists from the University of Arizona and New Mexico State University at the Las Cruces Trench Site to provide data to test deterministic and stochastic models of water flow and solute transport. These experiments were performed in collaboration with INTRAVAL, an international effort toward validation of geosphere models for the transport of radionuclides. During Phase I of INTRAVAL, qualitative comparisons between experimental data and model predictions were made using contour plots of water contents and solute concentrations; detailed quantitative comparisons were not made. To provide data for more rigorous model testing, a third Las Cruces Trench experiment was designed by scientists from the University of Arizona and New Mexico State University. Modelers from the Center for Nuclear Waste Regulatory Analysis, Massachusetts Institute of Technology, New Mexico State University, Pacific Northwest Laboratory, and the University of Texas provided predictions of water flow and tritium transport to New Mexico State University for analysis. The corresponding models assumed soil characterizations ranging from uniform to deterministically heterogeneous to stochastic. This report presents detailed quantitative comparisons to field data.

  4. Modeling Environmental Literacy of University Students

    Science.gov (United States)

    Teksoz, Gaye; Sahin, Elvan; Tekkaya-Oztekin, Ceren

    2012-01-01

    The present study proposed an Environmental Literacy Components Model to explain how environmental attitudes, environmental responsibility, environmental concern, and environmental knowledge as well as outdoor activities related to each other. A total of 1,345 university students responded to an environmental literacy survey (Kaplowitz and Levine…

  5. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    International Nuclear Information System (INIS)

    Bindschadler, Michael; Alessio, Adam M; Modgil, Dimple; La Riviere, Patrick J; Branch, Kelley R

    2014-01-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml min^-1 g^-1; cardiac output = 3, 5, 8 L min^-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. Acquisitions that reduce radiation exposure were implemented by varying both the temporal sampling (1, 2, and 3 s sampling intervals) and the tube current-time product (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially distributed model, and the adiabatic approximation to the tissue homogeneity model) and a qualitative slope-based method. In total, over 11 000 time-attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam-hardening correction, the slope method consistently underestimated flow by 47.5% on average, while the quantitative models provided estimates with less than 6.5% average bias and variance that increased with increasing dose reduction. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. This
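
    The compartment-model fitting that underlies such MBF estimates can be sketched with a one-tissue (two-compartment) model, dCt/dt = K1*Ca(t) - k2*Ct, where K1 tracks flow. The bolus shape, rate constants and grid-search fit below are a minimal noise-free illustration, not the study's models or its nonlinear estimation procedure.

```python
import math

def tissue_curve(K1, k2, aif, dt):
    """Tissue time-attenuation curve from the one-tissue compartment model
    dCt/dt = K1*Ca(t) - k2*Ct, integrated with simple Euler steps."""
    ct, out = 0.0, []
    for ca in aif:
        out.append(ct)
        ct += dt * (K1 * ca - k2 * ct)
    return out

dt = 0.5                                          # s per frame
t = [i * dt for i in range(120)]
aif = [3.0 * math.exp(-((ti - 12.0) / 4.0) ** 2) for ti in t]  # bolus-shaped input
true = tissue_curve(1.2, 0.3, aif, dt)            # "measured" data, K1 = 1.2, k2 = 0.3

# Least-squares grid search over (K1, k2) -- a crude stand-in for proper
# nonlinear fitting -- recovers the generating parameters.
best = min(
    ((K1, k2) for K1 in [0.8 + 0.05 * i for i in range(17)]
              for k2 in [0.1 + 0.05 * j for j in range(9)]),
    key=lambda pars: sum((m - s) ** 2 for m, s in
                         zip(true, tissue_curve(pars[0], pars[1], aif, dt))),
)
```

    In the noise-free case the fit is exact; the study's point is precisely how estimates like K1 degrade in bias and variance once realistic CT noise and sparse temporal sampling are added.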

  6. Intermediate care: for better or worse? Process evaluation of an intermediate care model between a university hospital and a residential home

    Directory of Open Access Journals (Sweden)

    Janmaat Tonnie ACM

    2005-05-01

    Full Text Available Abstract Background Intermediate care was developed in order to bridge acute, primary and social care, primarily for elderly persons with complex care needs. Such bridging initiatives are intended to reduce hospital stays and improve continuity of care. Although many models assume positive effects, it is often ambiguous what the benefits are and whether they can be transferred to other settings. This is due to the heterogeneity of intermediate care models and the variety of collaborating partners that set up such models. Quantitative evaluation captures only a limited series of generic structure, process and outcome parameters. More detailed information is needed to assess the dynamics of intermediate care delivery, and to find ways to improve the quality of care. Against this background, the functioning of a low-intensity early discharge model of intermediate care, set up in a residential home for patients released from an Amsterdam university hospital, has been evaluated. The aim of this study was to produce knowledge for management to improve quality of care, and to provide more generalisable insights into the accumulated impact of such a model. Methods A process evaluation was carried out using quantitative and qualitative methods. Registration forms and patient questionnaires were used to quantify the patient population in the model. Statistical analysis encompassed t-tests and chi-squared tests to assess significance. Semi-structured interviews were conducted with 21 staff members representing all disciplines working with the model. Interviews were transcribed and analysed using both 'open' and 'framework' approaches. Results Despite high expectations, there were significant problems. A heterogeneous patient population, a relatively unqualified staff and cultural differences between the collaborating partners impeded implementation and had an impact on the functioning of the model. Conclusion We concluded that setting up a low intensity

  7. Rates of Student Disciplinary Action in Australian Universities

    Science.gov (United States)

    Lindsay, Bruce

    2010-01-01

    Although a growing body of research has been conducted on student misconduct in universities, quantitative data on disciplinary action undertaken by institutions against student transgressions are largely absent from the literature. This paper provides baseline quantitative data on disciplinary action against students in Australian universities. It is…

  8. Quantitative Literacy Courses as a Space for Fusing Literacies

    Science.gov (United States)

    Tunstall, Samuel Luke; Matz, Rebecca L.; Craig, Jeffrey C.

    2016-01-01

    In this article, we examine how students in a general education quantitative literacy course reason with public issues when unprompted to use quantitative reasoning. Michigan State University, like many institutions, not only has a quantitative literacy requirement for all undergraduates but also offers two courses specifically for meeting the…

  9. Roles of University Support for International Students in the United States: Analysis of a Systematic Model of University Identification, University Support, and Psychological Well-Being

    Science.gov (United States)

    Cho, Jaehee; Yu, Hongsik

    2015-01-01

    Unlike previous research on international students' social support, this current study applied the concept of organizational support to university contexts, examining the effects of university support. Mainly based on the social identity/self-categorization stress model, this study developed and tested a path model composed of four key…

  10. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  11. Assessing College Students’ Quantitative and Scientific Reasoning: The James Madison University Story

    Directory of Open Access Journals (Sweden)

    John D. Hathcoat

    2015-01-01

    Full Text Available Quantitative and scientific reasoning is a critical student learning outcome in higher education. Data are presented for large samples of undergraduate students who were assessed as entering freshmen and then again after completing 45-70 credit hours. Results are presented around four key issues that are central to educational assessment. First, entering freshmen with transfer credits for quantitative and scientific reasoning courses that fulfill general education requirements, on average, score similarly to entering freshmen without such credit. About 97% of entering freshmen who had transfer credits received their credits through dual enrollment programs. As sophomores/juniors, students who had completed their general education requirements performed similarly to students who had started, but not yet finished, these requirements. Second, small to moderate correlations were observed between grade-point averages in relevant general education coursework and quantitative and scientific reasoning. Third, students’ quantitative and scientific reasoning, on average, increases from the freshman to sophomore/junior years. Finally, the proportion of students who meet faculty-set standards substantially increases from pre-test to post-test. Taken together, results suggest that changes in quantitative and scientific reasoning are a function of relevant courses. Additional research is needed to examine the role of lower-level versus higher-level courses in student performance. Results also indicate a need to investigate how differences in the quality of dual enrollment courses facilitate quantitative and scientific reasoning.

  12. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
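
The Laviron treatment referenced above gives, for an ideal surface-confined reversible couple, a voltammetric wave that peaks at the formal potential E0' with a theoretical full width at half maximum of about 91 mV at room temperature (for n = 1). A sketch of that ideal Nernstian wave, with hypothetical parameter values:

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.0      # C/mol, J/(mol K), K
n = 1                                # electrons transferred
E0 = -0.40                           # formal potential vs reference, V (hypothetical)
nu = 0.05                            # scan rate, V/s
A_gamma = 1e-10                      # electrode area * surface coverage, mol (hypothetical)

E = np.linspace(-0.7, -0.1, 601)     # potential sweep, V
x = n * F * (E - E0) / (R * T)
# Ideal Nernstian surface wave: i = (n^2 F^2 nu A Gamma / R T) * e^x / (1 + e^x)^2
i = (n**2 * F**2 * nu * A_gamma / (R * T)) * np.exp(x) / (1.0 + np.exp(x)) ** 2

E_peak = E[np.argmax(i)]             # peak sits at the formal potential E0'
fwhm_theory = 3.53 * R * T / (n * F) # ideal FWHM, ~0.091 V at 298 K
```

Fitting measured TERS CVs to this shape (or its kinetic generalizations) is what allows E0' to be extracted per location, as in the study above.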

  13. Explaining formation of Astronomical Jets using Dynamic Universe Model

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    2016-07-01

    Astronomical jets are observed from the centres of many galaxies, including our own Milky Way. The formation of such jets is explained using SITA simulations of the Dynamic Universe Model. For this purpose the path traced by a test neutron is calculated and depicted using a set-up of one dense mass equivalent to the mass of the Galaxy centre, 90 stars with masses similar to those of stars near the Galaxy centre, mass equivalents of 23 globular cluster groups, 16 Milky Way parts, and the Andromeda and Triangulum galaxies at appropriate distances. Five different kinds of theoretical simulations gave positive results. The path travelled by this test neutron was found to be an astronomical jet emerging from the Galaxy centre. This is another result from the Dynamic Universe Model. It solves new problems such as (a) the variable-mass rocket trajectory problem, (b) explaining very long baseline interferometry (VLBI) observations, (c) astronomical jets observed from the Milky Way centre, (d) prediction of blueshifted galaxies, (e) explaining the Pioneer anomaly, and (f) prediction of the New Horizons satellite trajectory. The Dynamic Universe Model never reduces to general relativity under any condition. It uses a different type of mathematics based on Newtonian physics. The mathematics used here is simple and straightforward. As there are no differential equations present in the Dynamic Universe Model, the set of equations gives a single solution in x, y, z Cartesian coordinates for every point mass at every time step.
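
The SITA simulation procedure itself is not publicly specified, but the abstract's description — Newtonian gravity from a set of point masses, advanced step by step in Cartesian coordinates with no differential-equation solver — can be sketched generically. The masses, positions and step size below are illustrative assumptions, not the study's configuration:

```python
import numpy as np

# Hypothetical set-up: one dominant central mass plus two minor masses standing
# in for the mass distribution around a galactic centre.
G = 6.674e-11                                    # m^3 kg^-1 s^-2
masses = np.array([1.0e36, 1.0e30, 1.0e30])      # kg
sources = np.array([[0.0, 0.0, 0.0],
                    [1.0e13, 0.0, 0.0],
                    [0.0, 1.0e13, 0.0]])         # m

r = np.array([5.0e12, 0.0, 0.0])                 # test-particle position, m
v = np.array([0.0, 3.65e6, 0.0])                 # ~circular speed for the central mass
dt = 1.0e4                                       # fixed time step, s

path = [r.copy()]
for _ in range(1000):
    d = sources - r                              # vectors from particle to each mass
    dist = np.linalg.norm(d, axis=1)
    a = np.sum(G * masses[:, None] * d / dist[:, None] ** 3, axis=0)
    v = v + a * dt                               # explicit stepwise (Euler) update
    r = r + v * dt
    path.append(r.copy())
path = np.array(path)                            # (steps+1, 3) trajectory in x, y, z
```

Each step produces a single x, y, z position per point mass per time step, which is the sense in which such a scheme avoids solving differential equations analytically.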

  14. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Full Text Available Abstract Background Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection, which uses markers of the entire genome to predict the genomic values of individual plants or animals, can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is influenced by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions This study provided an excellent example for the application of genome selection to plant breeding.
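
The main-versus-epistatic comparison above amounts to regressing the trait on a marker matrix with and without pairwise interaction columns. A sketch using ridge regression on simulated data (the soybean data are not reproduced here; marker count, effect sizes and the penalty are illustrative, and the study's actual estimation method may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_markers = 126, 20                 # 20 markers here for brevity (study used 80)
X = rng.choice([-1.0, 1.0], size=(n_lines, n_markers))   # inbred-line genotype codes

# Simulated trait: small additive effects plus one strong epistatic pair.
beta = rng.normal(0.0, 0.3, n_markers)
y = X @ beta + 1.5 * X[:, 0] * X[:, 1] + rng.normal(0.0, 0.5, n_lines)

def ridge_fit_predict(Xtr, ytr, Xte, lam=1.0):
    # Closed-form ridge: w = (X'X + lam I)^-1 X'y
    p = Xtr.shape[1]
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(p), Xtr.T @ ytr)
    return Xte @ w

def pairwise(X):
    # Append all marker-pair products (the epistatic design matrix).
    cols = [X] + [(X[:, i] * X[:, j])[:, None]
                  for i in range(X.shape[1]) for j in range(i + 1, X.shape[1])]
    return np.hstack(cols)

tr, te = np.arange(0, 90), np.arange(90, n_lines)        # simple holdout split
r2 = lambda a, b: np.corrcoef(a, b)[0, 1] ** 2
Xe = pairwise(X)
r2_main = r2(ridge_fit_predict(X[tr], y[tr], X[te]), y[te])
r2_epi = r2(ridge_fit_predict(Xe[tr], y[tr], Xe[te]), y[te])
```

When the trait carries genuine interaction effects, the epistatic design can capture variance the main-effects model cannot, mirroring the 0.33 versus 0.78 contrast reported above.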

  15. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    Science.gov (United States)

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of salmonella on carcasses in a poultry slaughterhouse and to identify effective interventions to reduce salmonella contamination. We constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using data on the process parameters in poultry and the Salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with @RISK software. The concentration of salmonella on carcasses after chilling calculated by the model was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the salmonella concentration after defeathering and in the chilling pool were 0.84 and 0.34, respectively; these were the primary factors affecting the concentration of salmonella on carcasses after chilling. The study provided a quantitative assessment model structure for salmonella on carcasses in poultry slaughterhouses. The risk manager could control the contamination of salmonella on carcasses after chilling by reducing the concentration of salmonella after defeathering and in the chilling pool.
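
A modular process risk model of this kind propagates the contamination level module by module with Monte Carlo sampling; the study used @RISK, but the same structure can be sketched in a few lines (all distributions and parameter values below are hypothetical, not the surveillance data):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000   # Monte Carlo iterations

# Module inputs: salmonella level after defeathering (log10 MPN/g) and the
# log10 reduction achieved in the chilling pool, each with uncertainty.
log_defeather = rng.normal(loc=1.0, scale=0.5, size=N)
log_reduction = rng.normal(loc=0.7, scale=0.3, size=N)

log_final = log_defeather - log_reduction        # level after chilling, log10 MPN/g
mean_final = np.mean(10.0 ** log_final)          # arithmetic mean, MPN/g

def spearman(a, b):
    # Rank (Spearman) correlation, the usual @RISK-style sensitivity measure.
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

s_defeather = spearman(log_defeather, log_final)  # input with larger leverage
s_chill = spearman(-log_reduction, log_final)     # chilling-pool contribution
```

Ranking inputs by their correlation with the output is what identifies defeathering and the chilling pool as the dominant control points, as in the sensitivity analysis above.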

  16. DISTANCE AS KEY FACTOR IN MODELLING STUDENTS’ RECRUITMENT BY UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    SIMONA MĂLĂESCU

    2015-10-01

    Full Text Available In a previous paper analysing the challenge of keeping up with current methodologies in the analysis and modelling of student recruitment by universities, in the case of some ECE countries that still do not register or develop the key data needed to take advantage of the state of the art in the domain, we promised to address the factor of distance in a future work, owing to the extent of the topic. This paper fulfils that promise, bringing a review of the literature dealing with modelling the geographical recruitment area of a university; combining distance with the proximate key factors previously reviewed completes the meta-analysis of the existing literature we began a year ago. Beyond the theoretical benefit, from a practical perspective the meta-analysis aimed at synthesizing elements of good practice that can be applied to the local university system.

  17. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model...... provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easily readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept...... detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken....

  18. A Physical – Geometrical Model of an Early Universe

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2014-12-01

    Full Text Available A physical-geometrical model for a possible early universe is proposed. One considers an initial singularity containing the energy of the whole universe. The singularity expands as a spherical wave at the speed of light, generating space and time. The relations of the special theory of relativity, quantum mechanics and gas kinetics are considered applicable. A structuring of the primary wave is adopted for reasons of geometrical simplicity as well as to satisfy the conservation laws. The evolution is able to lead to particles very close to neutrons in mass and radius. The currently accepted values for the radius and mass of the universe, as well as the temperature of the background radiation (3-5 K), can be obtained by using the proposed model.

  19. Education for sustainability: A new challenge for the current university model

    Directory of Open Access Journals (Sweden)

    Ana Fernández Pérez

    2018-01-01

    Full Text Available Education for Sustainable Development aims to disseminate and promote a set of principles and values within the university model through management, teaching, research and university extension. It does not focus on a specific area but covers many areas, such as equality, peace, health, sustainable urbanization and the environment. The objective of this study is to make an appeal in all these areas so that universities incorporate the dimension of sustainability in their curricula, through teaching, research and university management. To this end, the various international and regional initiatives that have emphasized the need for universities to commit to a culture of sustainability, and to include it in the current university model, have been analyzed. The work concludes with the idea that sustainable development is perhaps one of the key pieces in the conception of the university of the 21st century.

  20. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    Science.gov (United States)

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.
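
The core mechanism of such an assessment model — mapping qualitative ratings onto a numeric scale and rolling them up through weighted hierarchies of indicators — can be sketched as follows (the criteria, weights and scale labels are hypothetical, not the published SR EST model):

```python
# Qualitative ratings mapped to a numeric scale (hypothetical labels).
QUAL_SCALE = {"absent": 0.0, "initial": 0.25, "managed": 0.5,
              "advanced": 0.75, "leading": 1.0}

# One level of a weighted indicator hierarchy (weights sum to 1.0).
hierarchy = {
    "leadership": {"weight": 0.2, "ratings": ["managed", "advanced"]},
    "processes":  {"weight": 0.3, "ratings": ["initial", "managed", "managed"]},
    "results":    {"weight": 0.5, "ratings": ["advanced", "leading"]},
}

def score(hierarchy):
    # Average the mapped ratings within each criterion, then weight and sum.
    total = 0.0
    for crit in hierarchy.values():
        vals = [QUAL_SCALE[r] for r in crit["ratings"]]
        total += crit["weight"] * sum(vals) / len(vals)
    return 100.0 * total   # 0-100 scale, EFQM-style

self_eval = score(hierarchy)   # would be compared against an external evaluation
```

In the dual procedure described above, the same roll-up would be computed once from self-assessed ratings and once from an external evaluator's ratings, and the two scores compared.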

  1. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants, adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY
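
The central idea of a functional linear model — treating the genotypes in a region as discrete observations of a smooth function of genomic position, and regressing the trait on a low-dimensional basis expansion of that function — can be sketched on simulated data (the basis choice and all parameters are illustrative; this is not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 300, 50                                    # individuals, variant positions
pos = np.sort(rng.uniform(0.0, 1.0, m))           # normalized genomic positions
maf = rng.uniform(0.01, 0.5, m)
G = rng.binomial(2, maf, size=(n, m)).astype(float)   # genotype matrix (0/1/2)

# Simulated trait: effect varies smoothly along the region, beta(t) = 0.2 sin(pi t).
y = G @ (0.2 * np.sin(np.pi * pos)) + rng.normal(0.0, 1.0, n)

# Functional linear model: expand beta(t) in K cosine basis functions and
# regress the trait on the projected genotype scores.
K = 5
B = np.cos(np.pi * np.outer(pos, np.arange(K)))   # m x K basis evaluated at positions
Z = G @ B                                         # n x K functional genotype scores
X = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# F-test of H0: all K basis coefficients are zero.
rss1 = np.sum((y - X @ coef) ** 2)
rss0 = np.sum((y - y.mean()) ** 2)
Fstat = ((rss0 - rss1) / K) / (rss1 / (n - K - 1))
```

The dimension reduction from m variants to K smooth-basis scores is what lets the test pool information across rare and common variants in a region.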

  2. Universality in a Neutral Evolution Model

    Science.gov (United States)

    King, Dawn; Scott, Adam; Maric, Nevena; Bahar, Sonya

    2013-03-01

    Agent-based models are ideal for investigating the complex problems of biodiversity and speciation because they allow for complex interactions between individuals and between individuals and the environment. Presented here is a "null" model that investigates three mating types - assortative, bacterial, and random - in phenotype space, as a function of the percentage of random death δ. Previous work has shown phase transition behavior in an assortative mating model with variable fitness landscapes as the maximum mutation size (μ) was varied (Dees and Bahar, 2010). Similarly, this behavior was recently presented in the work of Scott et al. (submitted), on a completely neutral landscape, for bacterial-like fission as well as for assortative mating. Here, in order to achieve an appropriate "null" hypothesis, the random death process was changed so that each individual, in each generation, has the same probability of death. Results show a continuous nonequilibrium phase transition for the order parameters of the population size and the number of clusters (analogue of species) as δ is varied for three different mutation sizes of the system. The system shows increasing robustness as μ increases. Universality classes and percolation properties of this system are also explored. This research was supported by funding from: University of Missouri Research Board and James S. McDonnell Foundation

  3. A Universal Model of Giftedness--An Adaptation of the Munich Model

    Science.gov (United States)

    Jessurun, J. H.; Shearer, C. B.; Weggeman, M. C. D. P.

    2016-01-01

    The Munich Model of Giftedness (MMG) by Heller and his colleagues, developed for the identification of gifted children, is adapted and expanded, with the aim of making it more universally usable as a model for the pathway from talents to performance. On the side of the talent-factors, the concept of multiple intelligences is introduced, and the…

  4. Universality in generalized models of inflation

    Energy Technology Data Exchange (ETDEWEB)

    Binétruy, P.; Pieroni, M. [AstroParticule et Cosmologie, Université Paris Diderot, CNRS, CEA, Observatoire de Paris, Sorbonne Paris Cité, 10, rue Alice Domon et Léonie Duquet, F-75205 Paris Cedex 13 (France); Mabillard, J., E-mail: pierre.binetruy@apc.univ-paris7.fr, E-mail: joel.mabillard@ed.ac.uk, E-mail: mauro.pieroni@apc.in2p3.fr [School of Physics and Astronomy, University of Edinburgh, Edinburgh, EH9 3JZ (United Kingdom)

    2017-03-01

    We discuss the cosmological evolution of a scalar field with non-standard kinetic term in terms of a Renormalization Group Equation (RGE). In this framework inflation corresponds to the slow evolution in a neighborhood of a fixed point, and universality classes for inflationary models naturally arise. Using some examples we show the application of the formalism. The predicted values for the speed of sound c_s^2 and for the amount of non-Gaussianities produced in these models are discussed. In particular, we show that it is possible to introduce models with c_s^2 ≠ 1 that can be in agreement with present cosmological observations.

  5. Quantitative aspects and dynamic modelling of glucosinolate metabolism

    DEFF Research Database (Denmark)

    Vik, Daniel

    . This enables comparison of transcript and protein levels across mutants and upon induction. I find that unchallenged plants show good correspondence between protein and transcript, but that treatment with methyl jasmonate results in significant differences (chapter 1). Functional genomics are used to study......). The construction of a dynamic quantitative model of GLS hydrolysis is described. Simulations reveal potential effects on auxin signalling that could reflect defensive strategies (chapter 4). The results presented provide insights not only into the dynamics of GLS biosynthesis and hydrolysis, but also into the relationship...

  6. Time-symmetric universe model and its observational implication

    Energy Technology Data Exchange (ETDEWEB)

    Futamase, T.; Matsuda, T.

    1987-08-01

    A time-symmetric closed-universe model is discussed in terms of the radiation arrow of time. The time symmetry requires the occurrence of advanced waves in the recontracting phase of the Universe. We consider the observational consequences of such advanced waves, and it is shown that a test observer in the expanding phase can observe a time-reversed image of a source of radiation in the future recontracting phase.

  7. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  8. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, E_max), quantitative DCE-MRI parameters (volume transfer constant, K_trans; interstitial volume, V_e; and efflux rate constant, K_ep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and the other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the E_max values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The K_trans values decreased significantly compared with those of the control group from week 3 onward. Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter K_trans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
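
Quantitative DCE parameters such as K_trans and K_ep are typically obtained by fitting a compartmental model, e.g. the standard Tofts form C_t(t) = K_trans ∫ C_p(τ) exp(-K_ep (t - τ)) dτ, to the tissue enhancement curve. A sketch on synthetic data (the AIF shape, parameter values and grid-search fit are illustrative assumptions, not this study's pipeline):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 300.0, 2.0)                    # s, 2 s sampling

# Hypothetical bi-exponential arterial input function (arbitrary units).
cp = 5.0 * (np.exp(-t / 60.0) - np.exp(-t / 10.0))

def tofts(ktrans, kep, cp, t):
    # Discrete convolution of the AIF with an exponential residue function.
    dt = t[1] - t[0]
    return ktrans * dt * np.convolve(cp, np.exp(-kep * t), mode="full")[: t.size]

ktrans_true, kep_true = 0.05, 0.02                # 1/s, hypothetical
ct = tofts(ktrans_true, kep_true, cp, t) + rng.normal(0.0, 0.01, t.size)

# Least-squares fit by grid search over (K_trans, K_ep).
grid_kt = np.linspace(0.01, 0.10, 46)
grid_ke = np.linspace(0.005, 0.05, 46)
best = min((np.sum((ct - tofts(kt, ke, cp, t)) ** 2), kt, ke)
           for kt in grid_kt for ke in grid_ke)
_, kt_hat, ke_hat = best
```

Semi-quantitative indices like E_max, by contrast, are read directly off the curve (its maximum enhancement), which is simpler but less specific to perfusion, consistent with the sensitivity comparison above.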

  9. Universality classes for models of inflation

    CERN Document Server

    Binetruy, P.; Mabillard, J.; Pieroni, M.; Rosset, C.

    2015-01-01

    We show that the cosmological evolution of a scalar field in a potential can be obtained from a renormalisation group equation. The slow roll regime of inflation models is understood in this context as the slow evolution close to a fixed point, described by the methods of the renormalisation group. This explains in part the universality observed in the predictions of a number of inflation models. We illustrate this behavior with several examples and discuss it in the context of the AdS/CFT correspondence.

  10. Toward University Modeling Instruction--Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER)…

  11. Quantitative modelling of HDPE spurt experiments using wall slip and generalised Newtonian flow

    NARCIS (Netherlands)

    Doelder, den C.F.J.; Koopmans, R.J.; Molenaar, J.

    1998-01-01

    A quantitative model to describe capillary rheometer experiments is presented. The model can generate 'two-branched' discontinuous flow curves and the associated pressure oscillations. Polymer compressibility in the barrel, incompressible axisymmetric generalised Newtonian flow in the die, and a

  12. Sustainability and scalability of university spinouts:a business model perspective

    OpenAIRE

    Ziaee Bigdeli, Ali; Li, Feng; Shi, Xiaohui

    2015-01-01

    Most previous studies of university spinouts (USOs) have focused on what determines their formation from the perspectives of the entrepreneurs or of their parent universities. However, few studies have investigated how these entrepreneurial businesses actually grow and how their business models evolve in the process. This paper examines the evolution of USOs' business models over their different development phases. Using empirical evidence gathered from three comprehensive case studies, we ex...

  13. A quantitative and dynamic model of the Arabidopsis flowering time gene regulatory network.

    Directory of Open Access Journals (Sweden)

    Felipe Leal Valentim

    Full Text Available Various environmental signals integrate into a network of floral regulatory genes leading to the final decision on when to flower. Although a wealth of qualitative knowledge is available on how flowering time genes regulate each other, only a few studies have incorporated this knowledge into predictive models. Such models are invaluable as they enable one to investigate how various types of inputs are combined to give a quantitative readout. To investigate the effect of gene expression disturbances on flowering time, we developed a dynamic model for the regulation of flowering time in Arabidopsis thaliana. Model parameters were estimated based on expression time-courses for relevant genes, and a consistent set of flowering times for plants of various genetic backgrounds. Validation was performed by predicting changes in expression level in mutant backgrounds and comparing these predictions with independent expression data, and by comparison of predicted and experimental flowering times for several double mutants. Remarkably, the model predicts that a disturbance in a particular gene does not necessarily have the largest impact on directly connected genes. For example, the model predicts that a SUPPRESSOR OF OVEREXPRESSION OF CONSTANS 1 (SOC1) mutation has a larger impact on APETALA1 (AP1), which is not directly regulated by SOC1, than on LEAFY (LFY), which is under direct control of SOC1. This was confirmed by expression data. Another model prediction involves the importance of cooperativity in the regulation of AP1 by LFY, a prediction supported by experimental evidence. In conclusion, our model for flowering time gene regulation makes it possible to address how different quantitative inputs are combined into one quantitative output: flowering time.
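
Dynamic gene-network models of this kind are usually systems of ODEs with Hill-type regulation, where cooperativity enters through the Hill exponent. A toy three-gene cascade loosely echoing the SOC1 → LFY → AP1 ordering discussed above, integrated with forward Euler (all equations and parameter values are illustrative, not the published model):

```python
import numpy as np

def hill(x, K, n):
    # Hill activation; n > 1 models cooperative binding.
    return x**n / (K**n + x**n)

dt, T = 0.01, 50.0
steps = int(T / dt)
soc1 = np.zeros(steps + 1)   # upstream regulator (SOC1-like)
lfy = np.zeros(steps + 1)    # direct target (LFY-like)
ap1 = np.zeros(steps + 1)    # downstream target (AP1-like), cooperative input

for i in range(steps):
    d_soc1 = 1.0 - 0.1 * soc1[i]                          # constant production, decay
    d_lfy = 0.8 * hill(soc1[i], 5.0, 1) - 0.1 * lfy[i]    # non-cooperative activation
    d_ap1 = 0.8 * hill(lfy[i], 5.0, 4) - 0.1 * ap1[i]     # cooperative (n = 4)
    soc1[i + 1] = soc1[i] + dt * d_soc1                   # forward Euler step
    lfy[i + 1] = lfy[i] + dt * d_lfy
    ap1[i + 1] = ap1[i] + dt * d_ap1
```

The steep, switch-like response produced by the cooperative term is the kind of mechanism the model prediction about LFY regulation of AP1 points to; fitting such equations to expression time-courses is how the parameters are estimated.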

  14. Organizational Models and Mythologies of the American Research University. ASHE 1986 Annual Meeting Paper.

    Science.gov (United States)

    Alpert, Daniel

    Features of the matrix model of the research university and myths about the academic enterprise are described, along with serious dissonances in the U.S. university system. The linear model, from which the matrix model evolved, describes the university's structure, perceived mission, and organizational behavior. A matrix model portrays in concise,…

  15. Newtonian self-gravitating system in a relativistic huge void universe model

    Energy Technology Data Exchange (ETDEWEB)

    Nishikawa, Ryusuke; Nakao, Ken-ichi [Department of Mathematics and Physics, Graduate School of Science, Osaka City University, 3-3-138 Sugimoto, Sumiyoshi, Osaka 558-8585 (Japan); Yoo, Chul-Moon, E-mail: ryusuke@sci.osaka-cu.ac.jp, E-mail: knakao@sci.osaka-cu.ac.jp, E-mail: yoo@gravity.phys.nagoya-u.ac.jp [Division of Particle and Astrophysical Science, Graduate School of Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8602 (Japan)

    2016-12-01

    We consider a test of the Copernican Principle through observations of the large-scale structures, and for this purpose we study the self-gravitating system in a relativistic huge void universe model which does not invoke the Copernican Principle. If we focus on the weakly self-gravitating and slowly evolving system whose spatial extent is much smaller than the scale of the cosmological horizon in the homogeneous and isotropic background universe model, the cosmological Newtonian approximation is available. In the huge void universe model, the same kind of approximation is also available for the analysis of perturbations contained in a region whose spatial size is much smaller than the scale of the huge void: the effects of the huge void are taken into account in a perturbative manner by using the Fermi normal coordinates. Using this approximation, we derive the equations of motion for weakly self-gravitating perturbations whose elements have relative velocities much smaller than the speed of light, and show that the derived equations can differ significantly from those in the homogeneous and isotropic universe model, owing to the anisotropic volume expansion in the huge void. We linearize the derived equations of motion and solve them. The solutions show that the behavior of linear density perturbations is very different from that in the homogeneous and isotropic universe model.

  16. Inflationary universe models and the formation of structure

    International Nuclear Information System (INIS)

    Brandenberger, R.H.

    1987-01-01

    The main features of inflationary universe models are briefly reviewed. Inflation provides a mechanism which produces energy density fluctuations on cosmological scales. In the original models, it was not possible to obtain the correct magnitude of these fluctuations without fine-tuning the particle physics models. Two mechanisms, chaotic inflation and a dynamical relaxation process, by which inflation may be realized in models that give the right magnitude of fluctuations, are discussed. 22 references

  17. Wires in the soup: quantitative models of cell signaling

    Science.gov (United States)

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and unraveling them necessitates sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  18. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book constitutes the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on quantitative logic and soft computing, and served as a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  19. A quantitative analysis of instabilities in the linear chiral sigma model

    International Nuclear Information System (INIS)

    Nemes, M.C.; Nielsen, M.; Oliveira, M.M. de; Providencia, J. da

    1990-08-01

    We present a method to construct a complete set of stationary states corresponding to small-amplitude motion which naturally includes the continuum solution. The energy-weighted sum rule (EWSR) is shown to provide a quantitative criterion for the importance of instabilities which are known to occur in non-asymptotically free theories. Our results for the linear σ model should be valid for a large class of models. A unified description of baryon and meson properties in terms of the linear σ model is also given. (author)

  20. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  1. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  2. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  3. A universal, fault-tolerant, non-linear analytic network for modeling and fault detection

    International Nuclear Information System (INIS)

    Mott, J.E.; King, R.W.; Monson, L.R.; Olson, D.L.; Staffon, J.D.

    1992-01-01

    The similarities and differences between a universal network and conventional neural networks are outlined. The description and application of a universal network are illustrated by showing how a simple linear system is modeled both by conventional techniques and by universal network techniques. A full implementation of the universal network as universal process modeling software on a dedicated computer system at EBR-II is described and example results are presented. It is concluded that the universal network provides different feature-recognition capabilities than a neural network, and that it can provide extremely fast, accurate, and fault-tolerant estimation, validation, and replacement of signals in a real system.

  4. A universal, fault-tolerant, non-linear analytic network for modeling and fault detection

    Energy Technology Data Exchange (ETDEWEB)

    Mott, J.E. [Advanced Modeling Techniques Corp., Idaho Falls, ID (United States); King, R.W.; Monson, L.R.; Olson, D.L.; Staffon, J.D. [Argonne National Lab., Idaho Falls, ID (United States)

    1992-03-06

    The similarities and differences between a universal network and conventional neural networks are outlined. The description and application of a universal network are illustrated by showing how a simple linear system is modeled both by conventional techniques and by universal network techniques. A full implementation of the universal network as universal process modeling software on a dedicated computer system at EBR-II is described and example results are presented. It is concluded that the universal network provides different feature-recognition capabilities than a neural network, and that it can provide extremely fast, accurate, and fault-tolerant estimation, validation, and replacement of signals in a real system.

  5. A fractal model of the Universe

    Science.gov (United States)

    Gottlieb, Ioan

    The book is a revised, extended, completed and translated version of the book "Superposed Universes. A scientific novel and a SF story" (1995). It contains a hypothesis by the author concerning the complexity of Nature. An introduction to the theories of numbers, manifolds and topology is given. The possible connection with the theory of the evolution of the Universe is discussed. The last chapter contains an SF story based on the hypothesis presented, and a connection with fractal theory is given. A part of his earlier studies (1955-1956) was subsequently published without citation by Ali Kyrala (Phys. Rev. vol. 117, No. 5, March 1, 1960). The book contains as an important appendix the early papers (some published in co-authorship with his scientific advisors): 1) T.T. Vescan, A. Weiszmann and I. Gottlieb, Contributii la studiul problemelor geometrice ale teoriei relativitatii restranse [Contributions to the study of the geometric problems of special relativity], Academia R.P.R., Baza Timisoara, Lucrarile consfatuirii de geometrie diferentiala din 9-12 iunie 1955. In this paper the authors present a new method for calculating the metric. 2) Jean Gottlieb, L'hypothese d'un modele de la structure de la matiere [The hypothesis of a model for the structure of matter], Revista Matematica y Fisica Teorica, Serie A, Volumen XY, No. 1 y 2, 1964. 3) I. Gottlieb, Some hypotheses on space, time and gravitation, Studies in Gravitation Theory, CIP Press, Bucharest, 1988, pp. 227-234; as well as some recent papers (published in co-authorship with his disciples): 4) M. Agop, Gottlieb space-time. A fractal axiomatic model of the Universe, in Particles and Fields, Editors: M. Agop and P.D. Ioannou, Athens University Press, 2005, pp. 59-141. 5) I. Gottlieb, M. Agop and V. Enache, Games with Cantor's dust, Chaos, Solitons and Fractals, vol. 40 (2009), pp. 940-945. 6) I. Gottlieb, My picture over the World, Bull. of the Polytechnic Institute of Iasi, Tom LVI (LX), Fasc. 1, 2010, pp. 1-18. The book also contains a dedication to father Vasile Gottlieb and wife Cleopatra.

  6. An Evaluation of Service Quality in Higher Education: Marmara and Nigde Omer Halisdemir Universities' Department of Education Students

    Science.gov (United States)

    Ada, Sefer; Baysal, Z. Nurdan; Erkan, Senem Seda Sahenk

    2017-01-01

    The purpose of this research is to evaluate service quality in higher education among department of education students at Marmara and Nigde Omer Halisdemir Universities. The study used a survey (screening) model, a quantitative research method. The sample of this research comprised 886 university students attending higher education…

  7. A new approach to developing and optimizing organization strategy based on stochastic quantitative model of strategic performance

    Directory of Open Access Journals (Sweden)

    Marko Hell

    2014-03-01

    Full Text Available This paper presents a highly formalized approach to strategy formulation and optimization of strategic performance through proper resource allocation. A stochastic quantitative model of strategic performance (SQMSP) is used to evaluate the efficiency of the strategy developed. The SQMSP follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Parameters of the SQMSP are taken to be random variables evaluated by experts, who give two-point (optimistic and pessimistic) and three-point (optimistic, most probable and pessimistic) evaluations. The Monte-Carlo method is used to simulate strategic performance. Having been implemented within a computer application and applied to a real problem (planning of an IT strategy at the Faculty of Economics, University of Split), the proposed approach demonstrated its high potential as a basis for the development of decision-support tools for strategic planning.
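    The three-point expert evaluations described above map naturally onto triangular distributions, which is one common way to run such a Monte-Carlo step. A minimal sketch, with invented initiative names and estimates (none of which come from the paper):

```python
import random

# Hypothetical three-point expert estimates (optimistic, most probable,
# pessimistic) for each strategic initiative's performance contribution.
estimates = {
    "initiative_a": (0.2, 0.5, 0.9),
    "initiative_b": (0.1, 0.4, 0.6),
}

def simulate(estimates, n=10_000, seed=42):
    """Monte-Carlo simulation of an aggregate performance score: each
    parameter is drawn from a triangular distribution defined by its
    optimistic/most-probable/pessimistic evaluation."""
    rng = random.Random(seed)
    runs = []
    for _ in range(n):
        score = sum(rng.triangular(lo, hi, mode)        # random.triangular(low, high, mode)
                    for (lo, mode, hi) in estimates.values())
        runs.append(score / len(estimates))
    return runs

runs = simulate(estimates)
mean = sum(runs) / len(runs)   # expected strategic performance score
```

    The resulting distribution of scores, rather than a single point value, is what makes the approach useful for comparing alternative resource allocations.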

  8. Proven collaboration model for impact generating research with universities

    CSIR Research Space (South Africa)

    Bezuidenhout, DF

    2010-09-01

    Full Text Available -optics, image processing and computer vision. This paper presents the research collaboration model with universities that has ensured the PRISM programme's success. It is shown that this collaboration model has resulted in a pipeline of highly-skilled people...

  9. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
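    As a rough illustration of what discrete-event modeling adds, the sketch below simulates a single storage step with Poisson arrivals and demand and records the realised storage times; in a QMRA those times would feed the microbial growth model. All rates and the single-queue structure are assumptions for illustration, not values from the study:

```python
import heapq
import random

def simulate_storage(arrival_rate=1.0, demand_rate=0.9, horizon=1000.0, seed=1):
    """Minimal discrete-event simulation of one storage step: products
    arrive on a shelf and are removed first-in-first-out by demand events.
    Returns the realised storage times (the QMRA-relevant output)."""
    rng = random.Random(seed)
    events = []  # priority queue of (time, kind)
    heapq.heappush(events, (rng.expovariate(arrival_rate), "arrive"))
    heapq.heappush(events, (rng.expovariate(demand_rate), "demand"))
    shelf = []          # arrival times of items currently stored
    storage_times = []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrive":
            shelf.append(t)
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrive"))
        else:
            if shelf:  # fulfil demand with the oldest item (FIFO)
                storage_times.append(t - shelf.pop(0))
            heapq.heappush(events, (t + rng.expovariate(demand_rate), "demand"))
    return storage_times

times = simulate_storage()
```

    Because demand is slightly slower than arrival here, the shelf queue grows and the distribution of storage times develops exactly the kind of heavy tail the abstract warns about.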

  10. Designing a Mathematical Model for Allocating Budget to University Research and Educational Goals: A Case Study in Shahed University

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2012-07-01

    Full Text Available Institutions of higher education, both public and private, are among the most important institutions of a country. Several economic factors have forced them to act to improve the cost-effectiveness of their activities, and high quality of their products (outputs) is strongly expected. Such issues have led universities to focus on profit-making activities and commercialization, like manufacturing industries. This propensity is grounded in the fact that manufacturing industries working under an efficient management system can produce very high-quality products, whereas no comparable model exists for academic contexts. This paper therefore aims to offer such a model. The coefficients and constants used in the model have all been extracted from an analysis of the research and educational activities of Shahed University. The proposed model is a lexicographic model with thirty-six decision variables, broken down into two classes: university source variables (fifteen) and university product variables. The model also includes forty-nine goals, seven structural constraints and twenty integer variables. At the end of the paper, the current situation is compared with the recommended one, showing that many of the variables are suboptimal, except the variables for research and educational officials (S9), graduate (P7) and PhD (P9) night-course student numbers. The comprehensiveness of this model enables managers to plan even the smallest research and educational activities, and the solutions can be used by managers as applied guidelines.

  11. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    Science.gov (United States)

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used, but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified luciferase expression in BLM and HCT8/E11 transfected cancer cells and examined the effect of long-term luciferin exposure. The study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is best performed with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence, and these inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  12. A kinetic-based sigmoidal model for the polymerase chain reaction and its application to high-capacity absolute quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Stewart Don

    2008-05-01

    Full Text Available Abstract Background Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach. Results Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. 
Comparison
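    The core idea, that amplification rate is linearly related to amplicon quantity, can be sketched with a discrete logistic recurrence: dividing the per-cycle fluorescence increase by the current fluorescence gives a quantity that is linear in fluorescence, so efficiency and plateau level follow from ordinary linear regression. The function names and synthetic data below are illustrative, not the paper's actual sigmoid functions:

```python
def logistic_curve(F0, E, Fmax, cycles):
    """Simulate fluorescence readings with a discrete logistic recurrence:
    F[c+1] = F[c] + E * F[c] * (1 - F[c]/Fmax)."""
    F = [F0]
    for _ in range(cycles):
        F.append(F[-1] + E * F[-1] * (1 - F[-1] / Fmax))
    return F

def fit_logistic_qpcr(F):
    """Regress the per-cycle relative rate against fluorescence:
    (F[c+1]-F[c])/F[c] = E - (E/Fmax)*F[c], so intercept = E
    and slope = -E/Fmax."""
    xs = F[:-1]
    ys = [(F[i + 1] - F[i]) / F[i] for i in range(len(F) - 1)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    E = my - slope * mx
    Fmax = -E / slope
    return E, Fmax

F = logistic_curve(F0=0.01, E=0.9, Fmax=100.0, cycles=40)
E_hat, Fmax_hat = fit_logistic_qpcr(F)   # recovers E and Fmax by linear regression
```

    With the fitted efficiency in hand, the initial fluorescence (and, after optical calibration, the number of target molecules) can be recovered by running the recurrence backwards from any cycle in the growth phase.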

  13. University Business Models and Online Practices: A Third Way

    Science.gov (United States)

    Rubin, Beth

    2013-01-01

    Higher Education is in a state of change, and the existing business models do not meet the needs of stakeholders. This article contrasts the current dominant business models of universities, comparing the traditional non-profit against the for-profit online model, examining the structural features and online teaching practices that underlie each.…

  14. Dynamic Universe Model Predicts the Trajectory of New Horizons Satellite Going to Pluto.......

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    2012-07-01

    New Horizons is a NASA spacecraft now travelling toward the dwarf planet Pluto. It has crossed Jupiter and is expected to be the first spacecraft to approach and study Pluto and its moons Charon, Nix, and Hydra. Predictions are given here for the New Horizons (NH) spacecraft as of A.D. 2009-Aug-09 00:00:00.0000 hrs. The behavior of NH is similar to that of the Pioneer spacecraft, as its trajectory is comparable. NH is expected to reach Pluto in 2015 AD; a gravity assist was taken at Jupiter about a year earlier. As the Dynamic Universe Model explains the Pioneer anomaly and the higher gravitational attraction experienced toward the Sun, it can explain NH in a similar fashion. The predictions for NH by the Dynamic Universe Model are given in Table 4: the first two rows give model predictions based on 02-01-2009 00:00 hrs data with daily and hourly time steps, and the third row gives the ephemeris from the Jet Propulsion Laboratory. The Dynamic Universe Model can predict further, to 9-Aug-2009; the ephemeris data are from the JPL web site as of 28 June 2009, and any new data can be calculated. The calculations of the Dynamic Universe Model can be successfully applied to find the trajectories of the Pioneer satellite (anomaly) and the New Horizons satellite going to Pluto. No dark matter is assumed within the solar-system radius; the effect on the masses around the Sun appears as though there were an extra gravitational pull toward the Sun. The model solves the dynamics of extra-solar planets such as Planet X, and satellites such as Pioneer and NH, for the 3-position, 3-velocity and 3-acceleration of their masses, considering the complex situation of multiple planets, stars, galaxy parts, the galaxy center and other galaxies, using simple Newtonian physics. It has already successfully solved the problem of the missing mass in galaxies observed through galaxy circular-velocity curves. The `SITA Simulations' software was developed about 18 years earlier for the Dynamic Universe Model of cosmology; it is based on Newtonian physics.
It is Classical singularity

  15. Integration of CFD codes and advanced combustion models for quantitative burnout determination

    Energy Technology Data Exchange (ETDEWEB)

    Javier Pallares; Inmaculada Arauzo; Alan Williams [University of Zaragoza, Zaragoza (Spain). Centre of Research for Energy Resources and Consumption (CIRCE)

    2007-10-15

    CFD codes and advanced kinetics combustion models are extensively used to predict coal burnout in large utility boilers. Modelling approaches based on CFD codes can accurately solve the fluid dynamics equations involved in the problem, but this is usually achieved by including simple combustion models. On the other hand, advanced kinetics combustion models can give a detailed description of the coal combustion behaviour by using a simplified description of the flow field, usually obtained from a zone-method approach. Both approaches describe general trends in coal burnout correctly, but fail to predict quantitative values. In this paper a new methodology which takes advantage of both approaches is described. In the first instance, CFD solutions were obtained for the combustion conditions in the furnace of the Lamarmora power plant (ASM Brescia, Italy) for a number of different conditions and for three coals. These furnace conditions were then used as inputs for a more detailed chemical combustion model to predict coal burnout. In this, devolatilization was modelled using a commercial macromolecular network pyrolysis model (FG-DVC). For char oxidation, an intrinsic-reactivity approach including thermal annealing, ash inhibition and maceral effects was used. Results from the simulations were compared against plant experimental values, showing reasonable agreement in both trends and quantitative values. 28 refs., 4 figs., 4 tabs.
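    The coupling step, taking local conditions from a CFD solution and integrating a kinetic burnout law along a particle history, can be sketched as follows. The first-order Arrhenius rate here is a stand-in with invented constants; the paper's actual model (FG-DVC pyrolysis plus intrinsic char reactivity with annealing, ash-inhibition and maceral effects) is considerably more detailed:

```python
import math

def char_burnout(temps_K, dt=1e-3, A=1.0e4, Ea=1.1e5, R=8.314):
    """Integrate a simple first-order Arrhenius char-oxidation law
    dm/dt = -k(T) * m along a particle temperature history sampled
    at intervals dt (the history would come from a CFD trajectory).
    A and Ea are illustrative, not fitted constants."""
    m = 1.0  # normalised char mass
    for T in temps_K:
        k = A * math.exp(-Ea / (R * T))  # rate constant, 1/s
        m *= math.exp(-k * dt)           # exact step for dm/dt = -k*m
    return 1.0 - m                       # burnout fraction

# 0.5 s residence at a constant 1600 K (a deliberately crude "history")
burnout = char_burnout([1600.0] * 500)
```

    The exponential update is used instead of an explicit Euler step so the integration stays stable even where the CFD temperatures make the rate constant large.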

  16. Value, What Value? University Business Model in Pursuit of Advanced Internationalization

    DEFF Research Database (Denmark)

    Juho, Anita; Turcan, Romeo V.

    Through business model theoretical lenses, we explore in this paper the issues and challenges universities face in their pursuit of advanced internationalization entries into foreign markets. The context of this paper is defined by universities from developed countries entering developing or emerging countries via foreign direct investment entry modes, such as joint ventures, acquisitions, greenfield or brownfield investments. This is a theoretical paper. We draw on a number of sources of data to conceptualise the issues and challenges universities face in their pursuit of advanced internationalization entries into foreign markets. First, we build on university autonomy, international business and business model theories to conceptualise the phenomenon of interest. Second, we analyse publicly available data and anecdotal evidence where the phenomenon we study is explicitly observable. The above theoretical...

  17. Warm anisotropic inflationary universe model

    International Nuclear Information System (INIS)

    Sharif, M.; Saleem, Rabia

    2014-01-01

    This paper is devoted to the study of warm inflation using vector fields in the background of a locally rotationally symmetric Bianchi type I model of the universe. We formulate the field equations, and slow-roll and perturbation parameters (scalar and tensor power spectra as well as their spectral indices) in the slow-roll approximation. We evaluate all these parameters in terms of the directional Hubble parameter during the intermediate and logamediate inflationary regimes by taking the dissipation factor as a function of the scalar field as well as a constant. In each case, we calculate the observational parameter of interest, i.e., the tensor-scalar ratio in terms of the inflaton. The graphical behavior of these parameters shows that the anisotropic model is also compatible with WMAP7 and the Planck observational data. (orig.)

  18. Warm anisotropic inflationary universe model

    Energy Technology Data Exchange (ETDEWEB)

    Sharif, M.; Saleem, Rabia [University of the Punjab, Department of Mathematics, Lahore (Pakistan)

    2014-02-15

    This paper is devoted to the study of warm inflation using vector fields in the background of a locally rotationally symmetric Bianchi type I model of the universe. We formulate the field equations, and slow-roll and perturbation parameters (scalar and tensor power spectra as well as their spectral indices) in the slow-roll approximation. We evaluate all these parameters in terms of the directional Hubble parameter during the intermediate and logamediate inflationary regimes by taking the dissipation factor as a function of the scalar field as well as a constant. In each case, we calculate the observational parameter of interest, i.e., the tensor-scalar ratio in terms of the inflaton. The graphical behavior of these parameters shows that the anisotropic model is also compatible with WMAP7 and the Planck observational data. (orig.)

  19. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of the kinetic models are the same for solutions that are identical except possibly for having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for the initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known.
Both models give better fit to observed qPCR data than other kinetic models present in the literature.

  20. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    Science.gov (United States)

    Cobbs, Gary

    2012-08-16

    Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of the kinetic models are the same for solutions that are identical except possibly for having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for the initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the literature. 
They also give better estimates of
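
The simultaneous fitting strategy described above can be sketched in a deliberately simplified form. The example below is illustrative only, not the paper's kinetic models: it uses the constant-efficiency baseline model F(n) = X0·(1+E)^n, shares the efficiency E across two dilution curves, and gives each curve its own initial amount X0 (all names and values are hypothetical).

```python
import numpy as np
from scipy.optimize import least_squares

# Toy constant-efficiency qPCR model: F(n) = X0 * (1 + E)**n.
# Two curves from a ten-fold dilution share E; only X0 differs per curve.
cycles = np.arange(1, 16)

def model(x0, eff, n):
    return x0 * (1.0 + eff) ** n

rng = np.random.default_rng(0)
true_eff, true_x0 = 0.9, (1e-3, 1e-4)
data = [model(x0, true_eff, cycles) * np.exp(0.01 * rng.standard_normal(cycles.size))
        for x0 in true_x0]

def residuals(p):
    # One shared efficiency parameter, one initial concentration per curve.
    eff, x0a, x0b = p
    return np.concatenate([
        np.log(data[0]) - np.log(model(x0a, eff, cycles)),
        np.log(data[1]) - np.log(model(x0b, eff, cycles)),
    ])

fit = least_squares(residuals, x0=[0.5, 1e-2, 1e-3],
                    bounds=([0.0, 1e-12, 1e-12], [2.0, 1.0, 1.0]))
eff_hat, x0a_hat, x0b_hat = fit.x
```

Fitting in log space keeps the simultaneous fit well conditioned across the large dynamic range of qPCR fluorescence; the ratio of the recovered X0 values estimates the dilution factor.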

  1. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  2. Quantitative Developments in Turkish Higher Education since 1933

    Directory of Open Access Journals (Sweden)

    Aslı GÜNAY

    2011-01-01

    Full Text Available This study traces quantitative developments in Turkish higher education during the Republic period, from 1933, when the first university was established, to the present. To this end, it first lists the establishment dates of universities, the number of universities by year, and the number of universities established during the term of each president of the Turkish Council of Higher Education. Since universities had spread to all provinces by 2008, the distribution of universities across provinces is also given. The development of Turkish higher education over the years is then examined using several quantitative indicators: the number of students in higher education, the total number of academic staff as well as the number holding PhDs, the change in the number of students per academic staff member, and higher education gross enrollment rates by year. For the largest provinces (Ankara, İstanbul and İzmir), the number of universities, the number of higher education students and the gross enrollment rates are provided. The distribution of higher education students across higher education institutions, programs and education types in 2011 is presented, along with the distribution of academic staff across institutions and information about their academic positions. In addition, quantitative data on bachelor and associate degrees (numbers of program types, programs, quotas and placed students) in 2010 are given. Finally, the position of Turkish higher education in the world with respect to the number of academic publications, and the change in the number of academic publications per staff member over the years, are analyzed.

  3. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/
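
The set-based flavor of such guaranteed invalidation can be illustrated with a deliberately tiny sketch (not ADMIT's actual API; the model, bounds and numbers are made up): if no parameter in the admissible interval can reproduce the measurement interval, the hypothesis is certifiably invalid.

```python
# Toy set-based invalidation (illustrative only, not ADMIT's API).
# Model hypothesis: y = p * x with a priori knowledge p in [p_lo, p_hi].

def predict_interval(p_lo, p_hi, x):
    """Interval of possible outputs y = p * x for p in [p_lo, p_hi], x >= 0."""
    return p_lo * x, p_hi * x

def consistent(p_int, x, y_lo, y_hi):
    """True if some parameter in p_int can explain a measurement in [y_lo, y_hi]."""
    lo, hi = predict_interval(*p_int, x)
    return not (hi < y_lo or lo > y_hi)  # do the two intervals overlap?

p_int = (2.0, 3.0)
# A measurement y in [10.0, 10.5] at x = 2 would require p in [5.0, 5.25],
# which lies entirely outside the admissible set [2, 3]:
invalidated = not consistent(p_int, 2.0, 10.0, 10.5)  # a certificate of invalidity
```

The empty intersection is the "guarantee": no admissible parameter could have produced the data, so the conclusion does not depend on any particular point estimate.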

  4. Establishing a Business Process Reference Model for Universities

    KAUST Repository

    Svensson, Carsten

    2012-09-01

    Modern universities are by any standard complex organizations that, from an IT perspective, present a number of unique challenges. This paper proposes establishing a business process reference framework. The benefit to users would be a better understanding of the system landscape, business process enablement, collection of performance data and systematic reuse of existing community experience and knowledge. For these reasons, reference models such as SCOR (Supply Chain Operations Reference), DCOR (Design Chain Operations Reference) and ITIL (Information Technology Infrastructure Library) have gained popularity among organizations in both the private and public sectors. We speculate that this success can be replicated in a university setting. Furthermore, the paper outlines how the research group suggests moving ahead with the research that will lead to a reference model.

  5. A Theoretical Hypothesis on Ferris Wheel Model of University Social Responsibility

    OpenAIRE

    Le Kang

    2016-01-01

    According to the nature of the university as a free and responsible academic community, USR rests on a different foundation, academic responsibility, so the Pyramid and the IC Model of CSR cannot fully explain the most distinctive feature of USR. This paper puts forward a new model, the Ferris Wheel Model, to illustrate the nature of USR and the process of achieving it. The Ferris Wheel Model of USR shows the university creates a balanced, fair and neutral systemic structu...

  6. University Research in Support of TREAT Modeling and Simulation, FY 2016

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at the Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of the University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.

  8. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    2000-07-01

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. 
The modeled quiet-time variability, or standard
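
The skill metrics used for the quantitative comparison (daily standard deviation, root-mean-square error, correlation coefficient) can be written down generically; the snippet below is an illustration with synthetic numbers, not the study's data or code.

```python
import numpy as np

# Synthetic stand-ins: 'obs' and 'mod' represent daily F-region departures
# from the climatological mean, for observation and model respectively.
rng = np.random.default_rng(1)
obs = rng.standard_normal(50)
mod = obs + 0.3 * rng.standard_normal(50)  # model tracks the data with some error

rmse = float(np.sqrt(np.mean((mod - obs) ** 2)))   # root-mean-square error
corr = float(np.corrcoef(obs, mod)[0, 1])          # correlation coefficient
obs_std = float(np.std(obs, ddof=1))               # daily standard deviation of the data

# A model adds value when its RMS error stays below the data's own variability.
skill = 1.0 - rmse / obs_std
```

Comparing the RMS error against the standard deviation of the observations is what separates a "visually" plausible model from one with quantitative forecast value.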

  9. Enhanced Tunnelling Models for Child Universe Formation

    CERN Document Server

    Ansoldi, S; Shilon, I

    2015-01-01

    Starting from a recently proposed model that allows for an enhanced rate of child universe production under generic conditions, we elaborate on refinements that may allow for non-singular initial configurations. A possibility to treat both the initial state and the tunnelling beyond the semiclassical level will also be introduced.

  10. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...
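
As a small worked instance of the deterministic inventory models surveyed in the fourth chapter, here is the classical economic order quantity (EOQ) formula, stated from standard inventory theory rather than from the book itself; the numbers are made up for illustration.

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Classical EOQ: the lot size minimizing annual ordering plus holding
    cost, Q* = sqrt(2 * D * K / h), where D is annual demand, K the fixed
    cost per order, and h the holding cost per unit per year."""
    return math.sqrt(2.0 * demand * order_cost / holding_cost)

# Example: D = 1200 units/year, K = 100 per order, h = 6 per unit per year.
q_star = eoq(demand=1200.0, order_cost=100.0, holding_cost=6.0)
# q_star = sqrt(2 * 1200 * 100 / 6) = sqrt(40000) = 200 units per order
```

The square-root trade-off between ordering and holding costs is the prototype result from which the periodic-review and stochastic extensions in the chapter depart.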

  11. An Elaboration of a Strategic Alignment Model of University Information Systems based on SAM Model

    Directory of Open Access Journals (Sweden)

    S. Ahriz

    2018-02-01

    Full Text Available An information system is a guarantee of a university's ability to anticipate the functions essential to its development and durability. The alignment of the information system, one of the pillars of IT governance, has become a necessity. In this paper, we consider the problem of implementing a strategic alignment model in Moroccan universities. The literature reveals that few studies have examined strategic alignment in the public sector, particularly in higher education institutions. Hence we opted for an exploratory approach that aims at a better understanding of strategic alignment and at evaluating the degree of its use within Moroccan universities. The data, gained primarily through interviews with top managers and IT managers, reveal that alignment is not formalized and that it would be appropriate to implement an alignment model. We find that implementing our proposed model can help managers maximize the returns of IT investment and increase their efficiency.

  12. A history of the universe in a superstring model

    International Nuclear Information System (INIS)

    Maeda, K.

    1986-07-01

    A superstring theory, which is the most promising candidate for a unified theory, predicts a higher-dimensional 'space-time'. Its application to cosmology, especially a reconsideration of the early history of the universe, is definitely important and interesting. Here, we discuss a scenario of the universe in a superstring model. The main problems in higher-dimensional unified theories, from the cosmological point of view, are: (i) Can the 4-dim Einstein gravity be obtained, rather than the Jordan-Brans-Dicke theory? (ii) Can the 4-dim Friedmann universe (F4) be realized naturally in the higher-dimensional space-time? (iii) Does inflation really occur? The answers to (i) and (ii) are 'yes' in a superstring model, as we will see; (iii) is still an open question, although it seems to be difficult. Taking into account a quantum tunnelling effect of the antisymmetric tensor field H_μνρ, we also show that a hierarchical bubble structure might be formed due to a series of phase transitions

  13. NEAMS-Funded University Research in Support of TREAT Modeling and Simulation, FY15

    International Nuclear Information System (INIS)

    Dehart, Mark; Mausolff, Zander; Goluoglu, Sedat; Prince, Zach; Ragusa, Jean; Haugen, Carl; Ellis, Matt; Forget, Benoit; Smith, Kord; Alberti, Anthony; Palmer, Todd

    2015-01-01

    This report summarizes university research activities performed in support of TREAT modeling and simulation research. It is a compilation of annual research reports from four universities: University of Florida, Texas A&M University, Massachusetts Institute of Technology and Oregon State University. The general research topics are, respectively, (1) 3-D time-dependent transport with TDKENO/KENO-VI, (2) implementation of the Improved Quasi-Static method in Rattlesnake/MOOSE for time-dependent radiation transport approximations, (3) improved treatment of neutron physics representations within TREAT using OpenMC, and (4) steady state modeling of the minimum critical core of the Transient Reactor Test Facility (TREAT).

  14. NEAMS-Funded University Research in Support of TREAT Modeling and Simulation, FY15

    Energy Technology Data Exchange (ETDEWEB)

    Dehart, Mark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mausolff, Zander [Univ. of Florida, Gainesville, FL (United States); Goluoglu, Sedat [Univ. of Florida, Gainesville, FL (United States); Prince, Zach [Texas A & M Univ., College Station, TX (United States); Ragusa, Jean [Texas A & M Univ., College Station, TX (United States); Haugen, Carl [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Ellis, Matt [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Forget, Benoit [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Smith, Kord [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Alberti, Anthony [Oregon State Univ., Corvallis, OR (United States); Palmer, Todd [Oregon State Univ., Corvallis, OR (United States)

    2015-09-01

    This report summarizes university research activities performed in support of TREAT modeling and simulation research. It is a compilation of annual research reports from four universities: University of Florida, Texas A&M University, Massachusetts Institute of Technology and Oregon State University. The general research topics are, respectively, (1) 3-D time-dependent transport with TDKENO/KENO-VI, (2) implementation of the Improved Quasi-Static method in Rattlesnake/MOOSE for time-dependent radiation transport approximations, (3) improved treatment of neutron physics representations within TREAT using OpenMC, and (4) steady state modeling of the minimum critical core of the Transient Reactor Test Facility (TREAT).

  15. ONLINE MODEL OF EDUCATION QUALITY ASSURANCE EQUASP IMPLEMENTATION: EXPERIENCE OF VYATKA STATE UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Valentin Pugach

    2015-10-01

    Full Text Available The article is devoted to the problem of assessing the quality of higher education. In the Russian Federation, the quality of educational services provided by state-accredited universities has recently been assessed by the state, represented by the Ministry of Education and Science. State universities have simulated internal systems of education quality assessment in accordance with the methodology proposed by the Ministry of Education and Science. Currently, more attention is paid to independent assessment of education quality, which is the basis of professional public accreditation. The project "EQUASP", financed within the framework of the TEMPUS programme, addresses the problem of implementing the methodology of the online model of independent higher education quality assessment in the practice of Russian universities. The proposed model for assessing the quality of education is based on the use of five standards. The authors have carried out a comparative analysis of the model of higher education quality assessment existing in Vyatka State University and the model of education quality assessment offered by the European universities participating in the EQUASP project. The authors present the main results of investigating this problem and some suggestions for improving the model of education quality assessment used by Vyatka State University.

  16. Quantitative computational models of molecular self-assembly in systems biology.

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.
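
To make the simulation challenge concrete, the simplest self-assembly reaction, irreversible dimerization A + A → A2, can be simulated with a bare-bones Gillespie-style stochastic algorithm. This is a generic sketch with made-up rate values, not code from the review.

```python
import random

def simulate_dimerization(n_a=100, k=0.01, t_end=1000.0, seed=42):
    """Gillespie simulation of the irreversible reaction A + A -> A2.
    Propensity for an identical-pair reaction: k * nA * (nA - 1) / 2."""
    rng = random.Random(seed)
    t, n_dimer = 0.0, 0
    while n_a >= 2:
        propensity = k * n_a * (n_a - 1) / 2.0
        t += rng.expovariate(propensity)  # exponentially distributed waiting time
        if t > t_end:
            break
        n_a -= 2          # two monomers consumed...
        n_dimer += 1      # ...one dimer formed
    return n_a, n_dimer

monomers, dimers = simulate_dimerization()
# Mass conservation: monomers + 2 * dimers equals the initial monomer count.
```

Even this one-step system shows why self-assembly strains standard methods: propensities change nonlinearly with copy number, and realistic assembly pathways multiply the number of species and reactions combinatorially.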

  17. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...... of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study....

  18. Models for universal reduction of macroscopic quantum fluctuations

    International Nuclear Information System (INIS)

    Diosi, L.

    1988-10-01

    If quantum mechanics is universal, then macroscopic bodies would, in principle, possess macroscopic quantum fluctuations (MQF) in their positions, orientations, densities etc. Such MQF, however, are not observed in nature. The hypothesis is adopted that the absence of MQF is due to a certain universal mechanism. Gravitational measures were applied for reducing MQF of the mass density. This model leads to classical trajectories in the macroscopic limit of translational motion. For massive objects, unwanted macroscopic superpositions of quantum states will be destroyed within short times. (R.P.) 34 refs

  19. Establishing a business process reference model for Universities

    DEFF Research Database (Denmark)

    Svensson, Carsten; Hvolby, Hans-Henrik

    2012-01-01

    Modern universities are by any standard complex organizations that, from an IT perspective, present a number of unique challenges. This paper will propose establishing a business process reference framework. The benefit to the users would be a better understanding of the system landscape, business......) have gained popularity among organizations in both the private and public sectors. We speculate that this success can be replicated in a university setting. Furthermore the paper will outline how the research group suggests moving ahead with the research which will lead to a reference model....

  20. University education: From Humboldt's model to the Bologna process

    Directory of Open Access Journals (Sweden)

    Bodroški-Spariosu Biljana

    2015-01-01

    Full Text Available The characteristics of European university education in the context of the Bologna process are the topic of this article. The aim is to analyze the key issues in university education in comparison with the classic, or Humboldt's, model. In a period of extensive reforms of higher education it is important to review the place and role of the university from the standpoint of institutional characteristics, the dominant educational orientation and attitudes towards society. The Bologna process initiated three key changes in the European system of university education: (a) a change of institutional framework, from the binary to the so-called uniquely diversified system; (b) a change of dominant orientation, placing the student rather than science at the centre of education; (c) a change in the social role of the university, from the development of science and impartial critique of society towards providing educational services to the market. The pedagogic implications of these changes open questions about the purpose of education, the relationship between professors and students and the identity of the modern university itself.

  1. New holographic scalar field models of dark energy in non-flat universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K., E-mail: KKarami@uok.ac.i [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), Maragha (Iran, Islamic Republic of); Fehri, J. [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of)

    2010-02-08

    Motivated by the work of Granda and Oliveros [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199], we generalize their work to the non-flat case. We study the correspondence between the quintessence, tachyon, K-essence and dilaton scalar field models with the new holographic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe. In the limiting case of a flat universe, i.e. k=0, all results given in [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199] are obtained.
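
For orientation, the Granda-Oliveros infrared cutoff referred to above is usually written as follows; this is sketched from the standard form in the literature, and normalization conventions vary between papers.

```latex
% New holographic dark energy density (Granda-Oliveros cutoff):
\rho_\Lambda = 3 M_p^2 \left( \alpha H^2 + \beta \dot{H} \right)
% In the non-flat FRW background it enters the Friedmann equation as
H^2 + \frac{k}{a^2} = \frac{1}{3 M_p^2} \left( \rho_m + \rho_\Lambda \right),
% with k = +1, 0, -1 for closed, flat and open geometries; setting k = 0
% recovers the flat-universe results of Granda and Oliveros.
```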

  2. New holographic scalar field models of dark energy in non-flat universe

    International Nuclear Information System (INIS)

    Karami, K.; Fehri, J.

    2010-01-01

    Motivated by the work of Granda and Oliveros [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199], we generalize their work to the non-flat case. We study the correspondence between the quintessence, tachyon, K-essence and dilaton scalar field models with the new holographic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe. In the limiting case of a flat universe, i.e. k=0, all results given in [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199] are obtained.

  3. A Possible Universe in Pulsation by Using a Hydro-Dynamical Model for Gravity

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2016-12-01

    Full Text Available By using a hydro-dynamical model for gravity previously given by the author, it is possible to describe a pulsating universe. This is possible because two hydro-dynamical sources attract each other both when they are emitting and when they are absorbing fluid. In our model, bodies (matter and energy) are interacting via an incompressible fluid made of gravitons (photon-like particles having a wavelength of the order of magnitude of the radius of the universe). One considers the universe uniform at large scale, the effects of general relativity type being local and negligible at global scale. An "elastic sphere" model for the universe is suggested to describe the possible inversion. The expansion of the universe stops when the "elastic energy" overcomes the kinetic one; this takes place near the point of maximal emission speed of the fluid of gravitons. The differential equation for the universe in expansion is adapted to contraction. Analytical solutions are given.

  4. A time-symmetric Universe model and its observational implication

    International Nuclear Information System (INIS)

    Futamase, T.; Matsuda, T.

    1987-01-01

    A time-symmetric closed-universe model is discussed in terms of the radiation arrow of time. The time symmetry requires the occurrence of advanced waves in the recontracting phase of the Universe. The observational consequences of such advanced waves are considered, and it is shown that a test observer in the expanding phase can observe a time-reversed image of a source of radiation in the future recontracting phase

  5. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues...... that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy...... consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker....

  6. Momasi Model in Need Assessment of Faculty Members of Alborz University

    Directory of Open Access Journals (Sweden)

    S. Esmaelzadeh

    2013-02-01

    Full Text Available Background: The first step in developing human resources to improve the performance of universities is to identify accurate educational needs. Models may draw on a number of theories to help understand a particular problem in a certain setting or context. The Momasi model is an integration of existing models in the educational needs assessment field and is sufficiently comprehensive for data collection. The aim of this study was to apply the Momasi model to the needs assessment of faculty members in seven areas of duties. Methods: This cross-sectional study was conducted, based on the Momasi model, among 34 faculty members of Alborz University. Results: The areas of educational need were prioritized as follows: personal development, research, administrative and executive activities, education, health services and health promotion, and specialized activities outside the university. The highest mean and standard deviation belonged to the area of research. The first priority in the area of research was publications in English; in the personal development area, familiarity with SPSS software; and in the area of education, nurturing creativity. Conclusion: Based on the assessment results, the research area has the highest priority and frequency in this needs assessment. It is therefore recommended that data gathered in the research area be given first priority in empowering the faculty members of Alborz University.

  7. Universal monopole scaling near transitions from the Coulomb phase.

    Science.gov (United States)

    Powell, Stephen

    2012-08-10

    Certain frustrated systems, including spin ice and dimer models, exhibit a Coulomb phase at low temperatures, with power-law correlations and fractionalized monopole excitations. Transitions out of this phase, at which the effective gauge theory becomes confining, provide examples of unconventional criticality. This Letter studies the behavior at nonzero monopole density near such transitions, using scaling theory to arrive at universal expressions for the crossover phenomena. For a particular transition in spin ice, quantitative predictions are made by mapping to the XY model and confirmed using Monte Carlo simulations.
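
The Monte Carlo side of such a study can be sketched with a minimal Metropolis update for the 2D XY model. This is a generic illustration, not the Letter's actual simulation; the lattice size, coupling and temperature are arbitrary.

```python
import math
import random

def metropolis_sweep(theta, beta, j_coupling=1.0, rng=None):
    """One Metropolis sweep of the 2D XY model on a periodic L x L lattice.
    Energy: E = -J * sum over nearest neighbours of cos(theta_i - theta_j)."""
    rng = rng or random.Random(0)
    L = len(theta)
    for _ in range(L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        proposal = rng.uniform(0.0, 2.0 * math.pi)
        d_e = 0.0
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # four neighbours
            nb = theta[(x + dx) % L][(y + dy) % L]
            d_e -= j_coupling * (math.cos(proposal - nb) - math.cos(theta[x][y] - nb))
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if d_e <= 0.0 or rng.random() < math.exp(-beta * d_e):
            theta[x][y] = proposal
    return theta

L = 8
spins = [[0.0] * L for _ in range(L)]
spins = metropolis_sweep(spins, beta=1.1)  # inverse temperature near the BKT regime
```

In a crossover study one would repeat such sweeps while measuring correlation functions at several temperatures and monopole (vortex) densities, then collapse the results onto the universal scaling forms.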

  8. Integrating an Interprofessional Education Model at a Private University

    Science.gov (United States)

    Parker, Ramona Ann; Gottlieb, Helmut; Dominguez, Daniel G.; Sanchez-Diaz, Patricia C.; Jones, Mary Elaine

    2015-01-01

    In 2012, a private University in South Texas sought to prepare eight cohorts of 25 nursing, optometry, pharmacy, physical therapy, and health care administration students with an interprofessional education activity as a model for collaborative learning. The two semester interprofessional activity used a blended model (Blackboard Learn®,…

  9. Quantitative Preparation in Doctoral Education Programs: A Mixed-Methods Study of Doctoral Student Perspectives on their Quantitative Training

    Directory of Open Access Journals (Sweden)

    Sarah L Ferguson

    2017-07-01

    Full Text Available Aim/Purpose: The purpose of the current study is to explore student perceptions of their own doctoral-level education and quantitative proficiency. Background: The challenges of preparing doctoral students in education have been discussed in the literature, but largely from the perspective of university faculty and program administrators. The current study directly explores the student voice on this issue. Methodology: Utilizing a sequential explanatory mixed-methods research design, the present study seeks to better understand doctoral-level education students’ perceptions of their quantitative methods training at a large public university in the southwestern United States. Findings: Results from both phases present the need for more application and consistency in doctoral-level quantitative courses. Additionally, there was a consistent theme of internal motivation in the responses, suggesting students perceive their quantitative training to be valuable beyond their personal interest in the topic. Recommendations for Practitioners: Quantitative methods instructors should emphasize practice in their quantitative courses and consider providing additional support for students through the inclusion of lab sections, tutoring, and/or differentiation. Pre-testing statistical ability at the start of a course is also suggested to better meet student needs. Impact on Society: The ultimate goal of quantitative methods in doctoral education is to produce high-quality educational researchers who are prepared to apply their knowledge to problems and research in education. Results of the present study can inform faculty and administrator decisions in doctoral education to best support this goal. Future Research: Using the student perspectives presented in the present study, future researchers should continue to explore effective instructional strategies and curriculum design within education doctoral programs. The inclusion of student voice can strengthen

  10. Portable University Model of the Atmosphere (PUMA)

    Energy Technology Data Exchange (ETDEWEB)

    Fraedrich, K.; Kirk, E.; Lunkeit, F. [Hamburg Univ. (Germany). Meteorologisches Inst.

    1998-10-01

    The Portable University Model of the Atmosphere (PUMA) is based on the Reading multi-level spectral model SGCM (Simple Global Circulation Model) described by Hoskins and Simmons (1975) and James and Gray (1986). Originally developed as a numerical prediction model, it was changed to perform as a circulation model. For example, James and Gray (1986) studied the influence of surface friction on the circulation of a baroclinic atmosphere, James and James (1992), and James et al. (1994) investigated ultra-low-frequency variability, and Mole and James (1990) analyzed the baroclinic adjustment in the context of a zonally varying flow. Frisius et al. (1998) simulated an idealized storm track by embedding a dipole structure in a zonally symmetric forcing field and Lunkeit et al. (1998) investigated the sensitivity of GCM (General Circulation Model) scenarios by an adaptation technique applicable to SGCMs. (orig.)

  11. A Model for the Expansion of the Universe

    Directory of Open Access Journals (Sweden)

    Silva N. P.

    2014-04-01

    Full Text Available One introduces an ansatz for the expansion factor a(t) = e^(β H(t) t) for our Universe in the spirit of the FLRW model; β is a constant to be determined. Considering that the ingredients acting on the Universe expansion (t > 4×10^12 s ≈ 1.3×10^5 yr) are mainly matter (baryons plus dark matter) and dark energy, one uses the current measured values of the Hubble constant H_0, the Universe current age T_0, the matter density parameter Ω_m(T_0) and the dark energy parameter Ω_Λ(T_0) together with the Friedmann equations to find β = 0.5804 and that our Universe may have had a negative expansion acceleration up to the age T* = 3.214 Gyr (matter era) and a positive one after that (dark energy era), leading to an eternal expansion. An interaction between matter and dark energy is found to exist. The deceleration q(t) has been found to satisfy q(T*) = 0 and q(T_0) = -0.570.
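The sign change of the deceleration parameter can be checked numerically with a toy ansatz a(t) = exp((H0·t)^β) (an assumed simplified form with constant H0, so it will not reproduce the paper's quoted T* = 3.214 Gyr; q = -ä·a/ȧ² as usual):

```python
import numpy as np

def q_of_t(t, H0=1.0, beta=0.5804, h=1e-5):
    """Deceleration parameter q = -a * a'' / a'**2 via central differences,
    for the toy expansion factor a(t) = exp((H0*t)**beta)."""
    a = lambda x: np.exp((H0 * x) ** beta)
    a1 = (a(t + h) - a(t - h)) / (2.0 * h)
    a2 = (a(t + h) - 2.0 * a(t) + a(t - h)) / h ** 2
    return -a(t) * a2 / a1 ** 2

# decelerating (q > 0) at early times, accelerating (q < 0) at late times
```

For this toy form the zero crossing sits at H0·t = ((1-β)/β)^(1/β); the qualitative matter-era/dark-energy-era transition survives, though the quoted age T* requires the paper's full time-dependent H(t).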

  12. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  13. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
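The invalidation idea, stripped to its simplest form, can be sketched outside of ADMIT as interval constraint propagation for a one-parameter model (a hypothetical toy, not the toolbox's API):

```python
def invalidate_linear(xs, y_lo, y_hi, p_lo, p_hi):
    """Guaranteed invalidation of the model y = p * x against interval data.
    Each measurement (x, [y_lo, y_hi]) implies p in [y_lo/x, y_hi/x] (x > 0 assumed).
    Intersecting all implied intervals with the prior [p_lo, p_hi] yields either
    a guaranteed feasible parameter set or a certificate of invalidity."""
    lo, hi = p_lo, p_hi
    for x, a, b in zip(xs, y_lo, y_hi):
        lo = max(lo, a / x)
        hi = min(hi, b / x)
    return lo > hi, (lo, hi)
```

If the feasible parameter interval is empty, the model hypothesis is guaranteed to be inconsistent with the data; otherwise the interval is a guaranteed parameter estimate.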

  14. University Performance Management

    DEFF Research Database (Denmark)

    For the last two decades the Danish universities have felt the impact of the international trend towards implementation of New Public Management. The results are seen in the implementation of new hierarchical governance structures and contractual governance systems, including market-based quantitative measurement systems for resource allocation and performance evaluation. Compared to other countries, the changes in performance measurements and governance of the Danish universities are radical, and the Minister of Science heralded them as "the greatest change in university management since the founding of Copenhagen University in 1479". The changes took place with surprisingly little resistance from university scholars. The articles in this anthology investigate the origins and rationales for the silent managerial revolution at Danish universities and its radical implications for the identity...

  15. An incremental procedure model for e-learning projects at universities

    Directory of Open Access Journals (Sweden)

    Pahlke, Friedrich

    2006-11-01

    Full Text Available E-learning projects at universities are produced under different conditions than in industry. The main characteristic of many university projects is that they are realized essentially single-handedly. In contrast, in private industry the different interdisciplinary skills necessary for the development of e-learning are typically supplied by a multimedia agency. A specific procedure tailored to use at universities is therefore required to help master the number and complexity of the tasks. In this paper an incremental procedure model is presented which describes the proceeding in every phase of the project. It allows a high degree of flexibility and emphasizes the didactic concept rather than the technical implementation. In the second part, we illustrate the practical use of the theoretical procedure model with the project "Online training in Genetic Epidemiology".

  16. Walking the Walk: Modeling Social Model and Universal Design in the Disabilities Office

    Science.gov (United States)

    Thornton, Melanie; Downs, Sharon

    2010-01-01

    Making the shift from the medical model of disability to the social model requires postsecondary disabilities offices to carefully examine and revise policies and procedures to reflect this paradigm shift, which gives them the credibility to work toward such change on the campus level. The process followed by one university is covered in-depth, as…

  17. Cosmological models - in which universe do we live

    International Nuclear Information System (INIS)

    Hartvigsen, Y.

    1976-01-01

    A general discussion of the present state of cosmological models is introduced with a brief presentation of the expanding universe theory, the red shift and Hubble's Law. Hubble's Constant lies between 30 and 105 km/sec/Mpc, and a value of 55 km/sec/Mpc is assumed in this article. The arguments for the big bang and steady state theories are presented and the reasons for the present acceptance of the former given. Friedmann models are briefly discussed, and the 'universe density', rho, the 'space curvature', k, and the 'cosmological constant', Λ, are presented. These are shown on the Stabell-Refsdal diagram, and the density parameter, sigma_0, and the retardation parameter, q_0, are related to Hubble's Constant. These parameters are then discussed and their values restricted such that the part of the Stabell-Refsdal diagram which is of interest may be defined. (JIW)

  18. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    and A G DE WET. Department of Mathematical Statistics, University of Port Elizabeth. Accepted: May 1978. ABSTRACT. Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both.

  19. Describing model of empowering managers by applying structural equation modeling: A case study of universities in Ardabil

    Directory of Open Access Journals (Sweden)

    Maryam Ghahremani Germi

    2015-06-01

    Full Text Available Empowerment is still on the agenda as a management concept and has become a widely used management term in the last decade or so. The purpose of this research was to describe a model of empowering managers by applying structural equation modeling (SEM) at Ardabil universities. Two hundred and twenty managers of Ardabil universities, including chancellors, managers, and vice presidents of education, research, and studies, participated in this study. Clear and challenging goals, evaluation of function, access to resources, and rewarding were investigated. The results indicated that the SEM designed for empowering managers at the university shows a good level of fit; the conceptual model was thus appropriate for the population under investigation. Among the variables, access to resources, with a factor loading of 88 per cent, was identified as the most influential variable, while evaluation of function, with a factor loading of 51 per cent, was recognized to have less effect. Results of the average ratings show that evaluation of function and access to resources, with coefficients of 2.62, stand at the first level and therefore had a great impact on managers' empowerment. The results of the analysis provided compelling evidence that the model of empowering managers was desirable at Ardabil universities.

  20. Universal model of finite Reynolds number turbulent flow in channels and pipes

    NARCIS (Netherlands)

    L'vov, V.S.; Procaccia, I.; Rudenko, O.

    2008-01-01

    In this Letter, we suggest a simple and physically transparent analytical model of pressure driven turbulent wall-bounded flows at high but finite Reynolds numbers Re. The model provides an accurate quantitative description of the profiles of the mean-velocity and Reynolds stresses (second order

  1. Universe before Planck time: A quantum gravity model

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1983-01-01

    A model for quantum gravity can be constructed by treating the conformal degree of freedom of spacetime as a quantum variable. An isotropic, homogeneous cosmological solution in this quantum gravity model is presented. The spacetime is nonsingular for all three possible values of three-space curvature and agrees with the classical solution on time scales larger than the Planck time scale. The possibility that quantum fluctuations created the matter in the universe is suggested.

  2. Predictive models for suicidal thoughts and behaviors among Spanish University students: rationale and methods of the UNIVERSAL (University & mental health) project.

    Science.gov (United States)

    Blasco, Maria Jesús; Castellví, Pere; Almenara, José; Lagares, Carolina; Roca, Miquel; Sesé, Albert; Piqueras, José Antonio; Soto-Sanz, Victoria; Rodríguez-Marín, Jesús; Echeburúa, Enrique; Gabilondo, Andrea; Cebrià, Ana Isabel; Miranda-Mendizábal, Andrea; Vilagut, Gemma; Bruffaerts, Ronny; Auerbach, Randy P; Kessler, Ronald C; Alonso, Jordi

    2016-05-04

    Suicide is a leading cause of death among young people. While suicide prevention is considered a research and intervention priority, longitudinal data are needed to identify risk and protective factors associated with suicidal thoughts and behaviors. Here we describe the UNIVERSAL (University and Mental Health) project, whose aims are to: (1) test the prevalence and 36-month incidence of suicidal thoughts and behaviors; and (2) identify relevant risk and protective factors associated with the incidence of suicidal thoughts and behaviors among university students in Spain. It is an ongoing multicenter, observational, prospective cohort study of first-year university students in 5 Spanish universities. Students will be assessed annually during a 36-month follow-up. The surveys will be administered through an online, secure web-based platform. A clinical reappraisal will be completed among a subsample of respondents. Suicidal thoughts and behaviors will be assessed with the Self-Injurious Thoughts and Behaviors Interview (SITBI) and the Columbia-Suicide Severity Rating Scale (C-SSRS). Risk and protective factors will include: mental disorders, measured with the Composite International Diagnostic Interview version 3.0 (CIDI 3.0) and Screening Scales (CIDI-SC) and the Epi-Q Screening Survey (EPI-Q-SS); socio-demographic variables; self-perceived health status; health behaviors; well-being; substance use disorders; service use and treatment. The UNIVERSAL project is part of the International College Surveys initiative, which is a core project within the World Mental Health consortium. Lifetime and 12-month prevalence will be calculated for suicide ideation, plans and attempts. Cumulative incidence of suicidal thoughts and behaviors, and of mental disorders, will be measured using the actuarial method. Risk and protective factors of suicidal thoughts and behaviors will be analyzed with Cox proportional hazards models. 
The study will provide valid, innovative and useful data for developing
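The actuarial (life-table) estimate mentioned for cumulative incidence can be sketched as follows (a generic implementation, not the project's analysis code; interval width and horizon are illustrative):

```python
import numpy as np

def actuarial_cumulative_incidence(time, event, interval=12.0, horizon=36.0):
    """Life-table (actuarial) cumulative incidence.
    time: months to onset or censoring; event: 1 = onset observed, 0 = censored.
    Within each interval, censored subjects count as half at risk."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    surv = 1.0
    for start in np.arange(0.0, horizon, interval):
        end = start + interval
        at_risk = np.sum(time >= start)
        d = np.sum((time >= start) & (time < end) & (event == 1))
        c = np.sum((time >= start) & (time < end) & (event == 0))
        n_eff = at_risk - c / 2.0          # actuarial adjustment for censoring
        if n_eff > 0:
            surv *= 1.0 - d / n_eff
    return 1.0 - surv
```

With four students, onsets at months 5 and 20 and two censored at month 40, the 36-month cumulative incidence is 0.5.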

  3. Analysis of genetic effects of nuclear-cytoplasmic interaction on quantitative traits: genetic model for diploid plants.

    Science.gov (United States)

    Han, Lide; Yang, Jian; Zhu, Jun

    2007-06-01

    A genetic model was proposed for simultaneously analyzing genetic effects of nuclear, cytoplasm, and nuclear-cytoplasmic interaction (NCI) as well as their genotype by environment (GE) interaction for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.

  4. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  5. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    Science.gov (United States)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on a ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and its stress condition. The proposed model is applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation and could provide a useful reference for quantitative stress measurement using ML sensors in general.

  6. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong; Zhao, Weishu; Chang, Frank; Dyer, Steve

    2013-01-01

    Conventional wormhole propagation models largely ignore the impact of reaction products. When implemented in a job design, this can result in significant errors in the treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects; this is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments, where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base for scaling up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  7. [The strategic research areas of a University Hospital: proposal of a quali-quantitative method].

    Science.gov (United States)

    Iezzi, Elisa; Ardissino, Diego; Ferrari, Carlo; Vitale, Marco; Caminiti, Caterina

    2018-02-01

    This work aimed to objectively identify the main research areas at the University Hospital of Parma. To this end, a multidisciplinary working group, comprising clinicians, researchers, and hospital management, was formed to develop a shared quali-quantitative method. Easily retrievable performance indicators were selected from the literature (concerning bibliometric data and grant acquisition), and a scoring system was developed to assign weights to each indicator. Subsequently, Research Team Leaders were identified from the hospital's "Research Plan", a document produced every three years which contains information, provided by the health care professionals themselves, on the main research themes carried out at each Department, the staff involved and the available resources. The selected performance indicators were measured for each Team Leader and scores assigned, thus creating a ranking list. Through analysis of the research themes of the top Team Leaders, the Working Group identified the following five strategic research areas: (a) personalized treatment in oncology and hematology; (b) chronicization mechanisms in immune-mediated diseases; (c) old and new risk factors for cardiovascular diseases; (d) nutritional disorders, metabolic and chronic-degenerative diseases; (e) molecular diagnostic and predictive markers. We have developed an objective method to identify a hospital's main research areas. Its application can guide resource allocation and can offer ways to value the work of professionals involved in research.

  8. Cloud Computing Adoption Model for Universities to Increase ICT Proficiency

    Directory of Open Access Journals (Sweden)

    Safiya Okai

    2014-08-01

    Full Text Available Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities ideal in a typical university, in line with advances in technology and the growing dependence on IT. This is mainly due to the high cost involved in providing and maintaining the needed hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services as well as availability whenever and wherever needed, at reduced costs, with users paying only as much as they consume through the services of cloud service providers. The cloud technology reduces complexity while increasing the speed and quality of the IT services provided; however, despite these benefits, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to this technology. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced, and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for the successful adoption of cloud computing by universities. The model was tested in one of the universities and found to be both useful and appropriate for adopting cloud computing at university level.

  9. Using Predictive Modelling to Identify Students at Risk of Poor University Outcomes

    Science.gov (United States)

    Jia, Pengfei; Maloney, Tim

    2015-01-01

    Predictive modelling is used to identify students at risk of failing their first-year courses and not returning to university in the second year. Our aim is twofold. Firstly, we want to understand the factors that lead to poor first-year experiences at university. Secondly, we want to develop simple, low-cost tools that would allow universities to…
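A minimal version of such a predictive model can be sketched with logistic regression on hypothetical first-year indicators (the feature values and data below are invented for illustration; any GLM library would serve equally well):

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Logistic regression fit by plain gradient descent (illustrative only)."""
    X = np.c_[np.ones(len(X)), np.asarray(X, float)]   # prepend intercept column
    w = np.zeros(X.shape[1])
    y = np.asarray(y, float)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def risk(w, X):
    """Predicted probability of the poor outcome (e.g., failing first year)."""
    X = np.c_[np.ones(len(X)), np.asarray(X, float)]
    return 1.0 / (1.0 + np.exp(-X @ w))
```

Trained on early-semester indicators (such as first-semester GPA), the fitted probabilities can be thresholded to flag students for low-cost early intervention.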

  10. A Model of Nonsingular Universe

    Directory of Open Access Journals (Sweden)

    Changjun Gao

    2012-07-01

    Full Text Available In the background of a Friedmann–Robertson–Walker Universe, there exists Hawking radiation coming from the cosmic apparent horizon due to quantum effects. Although Hawking radiation can be safely neglected in the late-time evolution of the universe, it plays an important role in its very early stage. In view of this point, we identify the temperature in the scalar field potential with the Hawking temperature of the cosmic apparent horizon. We then find a nonsingular universe sourced by the temperature-dependent scalar field. We find that the universe could be created from a de Sitter phase which has the Planck energy density; thus the Big-Bang singularity is avoided.
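The identification used in this abstract rests on the standard apparent-horizon relations for an FRW universe (textbook relations, not taken from the paper itself):

```latex
\tilde r_A = \frac{1}{\sqrt{H^2 + k/a^2}}, \qquad
T_h = \frac{1}{2\pi \tilde r_A}
\;\xrightarrow{\;k=0\;}\; T_h = \frac{H}{2\pi},
```

so in a de Sitter phase, where H is constant, T_h reduces to the usual Gibbons–Hawking temperature of the de Sitter horizon.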

  11. Gravitino in the early Universe. A model of extra-dimension and a model of dark matter

    International Nuclear Information System (INIS)

    Gherson, D.

    2007-10-01

    This work can be related to Horava-Witten M-theory, in which the Universe could appear 5-dimensional at a stage of its evolution, and also to theories of Baryogenesis through Leptogenesis, which imply high reheating temperatures after Inflation. The cosmological model studied is set within the framework of a 5-dimensional supergravity with the extra dimension compactified on an orbifold circle, where the matter and gauge fields are located on one of the two branes localised at the orbifold fixed points and where the supergravity fields can propagate in all the spatial dimensions. In the model, the Dark Matter is made of neutralinos, the neutralino being assumed to be the lightest supersymmetric particle. We have shown that there are curves of constraint between the size of the extra dimension and the reheating temperature of the Universe after Inflation. The constraints come from the measurements of the amount of Dark Matter in the Universe and from the model of the Big Bang Nucleosynthesis of light elements. (author)

  12. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    Science.gov (United States)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit to EOC patients, with or without bevacizumab-based chemotherapy treatment, using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association with either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.

  13. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by
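The EM machinery behind such a mixture-model QTL analysis can be illustrated with a two-component Gaussian mixture in one dimension (a generic sketch; the actual QTL model conditions the mixture proportions on flanking-marker genotypes):

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """EM for a two-component Gaussian mixture with a common variance."""
    x = np.asarray(x, float)
    mu1, mu2 = x.min(), x.max()            # crude but sufficient initialization
    w, var = 0.5, x.var()                  # w = mixing proportion of component 2
    for _ in range(iters):
        # E-step: posterior probability that each observation came from component 2
        n1 = (1.0 - w) * np.exp(-(x - mu1) ** 2 / (2.0 * var))
        n2 = w * np.exp(-(x - mu2) ** 2 / (2.0 * var))
        r = n2 / (n1 + n2)
        # M-step: re-estimate proportion, means, and the common variance
        w = r.mean()
        mu1 = np.sum((1.0 - r) * x) / np.sum(1.0 - r)
        mu2 = np.sum(r * x) / np.sum(r)
        var = np.sum((1.0 - r) * (x - mu1) ** 2 + r * (x - mu2) ** 2) / len(x)
    return mu1, mu2, w
```

In the QTL setting the two component means correspond to the alternative QTL genotype classes, and selective genotyping changes which observations contribute marker information, not the EM updates themselves.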

  14. Reconstructing an interacting holographic polytropic gas model in a non-flat FRW universe

    International Nuclear Information System (INIS)

    Karami, K; Abdolmaleki, A

    2010-01-01

    We study the correspondence between the interacting holographic dark energy and the polytropic gas model of dark energy in a non-flat FRW universe. This correspondence allows one to reconstruct the potential and the dynamics for the scalar field of the polytropic model, which describe accelerated expansion of the universe.

  15. Reconstructing an interacting holographic polytropic gas model in a non-flat FRW universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K; Abdolmaleki, A, E-mail: KKarami@uok.ac.i [Department of Physics, University of Kurdistan, Pasdaran Street, Sanandaj (Iran, Islamic Republic of)

    2010-05-01

    We study the correspondence between the interacting holographic dark energy and the polytropic gas model of dark energy in a non-flat FRW universe. This correspondence allows one to reconstruct the potential and the dynamics for the scalar field of the polytropic model, which describe accelerated expansion of the universe.

  16. EXPENSES FORECASTING MODEL IN UNIVERSITY PROJECTS PLANNING

    Directory of Open Access Journals (Sweden)

    Sergei A. Arustamov

    2016-11-01

    Full Text Available The paper presents a mathematical model of cash flows in project funding. We describe the different types of expenses linked to university project activities. The problems of project budgeting that contribute the most uncertainty have been identified. As an example of the model implementation, we consider the calculation of vacation allowance expenses for project participants. We define the problems of forecasting funds to be reserved: calculation based on the methodology established by the Ministry of Education and Science, calculation according to the vacation schedule, and prediction of the most probable amount. A stochastic model for vacation allowance expenses has been developed. We have proposed methods and solutions that increase the accuracy of forecasting funds to be reserved, based on 2015 data.
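A stochastic forecast of the funds to reserve can be sketched as a Monte Carlo simulation (all parameter values below are hypothetical, not taken from the paper; the reserve is read off as an upper quantile of the simulated totals):

```python
import numpy as np

def reserve_estimate(n_staff=12, p_vacation=0.6, mean_allow=900.0, sd_allow=250.0,
                     quantile=0.95, trials=20000, seed=42):
    """Monte Carlo forecast of vacation-allowance expenses for one planning period.
    Each participant takes vacation with probability p_vacation; the allowance per
    person is drawn from a truncated normal distribution."""
    rng = np.random.default_rng(seed)
    takes = rng.random((trials, n_staff)) < p_vacation
    allow = np.clip(rng.normal(mean_allow, sd_allow, (trials, n_staff)), 0.0, None)
    totals = (takes * allow).sum(axis=1)
    return float(np.quantile(totals, quantile))
```

Reserving the 95th-percentile total rather than the expected total trades idle funds for a low probability of a budget shortfall.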

  17. Modelling the implications of moving towards universal coverage in Tanzania.

    Science.gov (United States)

    Borghi, Josephine; Mtei, Gemini; Ally, Mariam

    2012-03-01

A model was developed to assess the impact of possible moves towards universal coverage in Tanzania over a 15-year time frame. Three scenarios were considered: maintaining the current situation ('the status quo'); expanded health insurance coverage (the estimated maximum achievable coverage in the absence of premium subsidies, coverage restricted to those who can pay); universal coverage to all (government revenues used to pay the premiums for the poor). The model estimated the costs of delivering public health services and all health services to the population as a proportion of Gross Domestic Product (GDP), and forecast revenue from user fees and insurance premiums. Under the status quo, financial protection is provided to 10% of the population through health insurance schemes, with the remaining population benefiting from subsidized user charges in public facilities. Seventy-six per cent of the population would benefit from financial protection through health insurance under the expanded coverage scenario, and 100% of the population would receive such protection through a mix of insurance cover and government funding under the universal coverage scenario. The expanded and universal coverage scenarios have a significant effect on utilization levels, especially for public outpatient care. Universal coverage would require an initial doubling in the proportion of GDP going to the public health system. Government health expenditure would increase to 18% of total government expenditure. The results are sensitive to the cost of health system strengthening, the level of real GDP growth, provider reimbursement rates and administrative costs. Promoting greater cross-subsidization between insurance schemes would provide sufficient resources to finance universal coverage. Alternatively, greater tax funding for health could be generated through an increase in the rate of Value-Added Tax (VAT) or expanding the income tax base. The feasibility and sustainability of efforts to

  18. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    Science.gov (United States)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporate assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  19. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.

  20. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten people's health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used for the quantitative determination of AO in combination with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz-TDS in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for input interval selection, and improved the accuracy of the detection results. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
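The 2DCOS step described in this record can be illustrated with the synchronous correlation spectrum of Noda's generalized 2D correlation formalism; the sketch below is an assumption-laden reconstruction (the paper's exact interval-selection criterion is not given here), with random toy data standing in for THz absorbance measured at a series of AO concentrations.

```python
import numpy as np

def synchronous_2dcos(spectra):
    """Synchronous 2D correlation spectrum (Noda's formalism).

    spectra: (m, n) array of m perturbation steps (here: AO concentrations)
             by n spectral points (THz frequencies).
    Returns an (n, n) symmetric map; high-intensity regions mark frequency
    pairs whose absorbances vary together as concentration changes, which
    is the kind of criterion used to choose PLSR input intervals.
    """
    dynamic = spectra - spectra.mean(axis=0)   # mean-centred "dynamic" spectra
    m = spectra.shape[0]
    return dynamic.T @ dynamic / (m - 1)

# Toy example: 5 concentrations x 8 spectral points
rng = np.random.default_rng(0)
phi = synchronous_2dcos(rng.random((5, 8)))
```

The diagonal of `phi` is the autocorrelation (variance) at each frequency; off-diagonal peaks indicate correlated spectral regions.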

  1. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

In this paper, a model and tool are proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  2. The Emerging Workforce of International University Student Workers: Injury Experience in an Australian University

    Directory of Open Access Journals (Sweden)

    Yahya Thamrin

    2018-03-01

International university students are a growing section of the workforce and are thought to be at greater risk of injury. Qualitative studies have highlighted vulnerabilities, but there is a shortage of quantitative research exploring the injury experience and associated risk factors of this emerging issue. In this study, a total of 466 university student workers across a range of study programs in a single Australian university completed an online survey, with questions relating to their background, working experience, training and injury experience. Risk factors for injury were explored in a multivariate statistical model. More than half had not received any safety training before they started work, and 10% reported having had a work injury. About half of these injuries occurred after training. Statistically significant risk factors for injury included working more than 20 h per week (adjusted odds ratio (AOR) 2.20; 95% CI 1.03–4.71) and lack of confidence in discussing safety issues (AOR 2.17; 95% CI 1.13–4.16). The findings suggest the need for a more engaging and effective approach to safety education and a limit on working hours. This situation is a moral challenge for universities, in that they are effectively sponsoring young workers in the community. It is recommended that longitudinal studies of international student workers be conducted.

  3. Universal Regularizers For Robust Sparse Coding and Modeling

    OpenAIRE

    Ramirez, Ignacio; Sapiro, Guillermo

    2010-01-01

    Sparse data models, where data is assumed to be well represented as a linear combination of a few elements from a dictionary, have gained considerable attention in recent years, and their use has led to state-of-the-art results in many signal and image processing tasks. It is now well understood that the choice of the sparsity regularization term is critical in the success of such models. Based on a codelength minimization interpretation of sparse coding, and using tools from universal coding...

  4. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid binary chromosome of NAEP has three parts: the first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, the partial least squares with full spectrum, the partial least squares combined with genetic algorithm, the uninformative variable elimination method, the backpropagation neural network with full spectrum, the backpropagation neural network combined with genetic algorithm, and the proposed method are all used for building the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly and is a practical spectral analysis tool.
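The three-part hybrid binary chromosome described in this record can be sketched as a simple decoder; the bit widths and encoding below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def decode_chromosome(bits, n_wavelengths, topo_bits=4):
    """Illustrative decoding of the hybrid binary chromosome:
    part 1 -> network topology (here: number of hidden nodes),
    part 2 -> wavelength-selection mask over the spectral data,
    part 3 -> mutation parameters of NAEP.
    The 4-bit topology field and trailing mutation bits are assumptions."""
    topo = int("".join(map(str, bits[:topo_bits])), 2) + 1   # at least 1 hidden node
    mask = np.array(bits[topo_bits:topo_bits + n_wavelengths], dtype=bool)
    mut = bits[topo_bits + n_wavelengths:]                   # mutation-parameter bits
    return topo, mask, mut

# 4 topology bits + 6 wavelength bits + 2 mutation bits
bits = [1, 0, 1, 0,  1, 1, 0, 1, 0, 1,  0, 1]
topo, mask, mut = decode_chromosome(bits, n_wavelengths=6)
```

A fitness function would then train a backpropagation network with `topo` hidden nodes on only the wavelengths where `mask` is true.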

  5. Game Based Learning (GBL) adoption model for universities: cesim ...

    African Journals Online (AJOL)

Game Based Learning (GBL) adoption model for universities: cesim simulation. ... The global market has escalated the need of Game Based Learning (GBL) to offer a wide range of courses since there is a ...

  6. Modeling Environmental Literacy of Malaysian Pre-University Students

    Science.gov (United States)

    Shamuganathan, Sheila; Karpudewan, Mageswary

    2015-01-01

    In this study attempt was made to model the environmental literacy of Malaysian pre-university students enrolled in a matriculation college. Students enrolled in the matriculation colleges in Malaysia are the top notch students in the country. Environmental literacy of this group is perceived important because in the future these students will be…

  7. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  8. A 3 + 1 Regge calculus model of the Taub universe

    International Nuclear Information System (INIS)

    Tuckey, P.A.

    1988-01-01

    The Piran and Williams [1986 Phys. Rev. D 33,1622] second-order formulation of 3 + 1 Regge calculus is used to calculate the evolution of a model of the Taub universe. The model displays qualitatively the correct behaviour, thereby giving some verification of the 3 + 1 formulation. (author)

  9. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various

  10. [Diagnostic value of quantitative pharmacokinetic parameters and relative quantitative pharmacokinetic parameters in breast lesions with dynamic contrast-enhanced MRI].

    Science.gov (United States)

    Sun, T T; Liu, W H; Zhang, Y Q; Li, L H; Wang, R; Ye, Y Y

    2017-08-01

Objective: To explore the differential diagnostic value of dynamic contrast-enhanced MRI quantitative pharmacokinetic parameters and relative quantitative pharmacokinetic parameters in breast lesions. Methods: Retrospective analysis of 255 patients (262 breast lesions) detected by clinical palpation, ultrasound or full-field digital mammography, all pathologically confirmed in Zhongda Hospital, Southeast University, from May 2012 to May 2016. A 3.0 T MRI scanner was used to obtain the quantitative MR pharmacokinetic parameters: volume transfer constant (K(trans)), exchange rate constant (k(ep)) and extravascular extracellular volume fraction (V(e)). The same quantitative pharmacokinetic parameters were measured in normal gland tissue on the same side and at the same level as the lesion, and the relative pharmacokinetic parameters rK(trans), rk(ep) and rV(e) were calculated from them. The diagnostic value of the two sets of pharmacokinetic parameters in the differential diagnosis of benign and malignant breast lesions was explored using receiver operating characteristic curves and a logistic regression model. Results: (1) There were significant differences between benign and malignant lesions in K(trans) and k(ep) (t = 15.489 and 15.022, respectively, P < 0.05), but not in V(e) (P > 0.05). The areas under the ROC curve (AUC) of K(trans), k(ep) and V(e) between malignant and benign lesions were 0.933, 0.948 and 0.387; the sensitivities of K(trans), k(ep) and V(e) were 77.1%, 85.0% and 51.0%, and the specificities were 96.3%, 93.6% and 60.8% for the differential diagnosis of breast lesions when the maximum Youden's index was taken as the cut-off. (2) There were significant differences between benign and malignant lesions in rK(trans), rk(ep) and rV(e) (t = 14.177, 11.726 and 2.477, respectively, P < 0.05), and there was no significant difference between the prediction probability of the quantitative pharmacokinetic parameters and that of the relative quantitative pharmacokinetic parameters (Z = 0.867, P = 0.195). Conclusion: There was no significant
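The cut-off selection used in this record (maximum Youden's index on a ROC analysis) can be sketched as follows; the synthetic values and the assumption that higher parameter values indicate malignancy (consistent with what is reported for K(trans) and k(ep)) are illustrative.

```python
import numpy as np

def youden_cutoff(values, labels):
    """Pick the cut-off maximising Youden's index J = sensitivity + specificity - 1.
    values: continuous parameter (e.g. K_trans); labels: 1 = malignant, 0 = benign.
    Assumes higher values indicate malignancy."""
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut                   # predicted malignant
        sens = np.mean(pred[labels == 1])      # true-positive rate
        spec = np.mean(~pred[labels == 0])     # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Synthetic, perfectly separable example
vals = np.array([0.1, 0.2, 0.3, 0.8, 0.9, 1.0])
labs = np.array([0, 0, 0, 1, 1, 1])
cut, j = youden_cutoff(vals, labs)
```

On real data, the sensitivity and specificity at the chosen cut-off are what the abstract reports for each parameter.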

  11. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  12. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    Science.gov (United States)

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  13. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  14. Discussions on the non-equilibrium effects in the quantitative phase field model of binary alloys

    International Nuclear Information System (INIS)

    Zhi-Jun, Wang; Jin-Cheng, Wang; Gen-Cang, Yang

    2010-01-01

All quantitative phase field models try to get rid of the artificial factors of solutal drag, interface diffusion and interface stretch in the diffuse interface. These artificial non-equilibrium effects, introduced with the diffuse interface, are analysed on the basis of the thermodynamic status across the diffuse interface in the quantitative phase field model of binary alloys. Results indicate that the non-equilibrium effects are related to the negative driving force in the local region on the solid side of the diffuse interface. The negative driving force results from the fact that the phase field model is derived from an equilibrium condition but is used to simulate the non-equilibrium solidification process. The interface-thickness dependence of the non-equilibrium effects and its restriction on large-scale simulation are also discussed. (cross-disciplinary physics and related areas of science and technology)

  15. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  16. Quantitative model for the blood pressure‐lowering interaction of valsartan and amlodipine

    Science.gov (United States)

    Heo, Young‐A; Holford, Nick; Kim, Yukyung; Son, Mijeong

    2016-01-01

    Aims The objective of this study was to develop a population pharmacokinetic (PK) and pharmacodynamic (PD) model to quantitatively describe the antihypertensive effect of combined therapy with amlodipine and valsartan. Methods PK modelling was used with data collected from 48 healthy volunteers receiving a single dose of combined formulation of 10 mg amlodipine and 160 mg valsartan. Systolic (SBP) and diastolic blood pressure (DBP) were recorded during combined administration. SBP and DBP data for each drug alone were gathered from the literature. PKPD models of each drug and for combined administration were built with NONMEM 7.3. Results A two‐compartment model with zero order absorption best described the PK data of both drugs. Amlodipine and valsartan monotherapy effects on SBP and DBP were best described by an I max model with an effect compartment delay. Combined therapy was described using a proportional interaction term as follows: (D1 + D2) +ALPHA×(D1 × D2). D1 and D2 are the predicted drug effects of amlodipine and valsartan monotherapy respectively. ALPHA is the interaction term for combined therapy. Quantitative estimates of ALPHA were −0.171 (95% CI: −0.218, −0.143) for SBP and −0.0312 (95% CI: −0.07739, −0.00283) for DBP. These infra‐additive interaction terms for both SBP and DBP were consistent with literature results for combined administration of drugs in these classes. Conclusion PKPD models for SBP and DBP successfully described the time course of the antihypertensive effects of amlodipine and valsartan. An infra‐additive interaction between amlodipine and valsartan when used in combined administration was confirmed and quantified. PMID:27504853
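The interaction model quoted in this record reduces to one line of arithmetic. The sketch below uses the reported ALPHA point estimate for SBP; the monotherapy effects D1 and D2 are hypothetical placeholders, since the abstract does not report them as single numbers.

```python
def combined_effect(d1, d2, alpha):
    """Combined blood-pressure-lowering effect per the abstract's model:
    (D1 + D2) + ALPHA * (D1 * D2). A negative ALPHA is infra-additive,
    i.e. the combination lowers BP less than the sum of the monotherapies."""
    return (d1 + d2) + alpha * (d1 * d2)

# Hypothetical monotherapy SBP drops (mmHg) with the reported ALPHA = -0.171
sbp_drop = combined_effect(10.0, 8.0, -0.171)   # less than the additive 18.0
```

With ALPHA = 0 the model recovers simple additivity, which makes the sign and size of the estimated interaction terms easy to interpret.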

  17. Entropy - Some Cosmological Questions Answered by Model of Expansive Nondecelerative Universe

    Directory of Open Access Journals (Sweden)

    Miroslav Sukenik

    2003-01-01

The paper summarizes the background of the Expansive Nondecelerative Universe model and its potential to offer answers to some open cosmological questions related to entropy. Three problems are examined in detail: Hawking's phenomenon of black hole evaporation, the maximum entropy of the Universe during its evolution, and the time evolution of specific entropy.

  18. ICT Adoption Impact on Students’ Academic Performance: Evidence from Saudi Universities

    Directory of Open Access Journals (Sweden)

    Wael Sh. Basri

    2018-01-01

This study explores the adoption of information communication technology (ICT) by universities and the impact it makes on university students' academic performance. The study also examines the moderating effects of gender, GPA, and student major on the relationship between ICT and academic achievement. Using a quantitative research approach and a sample size of 1000 students, data were collected about ICT adoption in universities and the relative performance of students belonging to four Saudi universities. Structural equation modelling was chosen to determine the validity of the research model, using the Analysis of Moment Structures (AMOS) software for structural equation modelling and path analysis. The findings reveal that a relationship exists between ICT adoption and academic performance in a conservative environment. An additional finding was that ICT adoption improved the performance of female students more than that of male students, while students' IT major was found to have no impact on academic achievement. A discussion of the findings, limitations, and suggestions for future research is provided, together with the implications of the current study for existing knowledge.

  19. Exploring the common molecular basis for the universal DNA mutation bias: Revival of Loewdin mutation model

    International Nuclear Information System (INIS)

    Fu, Liang-Yu; Wang, Guang-Zhong; Ma, Bin-Guang; Zhang, Hong-Yu

    2011-01-01

Highlights: → There exists a universal G:C → A:T mutation bias in the three domains of life. → This universal mutation bias has not been sufficiently explained. → A DNA mutation model proposed by Loewdin 40 years ago offers a common explanation. -- Abstract: Recently, numerous genome analyses revealed the existence of a universal G:C → A:T mutation bias in bacteria, fungi, plants and animals. To explore the molecular basis for this mutation bias, we examined the three well-known DNA mutation models, i.e., the oxidative damage model, the UV-radiation damage model and the CpG hypermutation model. It was revealed that these models cannot sufficiently explain the universal mutation bias. We therefore resorted to a DNA mutation model proposed by Loewdin 40 years ago, which is based on inter-base double proton transfers (DPT). Since DPT is a fundamental and spontaneous chemical process that occurs much more frequently within GC pairs than AT pairs, the Loewdin model offers a common explanation for the observed universal mutation bias and thus has broad biological implications.

  20. Universal Scaling and Critical Exponents of the Anisotropic Quantum Rabi Model

    Science.gov (United States)

    Liu, Maoxin; Chesi, Stefano; Ying, Zu-Jian; Chen, Xiaosong; Luo, Hong-Gang; Lin, Hai-Qing

    2017-12-01

    We investigate the quantum phase transition of the anisotropic quantum Rabi model, in which the rotating and counterrotating terms are allowed to have different coupling strengths. The model interpolates between two known limits with distinct universal properties. Through a combination of analytic and numerical approaches, we extract the phase diagram, scaling functions, and critical exponents, which determine the universality class at finite anisotropy (identical to the isotropic limit). We also reveal other interesting features, including a superradiance-induced freezing of the effective mass and discontinuous scaling functions in the Jaynes-Cummings limit. Our findings are extended to the few-body quantum phase transitions with N >1 spins, where we expose the same effective parameters, scaling properties, and phase diagram. Thus, a stronger form of universality is established, valid from N =1 up to the thermodynamic limit.

  1. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Not only the nature of these interactions, but more fundamentally also the structure of these heterogeneous polyphenolic molecules are not completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins on the basis of tannin model compounds employing an in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundations for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  2. Research universities for the 21st century

    Energy Technology Data Exchange (ETDEWEB)

    Gover, J. [Sandia National Labs., Albuquerque, NM (United States); Huray, P.G. [Univ. of South Carolina, Columbia, SC (United States)

    1998-05-01

The 'public outcomes' from research universities are educated students and research that extends the frontiers of knowledge. Measures of these 'public outcomes' are inadequate to permit either research or education consumers to select research universities based on quantitative performance data. Research universities annually spend over $20 billion on research; 60% of these funds are provided by Federal sources. Federal funding for university research has recently grown at an annual rate near 6% during a time period when other performers of Federal research have experienced real funding cuts. Ten universities receive about 25% of the Federal funds spent on university research. Numerous studies of US research universities are reporting storm clouds. Concerns include balancing research and teaching, the narrow focus of engineering education, college costs, continuing education, and public funding of foreign student education. The absence of research on the 'public outcomes' from university research results in opinion, politics, and mythology forming the basis of too many decisions. Therefore, the authors recommend studies of other nations' research universities, studies of various economic models of university research, analysis of the peer review process and how well it identifies the most capable research practitioners and at what cost, and studies of research university ownership of intellectual property that can lead to increased 'public outcomes' from publicly-funded research performed by research universities. They advocate two practices that could increase the 'public outcomes' from university research. These are the development of science roadmaps that link science research to 'public outcomes' and 'public outcome' metrics. Changes in the university research culture and expanded use of the Internet could also lead to increased 'public outcomes'. They recommend the use of tax incentives to encourage companies to develop research partnerships with research

  3. Quantifying the levitation picture of extended states in lattice models

    OpenAIRE

    Pereira, Ana. L. C.; Schulz, P. A.

    2002-01-01

    The behavior of extended states is quantitatively analyzed for two-dimensional lattice models. A levitation picture is established for both white-noise and correlated disorder potentials. In a continuum limit window of the lattice models we find simple quantitative expressions for the extended states levitation, suggesting an underlying universal behavior. On the other hand, these results point out that the quantum Hall phase diagrams may be disorder dependent.

  4. Cosmological Imprints of a Generalized Chaplygin Gas Model for the Early Universe

    Energy Technology Data Exchange (ETDEWEB)

    Bouhmadi-Lopez, Mariam; /Lisbon, CENTRA; Chen, Pisin; /Taiwan, Natl. Taiwan U. /KIPAC, Menlo Park /SLAC; Liu, Yen-Wei; /Taiwan, Natl. Taiwan U.

    2012-06-06

    We propose a phenomenological model for the early universe in which there is a smooth transition between an early quintessence phase and a radiation-dominated era. The matter content is modeled by an appropriately modified Chaplygin gas for the early universe. We constrain the model observationally by mapping the primordial power spectrum of the scalar perturbations to the latest WMAP7 data. We also compute the spectrum of primordial gravitational waves as would be measured today. We show that the high-frequency region of the spectrum depends on the free parameter of the model and, most importantly, that this region of the spectrum can be within the reach of future gravitational wave detectors.

  5. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    Full Text Available There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
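    The additive log-ratio treatment of the mud/sand/gravel fractions described above can be sketched as follows. The predictor matrix, sample size and hyperparameters here are synthetic stand-ins for illustration, not the study's actual data or settings (assumes NumPy and scikit-learn):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def alr(comp):
    # additive log-ratio: two log-ratios with gravel as the common denominator
    return np.log(comp[:, :2] / comp[:, 2:3])

def alr_inv(logratios):
    # map predictions back to (mud, sand, gravel) fractions summing to one
    expy = np.exp(logratios)
    denom = 1.0 + expy.sum(axis=1, keepdims=True)
    return np.hstack([expy / denom, 1.0 / denom])

rng = np.random.default_rng(0)
X = rng.random((200, 4))                     # stand-in environmental predictors
comp = rng.dirichlet([2.0, 5.0, 3.0], 200)   # synthetic mud/sand/gravel fractions

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, alr(comp))                      # one forest, two log-ratio targets
pred = alr_inv(model.predict(X))             # predicted sediment composition
```

    Working on the log-ratio scale guarantees that the back-transformed predictions are valid compositions (non-negative, summing to one), which a forest fitted directly to the raw fractions would not.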

  6. Geometrothermodynamic model for the evolution of the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Gruber, Christine; Quevedo, Hernando, E-mail: christine.gruber@correo.nucleares.unam.mx, E-mail: quevedo@nucleares.unam.mx [Instituto de Ciencias Nucleares, Universidad Nacional Autónoma de México, AP 70543, México, DF 04510 (Mexico)

    2017-07-01

    Using the formalism of geometrothermodynamics to derive a fundamental thermodynamic equation, we construct a cosmological model in the framework of relativistic cosmology. In a first step, we describe a system without thermodynamic interaction, and show it to be equivalent to the standard ΛCDM paradigm. The second step includes thermodynamic interaction and produces a model consistent with the main features of inflation. With the proposed fundamental equation we are thus able to describe all the known epochs in the evolution of our Universe, starting from the inflationary phase.

  7. Space-Time Uncertainty and Cosmology: a Proposed Quantum Model of the Universe [ 245Kb

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-10-01

    Full Text Available The paper introduces a cosmological model of the quantum universe. The aims of the model are (i) to identify the possible mechanism that governs the matter/antimatter ratio existing in the universe, (ii) to propose a reasonable growth mechanism of the universe, and (iii) to offer a possible explanation of the dark energy. The concept of space-time uncertainty, on which the present quantum approach is based, has been proven able to bridge quantum mechanics and relativity.

  8. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. From an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN controllers: POX, OpenDaylight, Floodlight, and Ryu. The test, whose outcomes are similar to those of traditional qualitative analysis, demonstrates that our approach yields specific security values for different controllers and presents more accurate results.

  9. The Analysis of Organizational Diagnosis Based on the Six Box Model in Universities

    Science.gov (United States)

    Hamid, Rahimi; Siadat, Sayyed Ali; Reza, Hoveida; Arash, Shahin; Ali, Nasrabadi Hasan; Azizollah, Arbabisarjou

    2011-01-01

    Purpose: To analyze organizational diagnosis based on the six box model at universities. Research method: The research method was descriptive survey. The statistical population consisted of 1544 faculty members of universities, from which 218 persons were chosen as the sample through random stratified sampling. The research instrument was organizational…

  10. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative
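    The water-supply side of such hydraulic models ultimately rests on the Hagen-Poiseuille relation for laminar flow through a conduit. A minimal sketch follows; the function name, the per-MPa unit convention, and the default water viscosity are illustrative assumptions, and the model applied to Asteroxylon adds anatomical detail well beyond this single-conduit equation:

```python
import math

def conduit_conductivity(radius_m, length_m=1.0, viscosity_pa_s=1.0e-3):
    # Hagen-Poiseuille conductance of one cylindrical conduit, expressed per
    # MPa of pressure drop (hence the 1e6 factor); viscosity defaults to
    # water at roughly 20 degrees C.
    return math.pi * radius_m ** 4 / (8.0 * viscosity_pa_s * length_m) * 1.0e6
```

    The fourth-power dependence on radius is why the abstract's comparison hinges on tracheid width: doubling the conduit radius raises conductivity sixteen-fold.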

  11. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability; probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering.

  12. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  13. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of alternative nuclear fuel cycle systems. A model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.
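    The electrical-circuit analogy can be illustrated with a toy calculation: motivation plays the role of an electromotive force, each barrier the role of a series resistance, and the resulting "current" is a relative risk measure. The barrier names and all numbers below are invented for illustration and are not the study's actual index values:

```python
def proliferation_index(motivation_emf, barrier_resistances):
    # Ohm's-law analogy: risk "current" = motivation EMF / total series resistance
    return motivation_emf / sum(barrier_resistances.values())

# hypothetical barrier resistances for two fuel cycle options
direct_disposal = {"safeguards": 2.0, "radiation_barrier": 1.5, "technical_difficulty": 1.0}
reprocessing = {"safeguards": 3.0, "radiation_barrier": 0.5, "technical_difficulty": 0.5}

# a high-motivation country with direct disposal vs. a low-motivation one reprocessing
risk_a = proliferation_index(9.0, direct_disposal)
risk_b = proliferation_index(2.0, reprocessing)
```

    With these made-up values the direct disposal option in the high-motivation country comes out riskier, mirroring the qualitative conclusion of the sensitivity analysis.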

  14. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of alternative nuclear fuel cycle systems. A model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.

  15. Universal Fragment Descriptors for Predicting Electronic and Mechanical Properties of Inorganic Crystals

    Science.gov (United States)

    Oses, Corey; Isayev, Olexandr; Toher, Cormac; Curtarolo, Stefano; Tropsha, Alexander

    Historically, materials discovery is driven by a laborious trial-and-error process. The growth of materials databases and emerging informatics approaches finally offer the opportunity to transform this practice into data- and knowledge-driven rational design, accelerating the discovery of novel materials exhibiting desired properties. Using data from the AFLOW repository of high-throughput ab-initio calculations, we have generated Quantitative Materials Structure-Property Relationship (QMSPR) models to predict critical materials properties, including the metal/insulator classification, band gap energy, and bulk modulus. The prediction accuracy obtained with these QMSPR models approaches that of the training data for virtually any stoichiometric inorganic crystalline material. We attribute the success and universality of these models to the construction of new materials descriptors, referred to as universal Property-Labeled Material Fragments (PLMF). This representation affords straightforward model interpretation in terms of simple heuristic design rules that could guide rational materials design. This proof-of-concept study demonstrates the power of materials informatics to dramatically accelerate the search for new materials.

  16. A Tuned Value Chain Model for University Based Public Research Organisation. Case Lut Cst.

    Directory of Open Access Journals (Sweden)

    Vesa Karvonen

    2012-12-01

    Full Text Available Porter's value chain model was introduced for strategic business purposes. During the last decades, universities and university-based institutes have also started to adopt practices from private business. A university-based institute is not an independent actor like a company, but its interest groups expect it to act as if it were. This article discusses the possibility of applying a tuned value chain to public research organisations (PROs), as well as the interactions of the tuned value chain model with an existing industrial network. The case study object is the Centre for Separation Technology (CST) at Lappeenranta University of Technology (LUT) in Finland.

  17. Modeling strength loss in wood by chemical composition. Part I, An individual component model for southern pine

    Science.gov (United States)

    J. E. Winandy; P. K. Lebow

    2001-01-01

    In this study, we develop models for predicting loss in bending strength of clear, straight-grained pine from changes in chemical composition. Although significant work needs to be done before truly universal predictive models are developed, a quantitative fundamental relationship between changes in chemical composition and strength loss for pine was demonstrated. In...

  18. A SWOT Analysis of the Integration of E-Learning at a University in Uganda and a University in Tanzania

    Science.gov (United States)

    Zhu, Chang; Justice Mugenyi, Kintu

    2015-01-01

    This research examines the strengths, weaknesses, opportunities and threats (SWOT) to integrating e-learning perceived by academic staff at a university in Uganda and a university in Tanzania. Mixed-methods research was used in which a main qualitative study was complemented by a quantitative method. The sample participants were academic staff…

  19. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each
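    For the classical twin design mentioned above, the simplest variance decomposition is Falconer's ACE estimate from monozygotic (MZ) and dizygotic (DZ) twin correlations; a minimal sketch, with illustrative correlations rather than data from the chapter:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's decomposition of trait variance from twin correlations:
    A = additive genetic, C = shared environment, E = unique environment."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability estimate
    c2 = 2.0 * r_dz - r_mz     # shared-environment component
    e2 = 1.0 - r_mz            # unique-environment component (plus measurement error)
    return a2, c2, e2

# illustrative twin correlations
a2, c2, e2 = falconer_ace(r_mz=0.8, r_dz=0.5)
```

    The full structural-equation treatment in the chapter generalizes this single-trait identity to multivariate, longitudinal and extended-family designs.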

  20. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  1. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  2. The large-scale peculiar velocity field in flat models of the universe

    International Nuclear Information System (INIS)

    Vittorio, N.; Turner, M.S.

    1986-10-01

    The inflationary Universe scenario predicts a flat Universe and both adiabatic and isocurvature primordial density perturbations with the Zel'dovich spectrum. The two simplest realizations, models dominated by hot or cold dark matter, seem to be in conflict with observations. Flat models are examined with two components of mass density, where one of the components is smoothly distributed, and the large-scale (≥10 h⁻¹ Mpc) peculiar velocity field for these models is considered. For the smooth component, relativistic particles, a relic cosmological term, and light strings are considered. At present the observational situation is unsettled; but, in principle, the large-scale peculiar velocity field is a very powerful discriminator between these different models. 61 refs

  3. Model of a multiverse providing the dark energy of our universe

    Science.gov (United States)

    Rebhan, E.

    2017-09-01

    It is shown that the dark energy presently observed in our universe can be regarded as the energy of a scalar field driving an inflation-like expansion of a multiverse, with ours being a subuniverse among other parallel universes. A simple model of this multiverse is elaborated: assuming closed space geometry, the origin of the multiverse can be explained by quantum tunneling from nothing; subuniverses are supposed to emerge from local fluctuations of separate inflation fields. The standard concept of tunneling from nothing is extended to the effect that, in addition to an inflationary scalar field, matter is also generated, and that the tunneling leads to an (unstable) equilibrium state. The cosmological principle is assumed to pertain from the origin of the multiverse until the first subuniverses emerge. With increasing age of the multiverse, its spatial curvature decays exponentially, so fast that, due to sharing the same space, the flatness problem of our universe resolves by itself. The dark energy density imprinted by the multiverse on our universe is time-dependent, but such that the ratio w = p/(ϱc²) of its pressure to its mass density (times c²) is time-independent and assumes a value -1 + 𝜖 with arbitrary 𝜖 > 0. 𝜖 can be chosen so small that the dark energy model of this paper fits the current observational data as well as the cosmological constant model does.

  4. Changing the Business Model of a Distance Teaching University

    NARCIS (Netherlands)

    Koper, Rob

    2014-01-01

    Reference: Koper, E.J.R. (2014) Changing the Business Model of a Distance Teaching University. In R. Huang, Kinshuk, Price, J.K. (eds.), ICT in Education in Global Context: emerging trends report 2013-2014, Lecture Notes in Educational Technology, Heidelberg: Springer Verlag, pp. 185-203 ISBN

  5. Expat University Professors' State of Psychological Well-Being and Academic Optimism towards University Task in UAE

    Directory of Open Access Journals (Sweden)

    Luis Guanzon Rile Jr.

    2015-06-01

    Full Text Available This study explored the state of psychological well-being and academic optimism in relation to university tasks among one hundred sixty-nine (169) professors in selected UAE universities, utilizing mixed quantitative and qualitative research approaches. The quantitative aspect primarily employed the descriptive correlation method, which used quantifiable data gathered through survey instruments on psychological well-being, academic optimism, and university tasks. The qualitative analysis used a focused group discussion among nineteen (19) key informants. Six (6) areas of psychological well-being: autonomy, environmental mastery, personal growth, positive relations, purpose in life, and self-acceptance were measured through Ryff's Scales of Psychological Well-Being. The academic optimism scale measured three (3) subscales: efficacy, trust, and academic emphasis. University tasks were categorized into three (3) major areas: student-centered work, professional development work, and community-centered work. The moderator variables considered were age, gender, length of teaching experience, length of experience in the UAE, and area of specialization. The results showed that the participants tend towards high scores in the subscales of autonomy, self-acceptance, and purpose in life. The academic optimism scale showed prominently high scores in efficacy and trust. Among the university tasks, student-centered work was the most fulfilled. In the focused-group discussion, most expat university professors lamented the lack of time, management support, and funding to pursue professional development, particularly research and publication. The regression analysis showed a significant correlation between psychological well-being and academic optimism. Both psychological well-being and academic optimism predict fulfillment of university tasks.

  6. Quantitative Modelling of Trace Elements in Hard Coal.

    Science.gov (United States)

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades, and coal is expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method in modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal and coal ash components. The study focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The studied data included 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three models. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration in modeling the content of trace elements in coal, and in this way contributes to the development of useful tools for coal quality assessment.

  7. Quantitative Assessment of Theses at Mazandaran University of Medical Sciences Years-(1995-2014).

    Science.gov (United States)

    Balaghafari, Azita; Siamian, Hasan; Kharamin, Farideh; Rashida, Seyyedeh Shahrbanoo; Ghahrani, Nassim

    2016-07-16

    Review and evaluation of research is essential for taking correct steps toward real progress, and is a feature of a healthy and dynamic system. Considering the importance of scientific theses in production and development, and the lack of structured information and of qualitative and quantitative assessment at Mazandaran University of Medical Sciences, we decided to carry out a quantitative study of the theses prepared in 1995-2014. This study was a descriptive survey of a sample of 325 graduate and PhD theses and dissertations in clinical and basic sciences at the university, drawn from a population of 2060 theses from 1994 to the end of 2014. Stratified sampling was used. The descriptive study was conducted in terms of matching the degrees of thesis students, thesis subjects, and the specialties of supervisors and advisers. The data-gathering tool was a checklist of information (gender, discipline, degree and department of students, school, year, title of theses and dissertations, specialties and departments of supervisors and advisers, type of research, grade obtained by students). Statistical analysis of the data was performed using SPSS 21 software. We studied 325 theses: 303 dissertations with 1 researcher, 21 with 2 researchers and 1 with 3 researchers. A total of 348 students (174 females and 174 males) were thesis researchers. The number of students was 82 (23.5%) in the Department of Basic Science and 266 (76.5%) in the clinical group; 29 (8.33%) at the master's level, 260 (74.71%) general practitioners, 58 (16.67%) at the specialty level and 1 at the PhD level. There was no relationship between research type and level of education (p = 0.081). However, it was found that the majority of the theses, for the general practitioner degree (59.8%), were type 1 (status/condition studies). By matching

  8. A study on the quantitative model of human response time using the amount and the similarity of information

    International Nuclear Information System (INIS)

    Lee, Sung Jin

    2006-02-01

    The mental capacity to retain or recall information, or memory, is related to human performance during the processing of information. Although a large number of studies have been carried out on human performance, little is known about the similarity effect. The purpose of this study was to propose and validate a quantitative, predictive model of human response time in the user interface, built on the basic concepts of information amount, similarity and degree of practice. Human performance is difficult to explain by similarity or information amount alone, and two difficulties had to be addressed: constructing a quantitative model of human response time, and validating the proposed model experimentally. A quantitative model based on Hick's law, the law of practice and similarity theory was developed. The model was validated under various experimental conditions by measuring participants' response times in the environment of a computer-based display. Human performance in the user interface improved with degree of similarity and practice. We also found an age-related effect: performance degraded with increasing participant age. The proposed model may be useful for training operators who will handle such interfaces and for predicting how human performance changes with system design.
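
    The abstract names three ingredients: Hick's law, the law of practice, and a similarity term. A minimal sketch of how such a response-time model might be composed (the functional form and every coefficient below are illustrative assumptions, not the fitted model from the thesis):

```python
import math

def predicted_rt(n_alternatives, trials, similarity,
                 a=0.2, b=0.15, c=0.3, alpha=0.4):
    """Response time (s) combining Hick's law, the power law of
    practice, and a similarity penalty. All coefficients are
    illustrative placeholders, not values from the study."""
    # Hick's law: RT grows with the information content log2(n + 1)
    hick = b * math.log2(n_alternatives + 1)
    # Power law of practice: performance improves with trial number
    practice = trials ** (-alpha)
    # Higher similarity between items slows discrimination
    similarity_penalty = c * similarity
    return (a + hick + similarity_penalty) * practice
```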

  9. A Cytomorphic Chip for Quantitative Modeling of Fundamental Bio-Molecular Circuits.

    Science.gov (United States)

    2015-08-01

    We describe a 0.35 μm BiCMOS silicon chip that quantitatively models fundamental molecular circuits via efficient log-domain cytomorphic transistor equivalents. These circuits include those for biochemical binding with automatic representation of non-modular and loading behavior, e.g., in cascade and fan-out topologies; for representing variable Hill-coefficient operation and cooperative binding; for representing inducer, transcription-factor, and DNA binding; for probabilistic gene transcription with analogic representations of log-linear and saturating operation; for gain, degradation, and dynamics of mRNA and protein variables in transcription and translation; and, for faithfully representing biological noise via tunable stochastic transistor circuits. The use of on-chip DACs and ADCs enables multiple chips to interact via incoming and outgoing molecular digital data packets and thus create scalable biochemical reaction networks. The use of off-chip digital processors and on-chip digital memory enables programmable connectivity and parameter storage. We show that published static and dynamic MATLAB models of synthetic biological circuits including repressilators, feed-forward loops, and feedback oscillators are in excellent quantitative agreement with those from transistor circuits on the chip. Computationally intensive stochastic Gillespie simulations of molecular production are also rapidly reproduced by the chip and can be reliably tuned over the range of signal-to-noise ratios observed in biological cells.

  10. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  11. Private Venture Capital’s Investment on University Spin-Offs: A Case Study of Tsinghua University Based on Triple Helix Model

    DEFF Research Database (Denmark)

    Gao, Yuchen; Hu, Yimei; Wang, Jingyi

    2015-01-01

    and transition economies where governments are transforming their roles. Thus the main purpose of this study is to investigate how private venture capital's willingness to invest in university spin-offs is influenced by universities and governments in the Chinese context, based on the triple helix model....... Through an in-depth case study of the interactions of the triple helix actors in Tsinghua University's spin-offs, it is found that a government and university that develop an environment of marketization exert a positive influence on the investment willingness of private venture capital. Whilst financial direct...

  12. Quantitative structure-activity relationship (QSAR) for insecticides: development of predictive in vivo insecticide activity models.

    Science.gov (United States)

    Naik, P K; Singh, T; Singh, H

    2009-07-01

    Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors including topological, spatial, thermodynamic, information content, lead likeness and E-state indices were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests as well as the use of a genetic algorithm allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability with r(2) values of 0.949 and 0.838 as well as [image omitted] values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data with r(2) of 0.871 and 0.788 for both the organophosphate and carbamate groups, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully from external validation criteria. QSAR models developed in this study should help further design of novel potent insecticides.
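
    The predictive quality reported above rests on the coefficient of determination between experimental and predicted values. A minimal computation of that statistic (the sample values are hypothetical, not data from the paper):

```python
import numpy as np

def r_squared(y_obs, y_pred):
    """Coefficient of determination between experimental and
    predicted values (e.g., log LD50)."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_obs - y_pred) ** 2)        # residual sum of squares
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical test-set values, not data from the paper
obs  = [2.1, 2.8, 3.3, 3.9, 4.5]
pred = [2.0, 2.9, 3.1, 4.0, 4.4]
print(round(r_squared(obs, pred), 3))  # → 0.977
```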

  13. Quantitative structure activity relationship model for predicting the depletion percentage of skin allergic chemical substances of glutathione

    International Nuclear Information System (INIS)

    Si Hongzong; Wang Tao; Zhang Kejun; Duan Yunbo; Yuan Shuping; Fu Aiping; Hu Zhide

    2007-01-01

    A quantitative model was developed to predict the depletion percentage of glutathione (DPG) compounds by gene expression programming (GEP). Each kind of compound was represented by several calculated structural descriptors involving constitutional, topological, geometrical, electrostatic and quantum-chemical features of compounds. The GEP method produced a nonlinear and five-descriptor quantitative model with a mean error and a correlation coefficient of 10.52 and 0.94 for the training set, 22.80 and 0.85 for the test set, respectively. It is shown that the GEP predicted results are in good agreement with experimental ones, better than those of the heuristic method

  14. Designs that make a difference: the Cardiac Universal Bed model.

    Science.gov (United States)

    Johnson, Jackie; Brown, Katherine Kay; Neal, Kelly

    2003-01-01

    Information contained in this article includes some of the findings from a joint research project conducted by Corazon Consulting and Ohio State University Medical Center on national trends in Cardiac Universal Bed (CUB) utilization. This article outlines current findings and "best practice" standards related to the benefits of developing care delivery models to differentiate an organization with a competitive advantage in the highly dynamic marketplace of cardiovascular care. (OSUMC, a Corazon client, is incorporating the CUB into their Ross Heart Hospital slated to open this spring.)

  15. Quantitative assessment of target dependence of pion fluctuation in ...

    Indian Academy of Sciences (India)

    Journal of Physics, December 2012, pp. 1395–1405. Quantitative assessment ... The analysis reveals the erratic behaviour of the produced pions, signifying ... One of the authors (Sitaram Pal) gratefully acknowledges the financial help from the University.

  16. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.

  17. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  18. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
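
    The validation loop described above can be sketched in a few lines. Here a quadratic polynomial fit stands in for the diffusion-theory surrogate, and a residual-based interval stands in for the Gaussian process / Bayesian MARS UQ step, so everything below is a simplified assumption rather than the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Manufactured universe": the hidden truth that generates the data
def truth(x):
    return np.sin(x) + 0.1 * x

# Noisy "experiments" drawn from the manufactured reality
x_exp = np.linspace(0, 3, 8)
y_exp = truth(x_exp) + rng.normal(0, 0.05, x_exp.size)

# Imperfect model: a low-order polynomial fit (stand-in surrogate)
coeffs = np.polyfit(x_exp, y_exp, 2)
predict = np.poly1d(coeffs)

# UQ step: estimate predictive uncertainty from fit residuals
sigma = np.std(y_exp - predict(x_exp))

# Assessment: does the quantified uncertainty cover the truth
# at new "experiments" inside the manufactured universe?
x_new = np.linspace(0.2, 2.8, 20)
covered = np.abs(truth(x_new) - predict(x_new)) <= 2 * sigma
print(f"coverage: {covered.mean():.0%}")
```

Because the truth is manufactured, the coverage of the uncertainty bands can be checked exactly, which is the central point of the method.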

  19. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
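
    The selection idea can be illustrated with acceptance/rejection sampling: each virtual patient is kept with probability proportional to the target clinical density, so the accepted ensemble matches the observed statistics without per-patient weights. A toy one-parameter sketch (the parameter, its range and the target moments are all hypothetical, and this is not the authors' algorithm in detail):

```python
import numpy as np

rng = np.random.default_rng(42)

# Step 1: generate virtual patients by sampling a plausible range of
# a single illustrative physiological parameter
virtual_patients = rng.uniform(50, 150, size=20000)

# Observed clinical distribution to match (illustrative target)
target_mean, target_sd = 100.0, 10.0

# Step 2: accept each virtual patient with probability proportional
# to the target density; the accepted ensemble then reproduces the
# clinical statistics without weighting
density = np.exp(-0.5 * ((virtual_patients - target_mean) / target_sd) ** 2)
accepted = virtual_patients[rng.uniform(size=virtual_patients.size) < density]

print(round(accepted.mean(), 1), round(accepted.std(), 1))
```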

  20. Evaluation of aluminum pit corrosion in oak ridge research reactor pool by quantitative imaging and thermodynamic modeling

    International Nuclear Information System (INIS)

    Jang, Ping-Rey; Arunkumar, Rangaswami; Lindner, Jeffrey S.; Long, Zhiling; Mott, Melissa A.; Okhuysen, Walter P.; Monts, David L.; Su, Yi; Kirk, Paula G.; Ettien, John

    2007-01-01

    The Oak Ridge Research Reactor (ORRR) was operated as an isotope production and irradiation facility from March 1958 until March 1987. The U.S. Department of Energy permanently shut down and removed the fuel from the ORRR in 1987. The water level must be maintained in the ORRR pool as shielding for radioactive components still located in the pool. The U.S. Department of Energy's Office of Environmental Management (DOE EM) needs to decontaminate and demolish the ORRR as part of the Oak Ridge cleanup program. In February 2004, increased pit corrosion was noted in the pool's 6 mm (1/4'')-thick aluminum liner in the section nearest where the radioactive components are stored. If pit corrosion has significantly penetrated the aluminum liner, then DOE EM must accelerate its decontaminating and decommissioning (D and D) efforts or look for alternatives for shielding the irradiated components. The goal of Mississippi State University's Institute for Clean Energy Technology (ICET) was to provide a determination of the extent and depth of corrosion and to conduct thermodynamic modeling to determine how further corrosion can be inhibited. Results from the work will facilitate ORNL in making reliable disposition decisions. ICET's inspection approach was to quantitatively estimate the amount of corrosion by using Fourier-transform profilometry (FTP). FTP is a non-contact 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, the system is capable of determining the height (depth) distribution of the target surface, thus reproducing the profile of the target accurately. ICET has previously demonstrated that its FTP system can quantitatively estimate the volume and depth of removed and residual material to high accuracy. 
The results of our successful initial deployment of a submergible FTP system into the ORRR pool are reported here as are initial thermodynamic

  1. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  2. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? 
We hope that our guide to good practices for conducting and presenting bias analyses will encourage
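
    A classic bias-analysis step assigns values to sensitivity and specificity and back-corrects an observed 2×2 table for non-differential exposure misclassification. A minimal sketch of that calculation (the counts and bias parameters are illustrative, not drawn from the paper):

```python
def adjusted_odds_ratio(a, b, c, d, se, sp):
    """Simple quantitative bias analysis: back-correct the exposed/
    unexposed counts of a 2x2 table for non-differential exposure
    misclassification, given assumed sensitivity (se) and
    specificity (sp). a, b = exposed/unexposed cases;
    c, d = exposed/unexposed controls."""
    def correct(x_pos, x_neg):
        n = x_pos + x_neg
        # Standard back-calculation of the true exposed count
        true_pos = (x_pos - n * (1 - sp)) / (se - (1 - sp))
        return true_pos, n - true_pos
    A, B = correct(a, b)
    C, D = correct(c, d)
    return (A * D) / (B * C)  # bias-adjusted odds ratio

# Illustrative counts: observed OR vs. bias-adjusted OR
obs_or = (40 * 160) / (60 * 40)
adj_or = adjusted_odds_ratio(40, 60, 40, 160, se=0.9, sp=0.95)
```

As expected for non-differential misclassification, the adjusted odds ratio moves away from the null relative to the observed one.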

  3. A quantitative speciation model for the adsorption of organic pollutants on activated carbon.

    Science.gov (United States)

    Grivé, M; García, D; Domènech, C; Richard, L; Rojo, I; Martínez, X; Rovira, M

    2013-01-01

    Granular activated carbon (GAC) is commonly used as adsorbent in water treatment plants given its high capacity for retaining organic pollutants in aqueous phase. The current knowledge on GAC behaviour is essentially empirical, and no quantitative description of the chemical relationships between GAC surface groups and pollutants has been proposed. In this paper, we describe a quantitative model for the adsorption of atrazine onto GAC surface. The model is based on results of potentiometric titrations and three types of adsorption experiments which have been carried out in order to determine the nature and distribution of the functional groups on the GAC surface, and evaluate the adsorption characteristics of GAC towards atrazine. Potentiometric titrations have indicated the existence of at least two different families of chemical groups on the GAC surface, including phenolic- and benzoic-type surface groups. Adsorption experiments with atrazine have been satisfactorily modelled with the geochemical code PhreeqC, assuming that atrazine is sorbed onto the GAC surface in equilibrium (log Ks = 5.1 ± 0.5). Independent thermodynamic calculations suggest a possible adsorption of atrazine on a benzoic derivative. The present work opens a new approach for improving the adsorption capabilities of GAC towards organic pollutants by modifying its chemical properties.
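
    The reported equilibrium constant implies a simple mass-action estimate of the bound fraction. A sketch under the simplifying assumption that free surface sites are in excess of atrazine (the site concentration is a hypothetical value, and this is not the full PhreeqC speciation model):

```python
def sorbed_fraction(log_ks, site_conc):
    """Fraction of atrazine bound at equilibrium for the reaction
    S + atrazine <-> S-atrazine with K = 10**log_ks, assuming the
    free surface-site concentration (mol/L) is effectively constant.
    Illustrative mass-action sketch only."""
    k = 10.0 ** log_ks
    return k * site_conc / (1.0 + k * site_conc)

# With log Ks = 5.1 (from the paper) and a hypothetical 1e-4 mol/L
# of free benzoic-type surface sites:
f = sorbed_fraction(5.1, 1e-4)
```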

  4. Development of quantitative atomic modeling for tungsten transport study Using LHD plasma with tungsten pellet injection

    International Nuclear Information System (INIS)

    Murakami, I.; Sakaue, H.A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2014-10-01

    Quantitative tungsten study with reliable atomic modeling is important for successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from currentless plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) lines of W24+ to W33+ ions are very sensitive to electron temperature (Te) and useful to examine the tungsten behavior in edge plasmas. Based on the first quantitative analysis of the measured spatial profile of the W44+ ion, the tungsten concentration is determined to be n(W44+)/ne = 1.4×10⁻⁴ and the total radiation loss is estimated as ∼4 MW, roughly half the total NBI power. (author)

  5. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

    Abstract Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with these methods.
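
    At its most reduced, the cellular-automaton idea can be illustrated by a 1-D excitable-media automaton in which a stimulus propagates as a travelling wave through resting, excited, and refractory states. This is a toy analogue of the quantitative whole-heart model, with an arbitrary refractory period:

```python
R = 3  # refractory period length (illustrative choice)
# States: 0 = resting, 1 = excited, 2..R = refractory

def step(cells):
    """One synchronous update of a 1-D excitable-media automaton."""
    nxt = []
    for i, s in enumerate(cells):
        if s == 0:
            # A resting cell fires if an excited neighbour exists
            left = cells[i - 1] if i > 0 else 0
            right = cells[i + 1] if i < len(cells) - 1 else 0
            nxt.append(1 if 1 in (left, right) else 0)
        elif s < R:
            nxt.append(s + 1)  # advance through refractory states
        else:
            nxt.append(0)      # recover to resting
    return nxt

# A stimulus at the left end propagates rightward as a wave
cells = [1] + [0] * 9
for _ in range(3):
    cells = step(cells)
print(cells)  # → [0, 3, 2, 1, 0, 0, 0, 0, 0, 0]
```

The refractory tail behind the wavefront is what prevents re-excitation of just-fired tissue, the property that makes such automata useful for reentry and arrhythmia studies.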

  6. Application of a Duration Model in Programs for Prevention of University Attrition

    Directory of Open Access Journals (Sweden)

    Verónica Herrero

    2013-12-01

    Institutional practices related to the prevention of desertion of university students increasingly require validated instruments in order to anticipate such behavior. In this regard, statistical models generated from information on the students themselves, their homes and their academic performance, among other determinants, have proved to be of crucial value. This study aims to demonstrate the importance of a series of determinants explored in other studies. The main objective is to apply a dropout-prediction model to at-risk university students in order to generate early and progressively more effective results. The research demonstrates the usefulness of duration models in a sample of classroom students and their capacity to anticipate permanence/attrition behavior across time. This was done with risk estimates using the Cox model.
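
    The duration-model idea behind the study can be illustrated with the simpler Kaplan-Meier estimator of a student retention curve; the Cox model then adds covariates on top of this baseline. The cohort below is hypothetical:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the student 'survival' (retention)
    curve: times are semesters of last observation, events flag
    dropout (1) vs. still enrolled / censored (0). Illustrative
    sketch, not the Cox regression used in the paper."""
    s, curve = 1.0, []
    for t in sorted(set(times)):
        at_risk = sum(1 for ti in times if ti >= t)
        dropouts = sum(1 for ti, ei in zip(times, events)
                       if ti == t and ei == 1)
        s *= 1 - dropouts / at_risk  # survival drops at each event time
        curve.append((t, s))
    return curve

# Hypothetical cohort: semester of last observation, dropout flag
times  = [1, 2, 2, 3, 4, 4, 5, 5]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```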

  7. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  8. A University/Community Collaborative Model on Empowerment in Elementary Education.

    Science.gov (United States)

    Goeke, John C.; And Others

    1995-01-01

    Collaboration is growing among schools and community services for youth, their families, and now, university graduate programs. Proposes a structural model for collaboration which implements the concept of empowerment and designs sustainable working relationships over time. (DR)

  9. Universality and clustering in 1 + 1 dimensional superstring-bit models

    International Nuclear Information System (INIS)

    Bergman, O.; Thorn, C.B.

    1996-01-01

    We construct a 1+1 dimensional superstring-bit model for the D=3 Type IIB superstring. This low-dimension model escapes the problems encountered in higher-dimension models: (1) it possesses full Galilean supersymmetry; (2) for noninteracting polymers of bits, the exactly soluble linear superpotential describing bit interactions lies in a large universality class of superpotentials which includes ones bounded at spatial infinity; (3) the latter are used to construct a superstring-bit model with the clustering properties needed to define an S-matrix for closed polymers of superstring-bits.

  10. A robust quantitative near infrared modeling approach for blend monitoring.

    Science.gov (United States)

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar model performance. The small-scale method strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  11. FROM THE ORIENTATION OF MARKETING TO BUSINESS MODEL – A MORE ENTREPRENEURIAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    MIHAELA DIACONU

    2014-10-01

    Full Text Available In the current higher education market, characterized by intense competition and government underfunding, the university must find an approach by which to be competitive and sustainable. It is imperative for the university to identify a business model that can facilitate the implementation of an appropriate strategy, one that assures value both for external customers (students, employers, society) and for its own employees. The university should identify successful business models that allow it to adapt constantly to an increasingly dynamic environment. It is necessary to base the allocation of available resources rigorously and to properly capitalize on the results of scientific research in particular to ensure competitiveness; in other words, to become more entrepreneurial.

  12. Spacetime emergence of the Robertson-Walker universe from a matrix model.

    Science.gov (United States)

    Erdmenger, Johanna; Meyer, René; Park, Jeong-Hyuck

    2007-06-29

    Using a novel, string theory-inspired formalism based on a Hamiltonian constraint, we obtain a conformal mechanical system for the spatially flat four-dimensional Robertson-Walker Universe. Depending on parameter choices, this system describes either a relativistic particle in the Robertson-Walker background or metric fluctuations of the Robertson-Walker geometry. Moreover, we derive a tree-level M theory matrix model in this time-dependent background. Imposing the Hamiltonian constraint forces the spacetime geometry to be fuzzy near the big bang, while the classical Robertson-Walker geometry emerges as the Universe expands. From our approach, we also derive the temperature of the Universe interpolating between the radiation and matter dominated eras.

  13. A quantitative dynamic systems model of health-related quality of life among older adults

    Science.gov (United States)

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  14. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  15. Novel mathematic models for quantitative transitivity of quality-markers in extraction process of the Buyanghuanwu decoction.

    Science.gov (United States)

    Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long

    2018-06-01

    Nowadays, to research and formulate an efficient extraction system for Chinese herbal medicine, scientists face a great challenge in quality management; accordingly, the transitivity of Q-markers in quantitative analysis of TCM was recently proposed by Prof. Liu. In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in quantitative analysis of TCM was established. Buyanghuanwu decoction (BYHWD), a commonly used TCM prescription, is applied to prevent and treat ischemic heart and brain diseases. In this paper, we selected BYHWD as the extraction experimental subject to study the quantitative transitivity of TCM. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. The kinetic equations of the extraction models were fitted, and the inherent parameters of the material pieces and the Q-marker quantitative transfer coefficients were calculated; these served as indexes to evaluate the transitivity of Q-markers in quantitative analysis of the extraction process of BYHWD. HPLC was applied to screen and analyze the potential Q-markers in the extraction process. Kinetic parameters were fitted and calculated with the Statistical Program for Social Sciences 20.0 software. The transfer efficiency was described and evaluated via the potential Q-marker transfer trajectory using the transitivity availability AUC, extraction ratio P, and decomposition ratio D, respectively; Q-markers were identified by AUC, P, and D. Astragaloside IV, laetrile, paeoniflorin, and ferulic acid were studied as potential Q-markers from BYHWD. The relevant technological parameters were presented by the mathematical models, which could adequately illustrate the inherent properties of the raw materials.
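The abstract describes fitting kinetic equations based on Fick's law and the Noyes-Whitney equation. As a minimal, hypothetical sketch (not the authors' actual model or data), a single-component extraction following C(t) = Cs(1 − e^(−kt)) can be simulated and its rate constant recovered from the linearized form:

```python
import math

def simulate_extraction(c_s, k, times):
    """First-order (Noyes-Whitney-type) extraction: C(t) = Cs * (1 - exp(-k t))."""
    return [c_s * (1.0 - math.exp(-k * t)) for t in times]

def fit_rate_constant(c_s, times, concs):
    """Estimate k by least squares on the linearized form
    -ln(1 - C/Cs) = k * t (regression forced through the origin)."""
    num = sum(t * (-math.log(1.0 - c / c_s)) for t, c in zip(times, concs))
    den = sum(t * t for t in times)
    return num / den

times = [5, 10, 20, 40, 60, 90]   # min (hypothetical sampling times)
true_k = 0.05                     # 1/min, hypothetical rate constant
c_s = 12.0                        # mg/mL saturation level, hypothetical
concs = simulate_extraction(c_s, true_k, times)
k_hat = fit_rate_constant(c_s, times, concs)
print(round(k_hat, 4))  # recovers 0.05 on noise-free data
```

With real HPLC measurements the same linearization would be applied to noisy data, and goodness of fit would indicate whether first-order kinetics is adequate.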

  16. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
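The sampling logic at the heart of statistical model checkers can be sketched in a few lines. The Chernoff-Hoeffding bound below fixes the number of independent simulation runs needed for a given precision and confidence; this is a generic illustration, not PVeStA's actual implementation (which adds parallel execution and temporal-logic property evaluation):

```python
import math
import random

def required_samples(eps, delta):
    """Chernoff-Hoeffding bound: with N samples the empirical estimate is
    within eps of the true probability with confidence at least 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps * eps))

def model_check(simulate_run, eps=0.01, delta=0.01, rng=random):
    """Estimate P(property holds) from independent simulation runs."""
    n = required_samples(eps, delta)
    hits = sum(1 for _ in range(n) if simulate_run(rng))
    return hits / n

# Toy probabilistic system: a run satisfies the property with probability 0.7.
rng = random.Random(42)
p_hat = model_check(lambda r: r.random() < 0.7, eps=0.01, delta=0.01, rng=rng)
print(required_samples(0.01, 0.01))  # 26492 runs
```

Parallelizing amounts to distributing the independent runs over workers, which is why the speedups reported in the paper can be close to linear.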

  17. Interacting polytropic gas model of phantom dark energy in non-flat universe

    International Nuclear Information System (INIS)

    Karami, K.; Ghaffari, S.; Fehri, J.

    2009-01-01

    By introducing the polytropic gas model of interacting dark energy, we obtain the equation of state for the polytropic gas energy density in a non-flat universe. We show that for an even polytropic index, by choosing K > Ba^(3/n), one can obtain ω_Λ^eff < -1, which corresponds to a universe dominated by phantom dark energy. (orig.)
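For reference, the polytropic gas equation of state and the resulting dark-energy density, in the standard form found in the literature (B an integration constant, a the scale factor; the condition quoted in the abstract refers to these expressions):

```latex
\[
  p_\Lambda = K \rho_\Lambda^{\,1 + 1/n}, \qquad
  \rho_\Lambda = \left( B a^{3/n} - K \right)^{-n}
\]
% For even n, choosing K > B a^{3/n} keeps \rho_\Lambda positive while
% driving \omega_\Lambda^{\mathrm{eff}} < -1 (the phantom regime).
```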

  18. Interplay of universality classes in a three-dimensional Yukawa model

    International Nuclear Information System (INIS)

    Focht, E.; Jersak, J.; Paul, J.

    1996-01-01

    We investigate numerically on the lattice the interplay of universality classes of the three-dimensional Yukawa model with U(1) chiral symmetry, using the Binder method of finite size scaling. At zero Yukawa coupling the scaling related to the magnetic Wilson-Fisher fixed point is confirmed. At sufficiently strong Yukawa coupling the dominance of the chiral fixed point associated with the 3D Gross-Neveu model is observed for various values of the coupling parameters, including infinite scalar self-coupling. In both cases the Binder method works consistently in a broad range of lattice sizes. However, when the Yukawa coupling is decreased the finite size behavior gets complicated and the Binder method gives inconsistent results for different lattice sizes. This signals a crossover between the universality classes of the two fixed points. copyright 1996 The American Physical Society
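The Binder method used here relies on the fourth-order cumulant of the order parameter, whose finite-size crossing points locate the fixed point. A quick sketch of the cumulant itself, on illustrative synthetic data rather than the paper's lattice simulation:

```python
import random

def binder_cumulant(samples):
    """U = 1 - <m^4> / (3 <m^2>^2); U -> 0 for Gaussian fluctuations
    (disordered phase) and U -> 2/3 for a sharp magnetization (ordered)."""
    m2 = sum(m * m for m in samples) / len(samples)
    m4 = sum(m ** 4 for m in samples) / len(samples)
    return 1.0 - m4 / (3.0 * m2 * m2)

rng = random.Random(1)
gaussian = [rng.gauss(0.0, 1.0) for _ in range(200000)]          # disordered
ordered = [1.0 if rng.random() < 0.5 else -1.0 for _ in range(200000)]  # ordered
print(round(binder_cumulant(gaussian), 2))  # close to 0
print(round(binder_cumulant(ordered), 2))   # close to 2/3 ≈ 0.67
```

In a finite-size-scaling study one computes U for several lattice sizes as a function of the couplings; the size-independent crossing value identifies which universality class governs the transition.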

  19. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communicating climate model output is given: language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output is outlined.

  20. Quantitative model of super-Arrhenian behavior in glass forming materials

    Science.gov (United States)

    Caruthers, J. M.; Medvedev, G. A.

    2018-05-01

    The key feature of glass forming liquids is the super-Arrhenian temperature dependence of the mobility, where the mobility can increase by ten orders of magnitude or more as the temperature is decreased if crystallization does not intervene. A fundamental description of the super-Arrhenian behavior has been developed; specifically, the logarithm of the relaxation time is a linear function of B/Ūx, where Ūx is the independently determined excess molar internal energy and B is a material constant. This one-parameter mobility model quantitatively describes data for 21 glass forming materials, which are all the materials for which there are sufficient experimental data for analysis. The effect of pressure on the logarithm of the mobility is also described using the same Ūx(T,p) function, determined from the difference between the liquid and crystalline internal energies. It is also shown that B is well correlated with the heat of fusion. The prediction of the B/Ūx model is compared to the Adam-Gibbs 1/(T·S̄x) model, where the B/Ūx model is significantly better in unifying the full complement of mobility data. The implications of the B/Ūx model for the development of a fundamental description of glass are discussed.

  1. The Monash University Interactive Simple Climate Model

    Science.gov (United States)

    Dommenget, D.

    2013-12-01

    The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed science journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes OFF and ON you can deconstruct the climate and learn how the different processes interact to generate the observed climate, and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with it are.
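As a toy illustration of the energy-balance idea behind GREB (the real model resolves the globe and many more processes), a zero-dimensional energy balance with an assumed effective emissivity equilibrates near the observed mean surface temperature:

```python
# Zero-dimensional energy balance: C dT/dt = (1 - alpha) * S0/4 - eps * sigma * T^4
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALPHA = 0.30      # planetary albedo
EPS = 0.62        # assumed effective emissivity (crude greenhouse effect)

def step(T, dt=86400.0, heat_capacity=4.0e8):
    """One explicit Euler step (dt in seconds, heat capacity in J m^-2 K^-1)."""
    flux = (1.0 - ALPHA) * S0 / 4.0 - EPS * SIGMA * T ** 4
    return T + dt * flux / heat_capacity

T = 255.0  # start from roughly the no-greenhouse temperature, K
for _ in range(20000):  # ~55 years of daily steps, well past equilibration
    T = step(T)
print(round(T, 1))  # equilibrates near 287 K (~14 °C)
```

Switching a process "off", as the web interface allows, corresponds here to changing EPS or ALPHA and watching the equilibrium shift.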

  2. The IB Diploma and UK University Degree Qualifications

    Science.gov (United States)

    Frank-Gemmill, Gerda

    2013-01-01

    In recent years the International Baccalaureate (IB) Diploma has become widely accepted as a university-entry qualification in the UK, but there has been little quantitative research into the achievements of IB students at degree level. This study investigates IB students from one selective independent school who entered UK universities between…

  3. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    Science.gov (United States)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  4. A quantitative model of the cardiac ventricular cell incorporating the transverse-axial tubular system

    Czech Academy of Sciences Publication Activity Database

    Pásek, Michal; Christé, G.; Šimurda, J.

    2003-01-01

    Roč. 22, č. 3 (2003), s. 355-368 ISSN 0231-5882 R&D Projects: GA ČR GP204/02/D129 Institutional research plan: CEZ:AV0Z2076919 Keywords : cardiac cell * tubular system * quantitative modelling Subject RIV: BO - Biophysics Impact factor: 0.794, year: 2003

  5. On distinguishing different models of a class of emergent Universe ...

    Indian Academy of Sciences (India)

    Souvik Ghose

    2018-02-20

    Feb 20, 2018 ... the same class of EU in light of Union compilation data (SNIa), which consists of over a hundred data points. Keywords: dark energy; emergent Universe; observational data.

  6. New physics beyond the standard model of particle physics and parallel universes

    Energy Technology Data Exchange (ETDEWEB)

    Plaga, R. [Franzstr. 40, 53111 Bonn (Germany)]. E-mail: rainer.plaga@gmx.de

    2006-03-09

    It is shown that if-and only if-'parallel universes' exist, an electroweak vacuum that is expected to have decayed since the big bang with a high probability might exist. It would neither necessarily render our existence unlikely nor could it be observed. In this special case the observation of certain combinations of Higgs-boson and top-quark masses-for which the standard model predicts such a decay-cannot be interpreted as evidence for new physics at low energy scales. The question of whether parallel universes exist is of interest to our understanding of the standard model of particle physics.

  7. Towards a universal model of reading.

    Science.gov (United States)

    Frost, Ram

    2012-10-01

    In the last decade, reading research has seen a paradigmatic shift. A new wave of computational models of orthographic processing that offer various forms of noisy position or context-sensitive coding have revolutionized the field of visual word recognition. The influx of such models stems mainly from consistent findings, coming mostly from European languages, regarding an apparent insensitivity of skilled readers to letter order. Underlying the current revolution is the theoretical assumption that the insensitivity of readers to letter order reflects the special way in which the human brain encodes the position of letters in printed words. The present article discusses the theoretical shortcomings and misconceptions of this approach to visual word recognition. A systematic review of data obtained from a variety of languages demonstrates that letter-order insensitivity is neither a general property of the cognitive system nor a property of the brain in encoding letters. Rather, it is a variant and idiosyncratic characteristic of some languages, mostly European, reflecting a strategy of optimizing encoding resources, given the specific structure of words. Since the main goal of reading research is to develop theories that describe the fundamental and invariant phenomena of reading across orthographies, an alternative approach to model visual word recognition is offered. The dimensions of a possible universal model of reading, which outlines the common cognitive operations involved in orthographic processing in all writing systems, are discussed.

  8. Budgeting and spending habits of university students in South Africa ...

    African Journals Online (AJOL)

    The aim of this study was to investigate the budgeting and spending habits of university students at a South African university. In addition, the study examined if there is a significant gender difference in the budgeting and spending habits of university students. The study adopted a quantitative research approach with a ...

  9. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    Full Text Available This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and of the behavior of the Mobile Node (MN) is used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can be used to model not only the LTRACK algorithm but other algorithms too. Many discussions and calculations are provided to support our mathematical model and to show that it is adequate in many cases. The model is valid on various network levels, is scalable vertically in the ISO-OSI layers and also scales well with the number of network elements.
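The Markov-chain modelling mentioned in the abstract can be illustrated with a stationary-distribution computation; the three-state transition matrix below is hypothetical, not LTRACK's actual model:

```python
def stationary(P, n_iter=1000):
    """Stationary distribution of a finite Markov chain by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state MN model: states = (attached, moving, handover);
# rows are transition probabilities per time step and sum to 1.
P = [[0.90, 0.08, 0.02],
     [0.30, 0.60, 0.10],
     [0.50, 0.10, 0.40]]
pi = stationary(P)
```

The long-run fraction of time spent in each state (here dominated by the attached state) is the kind of quantity that feeds cost comparisons between handover mechanisms.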

  10. Quantitative Modeling of Membrane Transport and Anisogamy by Small Groups Within a Large-Enrollment Organismal Biology Course

    Directory of Open Access Journals (Sweden)

    Eric S. Haag

    2016-12-01

    Full Text Available Quantitative modeling is not a standard part of undergraduate biology education, yet it is routine in the physical sciences. Because of their obvious biophysical aspects, classes in anatomy and physiology offer an opportunity to introduce modeling approaches into the introductory curriculum. Here, we describe two in-class exercises for small groups working within a large-enrollment introductory course in organismal biology. Both build and derive biological insights from quantitative models, implemented using spreadsheets. One exercise models the evolution of anisogamy (i.e., small sperm and large eggs) from an initial state of isogamy. Groups of four students work on Excel spreadsheets (from one to four laptops per group). The other exercise uses an online simulator to generate data related to membrane transport of a solute, and a cloud-based spreadsheet to analyze them. We provide tips for implementing these exercises gleaned from two years of experience.
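The membrane-transport exercise lends itself to a few lines of code. The script below is a hypothetical stand-in for the online simulator and spreadsheet analysis described, integrating simple first-order uptake from a constant external bath:

```python
# Passive transport of a solute across a membrane from a large external
# bath (Cout held constant) into a cell: dCin/dt = k * (Cout - Cin).
def simulate_uptake(c_out, c_in0, k, dt, n_steps):
    """Explicit Euler integration; returns the concentration time course."""
    course = [c_in0]
    c_in = c_in0
    for _ in range(n_steps):
        c_in += dt * k * (c_out - c_in)
        course.append(c_in)
    return course

course = simulate_uptake(c_out=10.0, c_in0=0.0, k=0.2, dt=0.1, n_steps=400)
print(round(course[-1], 3))  # approaches Cout = 10.0
```

Plotting the time course (or pasting it into a spreadsheet) shows the exponential approach to equilibrium that students can then fit for the rate constant.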

  11. Competency-Based University Undergraduate Teaching Management: Proposal for a Conceptual Model

    Directory of Open Access Journals (Sweden)

    Rodolfo Schmal

    2005-12-01

    Full Text Available The human resources that societies and their organizations can count on are more and more relevant. In that sense, a major challenge faced by universities is to give students the appropriate background to become professionals with the profile the current scenario requires. This article focuses on the management of university careers. Historically, many careers have emphasized knowledge, especially abstract knowledge. Today, the trend is to address aspects that reach beyond cognition and to focus attention on effective competencies that include procedures and attitudes. Such an approach opens the opportunity of defining a holistic management of careers, reaching beyond the sheer teaching of disciplines. Concurrently, the availability of information methods and tools will contribute to the definition and implementation of a design process that can work with explicit criteria and transformations. The article proposes a conceptual model to represent the objects, and their attributes and associations, that are considered of interest for the management of university teaching under a competency focus. A second stage should implement such a model through the construction of an information system that supports the management of the corresponding careers.

  12. Universal amplitude ratios in the 3D Ising model

    International Nuclear Information System (INIS)

    Caselle, M.; Hasenbusch, M.

    1998-01-01

    We present a high precision Monte Carlo study of various universal amplitude ratios of the three dimensional Ising spin model. Using state of the art simulation techniques we studied the model close to criticality in both phases. Great care was taken to control systematic errors due to finite size effects and correction-to-scaling terms. We obtain C+/C- = 4.75(3), f+,2nd/f-,2nd = 1.95(2) and u* = 14.3(1). Our results are compatible with those obtained by field theoretic methods applied to the φ4 theory and by high and low temperature series expansions of the Ising model. (orig.)

  13. A likely universal model of fracture scaling and its consequence for crustal hydromechanics

    Science.gov (United States)

    Davy, P.; Le Goc, R.; Darcel, C.; Bour, O.; de Dreuzy, J. R.; Munier, R.

    2010-10-01

    We argue that most fracture systems are spatially organized according to two main regimes: a "dilute" regime for the smallest fractures, where they can grow independently of each other, and a "dense" regime for which the density distribution is controlled by the mechanical interactions between fractures. We derive a density distribution for the dense regime by acknowledging that, statistically, fractures do not cross a larger one. This very crude rule, which expresses the inhibiting role of large fractures against smaller ones but not the reverse, actually appears to be a very strong control on the eventual fracture density distribution since it results in a self-similar distribution whose exponents and density term are fully determined by the fractal dimension D and a dimensionless parameter γ that encompasses the details of fracture correlations and orientations. The range of values for D and γ appears to be extremely limited, which makes this model quite universal. This theory is supported by quantitative data on either fault or joint networks. The transition between the dilute and dense regimes occurs at about a few tenths of a kilometer for fault systems and a few meters for joints. This remarkable difference between both processes is likely due to a large-scale control (localization) of the fracture growth for faulting that does not exist for jointing. Finally, we discuss the consequences of this model on the flow properties and show that these networks are in a critical state, with a large number of nodes carrying a large amount of flow.
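The dense-regime density distribution described in the abstract takes a self-similar power-law form; with notation assumed from the text (l fracture length, L system size, D fractal dimension, γ the dimensionless density term) it can be written as:

```latex
\[
  n(l, L)\,\mathrm{d}l = \gamma\, L^{D}\, l^{-(D+1)}\,\mathrm{d}l,
  \qquad l \ll L
\]
```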

  14. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
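The modified Cholesky idea the abstract builds on can be sketched directly: a unit lower-triangular T collects the (negated) coefficients of regressing each response on its predecessors, so that TΣT' is diagonal. A pure-Python illustration on a hypothetical AR(1)-style covariance (not the paper's penalized estimator, which adds an L2 penalty to these coefficients):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def modified_cholesky(S):
    """Return (T, D) with T unit lower triangular and T S T' = diag(D).
    Row t of T holds the negated coefficients of regressing y_t on its
    predecessors y_0..y_{t-1}; D holds the innovation variances."""
    n = len(S)
    T = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    D = [S[0][0]]
    for t in range(1, n):
        A = [[S[i][j] for j in range(t)] for i in range(t)]
        b = [S[i][t] for i in range(t)]
        phi = solve(A, b)
        for j in range(t):
            T[t][j] = -phi[j]
        D.append(S[t][t] - sum(phi[j] * S[j][t] for j in range(t)))
    return T, D

# Hypothetical AR(1)-like covariance: correlation 0.5^|i-j| between time points
S = [[0.5 ** abs(i - j) for j in range(4)] for i in range(4)]
T, D = modified_cholesky(S)
```

For this Markov-type covariance only the lag-1 regression coefficients are nonzero, which is exactly the sparsity the regularized estimator exploits.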

  15. Generalized cardassian expansion: a model in which the universe is flat, matter dominated, and accelerating

    International Nuclear Information System (INIS)

    Freese, Katherine

    2003-01-01

    The Cardassian universe is a proposed modification to the Friedmann-Robertson-Walker (FRW) equation in which the universe is flat, matter dominated, and accelerating. In this presentation, we generalize the original Cardassian proposal to include additional variants on the FRW equation; specific examples are presented. In the ordinary FRW equation, the right hand side is a linear function of the energy density, H² ∼ ρ. Here, instead, the right hand side of the FRW equation is a different function of the energy density, H² ∼ g(ρ). This function returns to ordinary FRW at early times, but modifies the expansion at a late epoch of the universe. The only ingredients in this universe are matter and radiation: in particular, there is NO vacuum contribution. Currently the modification of the FRW equation is such that the universe accelerates; we call this period of acceleration the Cardassian era. The universe can be flat and yet consist of only matter and radiation, and still be compatible with observations. The energy density required to close the universe is much smaller than in a standard cosmology, so that matter can be sufficient to provide a flat geometry. The new term required may arise, e.g., as a consequence of our observable universe living as a 3-dimensional brane in a higher dimensional universe. The Cardassian model survives several observational tests, including the cosmic background radiation, the age of the universe, and structure formation. As will be shown in future work, the predictions for observational tests of the generalized Cardassian models can be very different from those of generic quintessence models, whether the equation of state is constant or time dependent.
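For concreteness, the original Cardassian form (from the earlier Freese and Lewis proposal that this work generalizes) is one example of such a g(ρ); the new term dominates at late times and, for n < 2/3, drives acceleration:

```latex
\[
  H^2 = \frac{8\pi G}{3}\,\rho + B\,\rho^{\,n},
  \qquad n < \tfrac{2}{3}
\]
```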

  16. Quantitative structure-activity relationship modeling of the toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio

    NARCIS (Netherlands)

    Zvinavashe, E.; Du, T.; Griff, T.; Berg, van den J.H.J.; Soffers, A.E.M.F.; Vervoort, J.J.M.; Murk, A.J.; Rietjens, I.

    2009-01-01

    Within the REACH regulatory framework in the EU, quantitative structure-activity relationship (QSAR) models are expected to help reduce the number of animals used for experimental testing. The objective of this study was to develop QSAR models to describe the acute toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio.

  17. Physiological role of Kv1.3 channel in T lymphocyte cell investigated quantitatively by kinetic modeling.

    Directory of Open Access Journals (Sweden)

    Panpan Hou

    Full Text Available The Kv1.3 channel is a delayed rectifier channel abundant in human T lymphocytes. Chronic inflammatory and autoimmune disorders lead to the over-expression of Kv1.3 in T cells. To quantitatively study the regulatory mechanism and physiological function of Kv1.3 in T cells, it is necessary to have a precise kinetic model of Kv1.3. In this study, we first established a kinetic model capable of precisely replicating all the kinetic features of Kv1.3 channels, and then constructed a T-cell model composed of ion channels, including the Ca2+-release activated calcium (CRAC) channel, intermediate-conductance K+ (IK) channel, TASK channel and Kv1.3 channel, for quantitatively simulating the changes in membrane potentials and local Ca2+ signaling messengers during activation of T cells. Based on the experimental data from current-clamp recordings, we successfully demonstrated that Kv1.3 dominates the membrane potential of T cells to manipulate the Ca2+ influx via the CRAC channel. Our results revealed that deficient expression of the Kv1.3 channel would weaken the Ca2+ signal, leading to lower efficiency in secretion. This was the first successful attempt to simulate membrane potential in non-excitable cells, laying a solid basis for quantitatively studying the regulatory mechanism and physiological role of channels in non-excitable cells.
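The qualitative mechanism described above can be illustrated with a much simpler model than the paper's full kinetic scheme. The sketch below assumes a bare two-conductance membrane equation; the function name, conductances, and reversal potentials are hypothetical, illustrative values:

```python
# Minimal sketch (not the paper's kinetic model): a passive membrane with
# a Kv1.3-like K+ conductance and a leak. The steady-state membrane
# potential is the conductance-weighted average of the reversal
# potentials, so reducing g_kv13 depolarizes the cell and shrinks the
# driving force for Ca2+ entry through CRAC channels.
def resting_potential(g_kv13, g_leak=0.2, e_k=-80.0, e_leak=-20.0):
    """Steady state of C dV/dt = -g_kv13*(V - E_K) - g_leak*(V - E_leak)."""
    return (g_kv13 * e_k + g_leak * e_leak) / (g_kv13 + g_leak)

v_normal = resting_potential(g_kv13=1.0)     # healthy Kv1.3 expression
v_deficient = resting_potential(g_kv13=0.1)  # reduced Kv1.3 expression

# Loss of Kv1.3 depolarizes the membrane (less negative potential).
assert v_normal < v_deficient
```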

  18. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    Science.gov (United States)

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
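The LOD score described above can be made concrete with a toy computation. This sketch uses illustrative parameters, not the authors' fitting procedure (which would estimate the mixture by maximum likelihood); it evaluates the log10 likelihood ratio of a two-component normal mixture against a single normal:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def lod_score(phenotypes, mix, single):
    """LOD = log10 likelihood ratio of a two-component normal mixture
    (putative QTL) versus a single normal (no QTL).
    `mix` is (w, mu1, mu2, sigma); `single` is (mu, sigma)."""
    w, mu1, mu2, s = mix
    mu0, s0 = single
    ll_mix = sum(math.log(w * normal_pdf(y, mu1, s) + (1 - w) * normal_pdf(y, mu2, s))
                 for y in phenotypes)
    ll_one = sum(math.log(normal_pdf(y, mu0, s0)) for y in phenotypes)
    return (ll_mix - ll_one) / math.log(10)

# A bimodal sample: the mixture fits better, so the LOD is positive --
# which, as the abstract notes, can happen even without a real QTL.
ys = [-1.1, -0.9, -1.0, 0.9, 1.1, 1.0]
assert lod_score(ys, mix=(0.5, -1.0, 1.0, 0.3), single=(0.0, 1.0)) > 0
```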

  19. Scalar perturbations in the late Universe: viability of the Chaplygin gas models

    Energy Technology Data Exchange (ETDEWEB)

    Bouhmadi-López, Mariam [Departamento de Física, Universidade da Beira Interior, 6200 Covilhã (Portugal); Brilenkov, Maxim; Brilenkov, Ruslan [Department of Theoretical Physics, Odessa National University, Dvoryanskaya st. 2, Odessa 65082 (Ukraine); Morais, João [Department of Theoretical Physics, University of the Basque Country UPV/EHU, P.O. Box 644, 48080 Bilbao (Spain); Zhuk, Alexander, E-mail: mbl@ubi.pt, E-mail: maxim.brilenkov@gmail.com, E-mail: ruslan.brilenkov@gmail.com, E-mail: jviegas001@ikasle.ehu.eus, E-mail: ai.zhuk2@gmail.com [Astronomical Observatory, Odessa National University, Dvoryanskaya st. 2, Odessa 65082 (Ukraine)

    2015-12-01

    We study the late-time evolution of the Universe where dark energy (DE) is parametrised by a modified generalised Chaplygin gas (mGCG) on top of cold dark matter (CDM). We also take into account the radiation content of the Universe. In this context, the late stage of the evolution of the Universe refers to the epoch where CDM is already clustered into inhomogeneously distributed discrete structures (galaxies, groups and clusters of galaxies). Under these conditions, the mechanical approach is an adequate tool to study the Universe deep inside the cell of uniformity. To be more precise, we study scalar perturbations of the Friedmann-Lemaître-Robertson-Walker metric due to inhomogeneities of CDM as well as fluctuations of radiation and mGCG, the latter driving the late-time acceleration of the Universe. Our analysis applies as well to the case where the mGCG plays the role of both DM and DE. We select the sets of parameters of the mGCG that are compatible with the mechanical approach. These sets define prospective mGCG models. By comparing the selected sets of models with some of the latest observational data, we conclude that the mGCG is in tight agreement with those observations, particularly for a mGCG playing the role of both DE and DM.

  20. Evaluating the impact of strategic personnel policies using a MILP model: The public university case

    International Nuclear Information System (INIS)

    Torre, R. de la; Lusa, A.; Mateo, M.

    2016-01-01

    Purpose: The main purpose of the paper is to evaluate the impact of diverse personnel policies around personnel promotion in the design of the strategic staff plan for a public university. Strategic staff planning consists in determining the size and composition of the workforce for an organization. Design/methodology/approach: The staff planning is solved using a Mixed Integer Linear Programming (MILP) model. The MILP model represents the organizational structure of the university, the personnel categories and capacity decisions, the demand requirements, the required service level and budget restrictions. All these aspects are translated into a set of data, as well as the parameters and constraints building up the mathematical model for optimization. The required data for the model is adopted from a Spanish public university. Findings: The development of appropriate policies for personnel promotion can effectively reduce the number of dismissals while proposing a transition towards different preferable workforce structures in the university. Research limitations/implications: The long-term staff plan for the university is solved by the MILP model considering a time horizon of 8 years. For this time horizon, the required input data is derived from current data of the university. Different scenarios are proposed considering different temporal trends for input data, such as in demand and admissible promotion ratios for workers. Originality/value: The literature review reports a lack of formalized procedures for staff planning in universities that take into account, at the same time, the regulations on hiring, dismissals, promotions and workforce heterogeneity, all considered to optimize workforce size and composition addressing not only economic criteria but also the required workforce expertise and the quality of the service offered.
This paper adopts a formalized procedure developed by the authors in previous works, and exploits it to assess the
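The kind of staffing trade-off such a MILP addresses can be illustrated in miniature. The sketch below is not the authors' model: it replaces the MILP solver with brute-force enumeration over a two-category toy workforce (all names and numbers are invented) to show the cost-minimization-under-demand-and-budget structure:

```python
from itertools import product

# Toy staffing decision (illustrative numbers, not the paper's model):
# choose how many workers to hold in two categories so that total
# capacity meets demand at minimum cost, under a budget cap. A real
# strategic staff plan would pose this as a MILP over many periods and
# personnel categories and hand it to a solver.
COST = {"junior": 40, "senior": 70}        # annual cost per worker
CAPACITY = {"junior": 1.0, "senior": 1.6}  # service capacity per worker
DEMAND, BUDGET = 10.0, 600

best = None
for junior, senior in product(range(11), range(11)):
    cost = junior * COST["junior"] + senior * COST["senior"]
    cap = junior * CAPACITY["junior"] + senior * CAPACITY["senior"]
    if cap >= DEMAND and cost <= BUDGET and (best is None or cost < best[0]):
        best = (cost, junior, senior)

cost, junior, senior = best
assert junior * CAPACITY["junior"] + senior * CAPACITY["senior"] >= DEMAND
assert cost <= BUDGET
```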

  2. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    Full Text Available Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years, a result of the publicly funded mass media campaigns introduced in the early 1980s, mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
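The Markov-model approach described above can be sketched as follows. This toy chain uses invented states and transition probabilities (not the Australian data or the paper's model), purely to show how a screening parameter propagates to long-run mortality:

```python
# Illustrative discrete-time Markov chain: each year a person stays
# healthy, develops early- or late-stage disease, or dies; surveillance
# is modelled as a higher probability of catching disease early.
STATES = ["healthy", "early", "late", "dead"]

def step(dist, p_detect):
    """One annual transition; p_detect is the early-detection level in [0, 1]."""
    trans = {
        "healthy": [0.995, 0.004, 0.001, 0.0],
        "early":   [0.0, 0.90 + 0.05 * p_detect, 0.10 - 0.05 * p_detect, 0.0],
        "late":    [0.0, 0.0, 0.85, 0.15],
        "dead":    [0.0, 0.0, 0.0, 1.0],
    }
    out = [0.0] * 4
    for weight, row in zip(dist, trans.values()):
        for i, p in enumerate(row):
            out[i] += weight * p
    return out

def mortality(p_detect, years=30):
    dist = [1.0, 0.0, 0.0, 0.0]  # everyone starts healthy
    for _ in range(years):
        dist = step(dist, p_detect)
    return dist[3]

# Higher surveillance (early detection) lowers 30-year mortality.
assert mortality(1.0) < mortality(0.0)
```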

  3. An overview of quantitative approaches in Gestalt perception.

    Science.gov (United States)

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception, and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state of the art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research, and there is a clear trend to apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Dr. Auzoux's botanical teaching models and medical education at the universities of Glasgow and Aberdeen.

    Science.gov (United States)

    Olszewski, Margaret Maria

    2011-09-01

    In the 1860s, Dr. Louis Thomas Jérôme Auzoux introduced a set of papier-mâché teaching models intended for use in the botanical classroom. These botanical models quickly made their way into the educational curricula of institutions around the world. Within these institutions, Auzoux's models were principally used to fulfil educational goals, but their incorporation into diverse curricula also suggests they were used to implement agendas beyond botanical instruction. This essay examines the various uses and meanings of Dr. Auzoux's botanical teaching models at the universities of Glasgow and Aberdeen in the nineteenth century. The two main conclusions of this analysis are: (1) investing in prestigious scientific collections was a way for these universities to attract fee-paying students so that better medical accommodation could be provided and (2) models were used to transmit different kinds of botanical knowledge at both universities. The style of botany at the University of Glasgow was offensive and the department there actively embraced and incorporated ideas of the emerging new botany. At Aberdeen, the style of botany was defensive and there was some hesitancy when confronting new botanical ideas. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Quantitative Decision Making Model for Carbon Reduction in Road Construction Projects Using Green Technologies

    Directory of Open Access Journals (Sweden)

    Woosik Jang

    2015-08-01

    Full Text Available Numerous countries have established policies for reducing greenhouse gas emissions and have suggested goals pertaining to these reductions. To reach the target reduction amounts, studies on the reduction of carbon emissions have been conducted with regard to all stages and processes in construction projects. According to a study on carbon emissions, the carbon emissions generated during the construction stage of road projects account for approximately 76 to 86% of the total carbon emissions, far exceeding those of the other stages, such as maintenance or demolition. Therefore, this study aims to develop a quantitative decision-making model that supports the application of green technologies (GTs) to reduce carbon emissions during the construction stage of road construction projects. First, the authors selected environmental soundness, economic feasibility and constructability as the key assessment indices for evaluating 20 GTs. Second, a fuzzy set/qualitative comparative analysis (FS/QCA) was used to establish an objective decision-making model for the assessment of both the quantitative and qualitative characteristics of the key indices. To support the developed model, an expert survey was performed to assess the applicability of each GT from a practical perspective, which was verified with a case study using two additional GTs. The proposed model is expected to support practitioners in the application of suitable GTs to road projects and reduce carbon emissions, resulting in better decision making during road construction projects.
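The fuzzy-set side of FS/QCA can be illustrated with a minimal scoring example. The sketch below is not the paper's calibration: the technology names, membership values, and 0.5 cutoff are invented, and the fuzzy AND is taken as the usual minimum operator:

```python
# Each green technology gets a fuzzy membership score in [0, 1] on the
# three key indices named in the abstract; the fuzzy AND (minimum)
# gives its membership in the outcome "suitable for deployment".
def fuzzy_and(*memberships):
    return min(memberships)

technologies = {
    # name: (environmental soundness, economic feasibility, constructability)
    "warm-mix asphalt": (0.9, 0.7, 0.8),
    "hybrid excavator": (0.8, 0.3, 0.9),
}

scores = {name: fuzzy_and(*m) for name, m in technologies.items()}
suitable = [name for name, s in scores.items() if s >= 0.5]

assert scores["warm-mix asphalt"] == 0.7  # limited by economic feasibility
assert suitable == ["warm-mix asphalt"]
```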

  6. Quantitative Assessment of Theses at Mazandaran University of Medical Sciences Years–(1995-2014)

    Science.gov (United States)

    Balaghafari, Azita; Siamian, Hasan; Kharamin, Farideh; Rashida, Seyyedeh Shahrbanoo; Ghahrani, Nassim

    2016-01-01

    Background: Review and evaluation of research is essential for taking correct steps towards real progress, and is a feature of a healthy and dynamic system. Considering the importance of scientific theses in knowledge production and development, and the lack of structured information and qualitative and quantitative assessment at Mazandaran University of Medical Sciences, we decided to carry out a quantitative study of the theses prepared there between 1995 and 2014. Methods: This study was a descriptive survey of a sample of 325 graduate and PhD theses and dissertations in the clinical and basic sciences, drawn by stratified sampling from a population of 2060 theses completed at the university from 1994 to the end of 2014. The descriptive study was conducted in terms of the match between students' degrees, thesis subjects, and the specialties of supervisors and advisers. The data-gathering tool was a checklist of information (gender, discipline, degree and education department of students, school, year of defense, title of theses and dissertations, specialty and departments of supervisors and advisers, type of research, grade obtained by students). Statistical analysis of the data was performed using SPSS 21 software. Results: Of the 325 theses studied, 303 had one researcher, 21 had two researchers, and 1 had three researchers, giving a total of 348 student researchers (174 females and 174 males). The number of students in the basic science department was 82 (23.5%) and in the clinical group 266 (76.5%); 29 (8.33%) were at the master's level, 260 (74.71%) were general practitioners, 58 (16.67%) were specialty students, and 1 (0.29%) was at the PhD level. There was no relationship between type of research and level of education (p = 0.081).
However, it was found that the majority of the theses for the general practitioner degree (59.8%) were type 1

  7. A Proposed Model for Measuring Performance of the University-Industry Collaboration in Open Innovation

    Directory of Open Access Journals (Sweden)

    Anca Draghici

    2017-06-01

    Full Text Available The paper aims to present a scientific approach to the creation, testing and validation of a model for performance measurement for university-industry collaboration (UIC). The main idea of the design process is to capitalize on existing success factors, facilitators and opportunities (motivation factors, knowledge transfer channels and identified benefits) and to diminish or avoid potential threats and barriers that might interfere with such collaborations. The main purpose of the applied methodology is to identify solutions and measures to overcome the disadvantages, conflicts or risk issues and to facilitate the open innovation of industrial companies and universities. The methodology adopted was differentiated by two perspectives: (1) a business model reflecting the university perspective, along with an inventory of key performance indicators (KPIs); (2) a performance measurement model (including performance criteria and indicators) and an associated methodology (assimilated to an audit) that could help companies increase collaboration with universities in the context of open innovation. In addition, in order to operationalize the proposed model (facilitating practical implementation), an Excel tool has been created to help identify potential sources of innovation. The main contributions of the research concern the expansion of UIC knowledge to enhance open innovation and the definition of an effective performance measurement model and instrument (tested and validated by a case study) for companies.

  8. Universal core model for multiple-gate field-effect transistors with short channel and quantum mechanical effects

    Science.gov (United States)

    Shin, Yong Hyeon; Bae, Min Soo; Park, Chuntaek; Park, Joung Won; Park, Hyunwoo; Lee, Yong Ju; Yun, Ilgu

    2018-06-01

    A universal core model for multiple-gate (MG) field-effect transistors (FETs) with short channel effects (SCEs) and quantum mechanical effects (QMEs) is proposed. By using a Young's-approximation-based solution of the one-dimensional Poisson equation, the total inversion charge density (Q_inv) in the channel is modeled for double-gate (DG) and surrounding-gate (SG) FETs, following which a universal charge model, including quadruple-gate (QG) FETs, is derived based on the similarity of the solutions. For triple-gate (TG) FETs, the average of the DG and QG solutions is used. A model for SCEs is also proposed, considering the potential difference between the channel's surface and center. Finally, a model for QMEs in MG FETs is developed using the quantum-correction compact model. The proposed universal core model is validated against commercially available three-dimensional ATLAS numerical simulations.

  9. Toward a Miami University Model for Internet-Intensive Higher Education.

    Science.gov (United States)

    Wolfe, Christopher R.; Crider, Linda; Mayer, Larry; McBride, Mark; Sherman, Richard; Vogel, Robert

    1998-01-01

    Describes principles underlying an emerging model for Internet-intensive undergraduate instruction at Miami University (Ohio) in which students learn by creating online materials themselves; faculty facilitate active learning; student intellectual exchanges are enriched; and the seminar sensibility is extended. Four applications are examined: a…

  10. Quantitative genetic models of sexual selection by male choice.

    Science.gov (United States)

    Nakahashi, Wataru

    2008-09-01

    There are many examples of male mate choice for female traits that tend to be associated with high fertility. I develop quantitative genetic models of a female trait and a male preference to show when such a male preference can evolve. I find that a disagreement between the fertility maximum and the viability maximum of the female trait is necessary for directional male preference (preference for extreme female trait values) to evolve. Moreover, when there is a shortage of available male partners or variance in male nongenetic quality, strong male preference can evolve. Furthermore, I also show that males evolve to exhibit a stronger preference for females that are more feminine (less resemblance to males) than the average female when there is a sexual dimorphism caused by fertility selection which acts only on females.

  11. Universality in stochastic exponential growth.

    Science.gov (United States)

    Iyer-Biswas, Srividya; Crooks, Gavin E; Scherer, Norbert F; Dinner, Aaron R

    2014-07-11

    Recent imaging data for single bacterial cells reveal that their mean sizes grow exponentially in time and that their size distributions collapse to a single curve when rescaled by their means. An analogous result holds for the division-time distributions. A model is needed to delineate the minimal requirements for these scaling behaviors. We formulate a microscopic theory of stochastic exponential growth as a Master Equation that accounts for these observations, in contrast to existing quantitative models of stochastic exponential growth (e.g., the Black-Scholes equation or geometric Brownian motion). Our model, the stochastic Hinshelwood cycle (SHC), is an autocatalytic reaction cycle in which each molecular species catalyzes the production of the next. By finding exact analytical solutions to the SHC and the corresponding first passage time problem, we uncover universal signatures of fluctuations in exponential growth and division. The model makes minimal assumptions, and we describe how more complex reaction networks can reduce to such a cycle. We thus expect similar scalings to be discovered in stochastic processes resulting in exponential growth that appear in diverse contexts such as cosmology, finance, technology, and population growth.
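The autocatalytic-cycle idea can be sketched with a standard Gillespie (stochastic simulation) algorithm. The snippet below is a two-species caricature of the SHC, not the paper's exact model; the rate constants and times are illustrative:

```python
import random

# Minimal Gillespie simulation of a two-species autocatalytic cycle in
# the spirit of the stochastic Hinshelwood cycle: X catalyzes production
# of Y (X -> X + Y) and Y catalyzes production of X (Y -> Y + X), so the
# mean copy numbers grow roughly exponentially in time.
def simulate(k1=1.0, k2=1.0, t_end=5.0, seed=0):
    rng = random.Random(seed)
    x, y, t = 1, 1, 0.0
    while t < t_end:
        a1, a2 = k1 * x, k2 * y        # reaction propensities
        total = a1 + a2
        t += rng.expovariate(total)    # waiting time to the next reaction
        if rng.random() < a1 / total:
            y += 1                     # X -> X + Y fired
        else:
            x += 1                     # Y -> Y + X fired
    return x + y

# With a fixed seed the trajectory only keeps growing, so running the
# same realization for longer always yields a larger population.
assert simulate(t_end=6.0) > simulate(t_end=3.0)
```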

  12. Violations of universality in a vectorlike extension of the standard model

    International Nuclear Information System (INIS)

    Montvay, I.

    1996-04-01

    Violations of universality of couplings in a vectorlike extension of the standard model with three heavy mirror fermion families are considered. The recently observed discrepancies between experiments and the standard model in the hadronic branching fractions R_b and R_c of the Z-boson are explained by the mixing of fermions with their mirror fermion partners. (orig.)

  13. Development of a universal dual-bolus injection scheme for the quantitative assessment of myocardial perfusion cardiovascular magnetic resonance

    Directory of Open Access Journals (Sweden)

    Alfakih Khaled

    2011-05-01

    Full Text Available Abstract Background The dual-bolus protocol enables accurate quantification of myocardial blood flow (MBF) by first-pass perfusion cardiovascular magnetic resonance (CMR). However, despite the advantages of and increasing demand for the dual-bolus method for accurate quantification of MBF, thus far it has not been widely used in the field of quantitative perfusion CMR. The main reasons for this are that the setup for the dual-bolus method is complex, requiring a state-of-the-art injector, and that there is a lack of post-processing software. As a solution to one of these problems, we have devised a universal dual-bolus injection scheme for use in a clinical setting. The purpose of this study is to show the setup and feasibility of the universal dual-bolus injection scheme. Methods The universal dual-bolus injection scheme was tested using multiple combinations of different contrast agents, contrast agent doses, power injectors, perfusion sequences, and CMR scanners. This included 3 different contrast agents (Gd-DO3A-butrol, Gd-DTPA and Gd-DOTA), 4 different doses (0.025 mmol/kg, 0.05 mmol/kg, 0.075 mmol/kg and 0.1 mmol/kg), 2 different types of injectors (with and without a "pause" function), 5 different sequences (turbo field echo (TFE), balanced TFE, k-space and time (k-t) accelerated TFE, k-t accelerated balanced TFE, and turbo fast low-angle shot), and 3 different CMR scanners from 2 different manufacturers. The relation between the time width of the dilute contrast agent bolus curve and cardiac output was obtained to determine the optimal predefined pause duration between the dilute and neat contrast agent injections. Results 161 dual-bolus perfusion scans were performed. Three non-injector-related technical errors were observed (1.9%). No injector-related errors were observed. The dual-bolus scheme worked well in all combinations of parameters if the optimal predefined pause was used.
Linear regression analysis showed that the optimal duration for the predefined

  14. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the work done by the OWASP Foundation on this subject, but their risk-rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that can't be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
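The Monte Carlo treatment described above can be sketched as follows. The factor structure loosely follows the OWASP risk-rating style (likelihood × impact from averaged factor scores), but the factor count, the 0-9 scale, and the round count here are illustrative assumptions:

```python
import random

# Each risk attribute is drawn from a discrete uniform distribution over
# 0..9; likelihood and impact are the means of their factors, and the
# overall risk is their product. Repeating this many times yields a risk
# distribution instead of a single guessed score.
def simulate_risk(rounds=10_000, seed=42):
    rng = random.Random(seed)
    results = []
    for _ in range(rounds):
        likelihood_factors = [rng.randint(0, 9) for _ in range(4)]
        impact_factors = [rng.randint(0, 9) for _ in range(4)]
        likelihood = sum(likelihood_factors) / 4
        impact = sum(impact_factors) / 4
        results.append(likelihood * impact)
    return results

risks = simulate_risk()
mean_risk = sum(risks) / len(risks)

assert 0 <= min(risks) and max(risks) <= 81       # scores stay on the 0-81 scale
assert 18 < mean_risk < 23                        # E[likelihood] = E[impact] = 4.5
```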

  15. Developing a discrete event simulation model for university student shuttle buses

    Science.gov (United States)

    Zulkepli, Jafri; Khalid, Ruzelan; Nawawi, Mohd Kamal Mohd; Hamid, Muhammad Hafizan

    2017-11-01

    Providing shuttle buses for university students to attend their classes is crucial, especially when their number is large and the distances between their classes and residential halls are far. These factors, in addition to the non-optimal current bus services, typically require the students to wait longer, which eventually opens a space for them to complain. To considerably reduce the waiting time, it is thus important to provide the optimal number of buses to transport students from location to location, along with effective route schedules that fulfil the students' demand at the relevant time ranges. The optimal bus number and schedules are to be determined and tested using a flexible decision platform. This paper thus models the current services of student shuttle buses in a university using a Discrete Event Simulation approach. The model can flexibly simulate any changes configured to the current system and report their effects on the performance measures. How the model was conceptualized and formulated for future system configurations is the main interest of this paper.
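The discrete event simulation idea can be sketched with a minimal event loop. The snippet below is not the authors' model: it assumes a single stop, a fixed bus headway, and invented arrival times, and measures total passenger waiting time as the performance measure:

```python
import heapq

# Events are (time, kind) pairs kept in a priority queue; a bus arrives
# every `headway` minutes and picks up everyone currently waiting.
def simulate(arrivals, headway=15.0, t_end=60.0):
    events = [(t, "passenger") for t in arrivals]
    t = headway
    while t <= t_end:
        events.append((t, "bus"))
        t += headway
    heapq.heapify(events)

    waiting, total_wait = [], 0.0
    while events:
        time, kind = heapq.heappop(events)
        if kind == "passenger":
            waiting.append(time)
        else:  # bus: board everyone currently waiting
            total_wait += sum(time - arrived for arrived in waiting)
            waiting.clear()
    return total_wait

# Halving the headway (running more buses) cannot increase total waiting
# time, since every original departure is still served.
arrivals = [2.0, 7.0, 16.0, 33.0, 48.0]
assert simulate(arrivals, headway=7.5) <= simulate(arrivals, headway=15.0)
```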

  16. The Development of an Intelligent Leadership Model for State Universities

    OpenAIRE

    Aleme Keikha; Reza Hoveida; Nour Mohammad Yaghoubi

    2017-01-01

    Higher education and intelligent leadership are considered important parts of every country's education system, which could potentially play a key role in accomplishing the goals of society. In theories of leadership, new patterns attempt to view leadership through the prism of creative and intelligent phenomena. This paper aims to design and develop an intelligent leadership model for public universities. A qualitative-quantitative research method was used to design a basic model of intellige...

  17. University Satellite Campus Management Models

    Science.gov (United States)

    Fraser, Doug; Stott, Ken

    2015-01-01

    Among the 60 or so university satellite campuses in Australia are many that are probably failing to meet the high expectations of their universities and the communities they were designed to serve. While in some cases this may be due to the demand driven system, it may also be attributable in part to the ways in which they are managed. The…

  18. Developing a Model of Tuition Fee Calculation for Universities of Medical Sciences

    Directory of Open Access Journals (Sweden)

    Seyed Amir Mohsen Ziaee

    2018-01-01

    Full Text Available Background: The aim of our study was to introduce and evaluate a practicable model for tuition fee calculation for each medical field in universities of medical sciences in Iran. Methods: Fifty experts in 11 panels were interviewed to identify variables that affect tuition fee calculation. This led to key points including total budgets, expenses of the universities, the attractiveness of different fields, the attractiveness of universities, and education quality. Tuition fees were calculated for different levels of education, such as post-diploma, bachelor, master, and Doctor of Philosophy (PhD) degrees, medical specialty, and fellowship. After tuition fee calculation, the model was tested during 2013-2015. A questionnaire including 20 questions was then prepared, and all universities' financial and educational managers were asked to respond to the questions regarding the model's reliability and effectiveness. Results: According to the results, field attractiveness, university attractiveness, zone distinction and education quality were selected as effective variables for tuition fee calculation. In this model, tuition fees per student were calculated for the year 2013, and, therefore, the inflation rate of the same year was used. Testing of the model showed a satisfaction level of 92%. This model is used by medical science universities in Iran. Conclusion: Education quality, zone coefficient, field attractiveness, university attractiveness, inflation rate, and the portion of each level of education were the most important variables affecting tuition fee calculation. Keywords: TUITION FEES, FIELD'S ATTRACTIVENESS, UNIVERSITIES' ATTRACTIVENESS, ZONE DISTINCTION, EDUCATION QUALITY

  19. Quantitative Literacy Interventions at University of Cape Town: Effects of Separation from Academic Disciplines

    Directory of Open Access Journals (Sweden)

    Vera Frith

    2012-01-01

    Full Text Available The aim of the Numeracy Centre at the University of Cape Town is to develop students’ quantitative literacy (QL in a manner consistent with their programmes of study and intended roles in the community. Our theoretical perspective on the nature of QL is in line with that of the New Literacies Studies and sees academic QL as practices in different academic disciplinary contexts. This means that for us the ideal curriculum structure for developing QL would fully integrate it into the teaching of the disciplines. This is in practice not achievable in most cases, especially since many students do not have the necessary foundations of mathematical and statistical knowledge and skills. The unavoidable deviation from the ideal curriculum structure presents challenges to the design of QL interventions. Two illustrative examples which display different degrees of separation from the disciplinary teaching are described and discussed. This discussion is based on lecturers’ reflections on the teaching experience and on student evaluations. The ‘stand-alone’ QL course for Humanities and Law students, which uses a context-based approach, is the least integrated with the disciplinary curriculum, and presents challenges in terms of tensions in the classroom between the contexts and the mathematical and statistical content, as well as challenges in terms of student motivation. The QL intervention for medical students is more closely integrated into the medical curriculum and presents fewer challenges. Both interventions are intended to provide ‘foundations’ in terms of QL and suffer from difficulties in providing students with authentic motivation.

  20. Use of a plant level logic model for quantitative assessment of systems interactions

    International Nuclear Information System (INIS)

    Chu, B.B.; Rees, D.C.; Kripps, L.P.; Hunt, R.N.; Bradley, M.

    1985-01-01

    The Electric Power Research Institute (EPRI) has sponsored a research program to investigate methods for identifying systems interactions (SIs) and for evaluating their importance. Phase I of the EPRI research project focused on the evaluation of methods for identifying SIs. The major results of the Phase I activities are the documentation of four different methodologies for identifying potential SIs and the development of guidelines for performing an effective plant walkdown in support of an SI analysis. Phase II of the project, currently being performed, is utilizing a plant-level logic model of a pressurized water reactor (PWR) to determine the quantitative importance of identified SIs. In Phase II, previously reported events involving interactions between systems were screened and selected on the basis of their relevance to the Baltimore Gas and Electric (BG&E) Calvert Cliffs Nuclear Power Plant design and their perceived potential safety significance. Selected events were then incorporated into the BG&E plant-level GO logic model. The model is being exercised to calculate the relative importance of these events. Five previously identified event scenarios, extracted from licensee event reports (LERs), are being evaluated during the course of the study. A key feature of the Phase II approach is the use of a logic model in a manner that effectively evaluates the impact of events at both the system level and the plant level for the mitigation of transients. Preliminary study results indicate that the developed methodology can be a viable and effective means of determining the quantitative significance of SIs.

  1. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.
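The simulation idea can be caricatured in a few lines: genomes are tuples of epitope states, replication introduces mutations, and the immune response clears any genome that still carries a recognizable wild-type epitope. Everything here (three binary epitopes, the mutation and clearance probabilities, the capacity cap) is an illustrative assumption, not the authors' parameterization:

```python
import random

# Toy quasispecies-escape sketch inspired by the abstract; all parameters are
# illustrative assumptions, not the paper's fitted values.

random.seed(0)

N_EPITOPES = 3          # epitopes tracked in the genome (0 = wild-type, 1 = mutant)
MU = 0.01               # per-epitope mutation probability per replication
KILL = 0.6              # clearance probability if any wild-type epitope remains
CAPACITY = 5000         # cap on population size

def step(population):
    """One generation: replication with mutation, then immune clearance."""
    offspring = []
    for genome in population:
        child = tuple(e ^ (random.random() < MU) for e in genome)
        offspring.extend([genome, child])
    # Genomes with all epitopes mutated have escaped recognition entirely.
    survivors = [g for g in offspring
                 if all(g) or random.random() > KILL]
    return survivors[:CAPACITY]

pop = [(0,) * N_EPITOPES] * 10   # wild-type founder viruses
for _ in range(30):
    pop = step(pop)
escaped = sum(all(g) for g in pop)
print(len(pop), escaped)
```

Depending on the draw, the population is either cleared (empty list) or dominated by fully escaped genomes, mirroring the two regimes described in the abstract.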

  2. Comparison on information-seeking behavior of postgraduated students in Isfahan University of Medical Sciences and University of Isfahan in writing dissertation based on Kuhlthau model of information search process.

    Science.gov (United States)

    Abedi, Mahnaz; Ashrafi-Rizi, Hasan; Zare-Farashbandi, Firoozeh; Nouri, Rasoul; Hassanzadeh, Akbar

    2014-01-01

    Information-seeking behavior has been one of the main focuses of researchers seeking to identify and solve the problems users face in information retrieval. The aim of this research is to compare the information-seeking behavior of postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan when writing their dissertations, based on the Kuhlthau model of the information search process, in 2012. The research method was a survey, and the data collection tool was the Narmenji questionnaire. The statistical population comprised all postgraduate students of Isfahan University of Medical Sciences and the University of Isfahan. The sample size was 196 people, and sampling was stratified random. The statistical analyses were descriptive (mean and frequency) and inferential (independent t test and Pearson's correlation), performed with SPSS 20. The findings showed that students of Isfahan University of Medical Sciences followed 20% of the steps of this model in order, while students of the University of Isfahan did not follow the model. Significant differences between the students of the two universities were found in the feelings aspect of the first (Initiation) and sixth (Presentation) stages and in actions (across all stages). There was a significant relationship between gender and both the fourth stage (Formulation) and the total feelings score of the Kuhlthau model. There was also a significant inverse relationship between feelings in the third stage (Exploration) and the students' age. The results showed that, in writing dissertations, there were major differences between students of the two universities in following the Kuhlthau model, with significant differences in some of the stages of feelings and actions of their information-seeking behavior.

  3. Foundations for quantitative microstructural models to track evolution of the metallurgical state during high purity Nb cavity fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, Thomas R [Michigan State University; Wright, Neil T [Michigan State University; Compton, Chris C [Facility for Rare Isotope Beams

    2014-03-15

    The goal of the Materials Science SRF Cavity Group of Michigan State University and the National Superconducting Cyclotron Laboratory has been (and continues to be) to understand quantitatively the effects of process history on functional properties. These relationships were assessed via studies on Nb samples and cavity parts subjected to various combinations of forming processes, welding, heat treatments, and surface preparation. A primary focus was on large-grain cavity-building strategies. Effects of processing operations and exposure to hydrogen on the thermal conductivity have been identified in single- and bi-crystal samples, showing that the thermal conductivity can be altered by a factor of 5 depending on process history. Characterization of single-crystal tensile samples shows a strong effect of crystal orientation on deformation resistance and shape changes. Large-grain half cells were examined to characterize defect content and surface damage effects, which provided quantitative information about the depth of damage layers produced by forming.

  4. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced that fully utilizes a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Owing to the database, the presented method can also adjust the models automatically using a multi-thread grid parameter-searching technique. Furthermore, a candidate-competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and the unseen target curve. The ML-based method provides a robust, reproducible, and user-independent solution for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
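The contrast between iterative fitting and a database-driven search can be sketched as follows. The one-tissue compartment model, the toy input function, the noise level, and the use of a parameter grid as a stand-in for the reference database are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: iterative fitting (IF) vs. a precomputed-database search for kinetic
# parameters of a noisy time-activity curve (TAC). All models and values are
# illustrative assumptions.

t = np.linspace(0, 60, 121)               # minutes
cp = np.exp(-t / 10.0) * t                # toy plasma input function

def one_tissue(t, k1, k2):
    """C_t(t) = k1 * integral cp(s) exp(-k2 (t - s)) ds, via discrete convolution."""
    dt = t[1] - t[0]
    return k1 * np.convolve(cp, np.exp(-k2 * t))[: t.size] * dt

rng = np.random.default_rng(0)
true_k1, true_k2 = 0.2, 0.1
tac = one_tissue(t, true_k1, true_k2) + rng.normal(0, 0.05, t.size)

# Iterative fitting: nonlinear least squares over (k1, k2)
(k1_if, k2_if), _ = curve_fit(one_tissue, t, tac, p0=(0.1, 0.05))

# Database stand-in: pick the nearest curve from a precomputed parameter grid
grid = [(k1, k2) for k1 in np.linspace(0.05, 0.4, 15)
                 for k2 in np.linspace(0.02, 0.3, 15)]
k1_ml, k2_ml = min(grid, key=lambda p: np.sum((one_tissue(t, *p) - tac) ** 2))
print(k1_if, k2_if, k1_ml, k2_ml)
```

The grid search cannot overfit beyond its discretization, which loosely mirrors the robustness argument in the abstract, while the IF route attains higher precision on clean data.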

  5. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced that fully utilizes a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Owing to the database, the presented method can also adjust the models automatically using a multi-thread grid parameter-searching technique. Furthermore, a candidate-competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and the unseen target curve. The ML-based method provides a robust, reproducible, and user-independent solution for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  6. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research, including the analysis of high-resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These analysis, modelling and simulation techniques allow the control and functionality of devices developed from the materials under study to be optimized, and have been tested using data obtained from experimental samples.
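One core step in such pipelines, locating atomic-column positions as local intensity maxima in a high-resolution image, can be sketched as follows. The synthetic Gaussian lattice, window size, and threshold are illustrative assumptions, not the Cadiz group's actual algorithm:

```python
import numpy as np
from scipy.ndimage import maximum_filter

# Toy sketch: detect atomic-column positions as local maxima of a synthetic
# HRTEM-like intensity image. Lattice geometry and thresholds are illustrative.

n, spacing, sigma = 64, 16, 2.0
y, x = np.mgrid[0:n, 0:n]
image = np.zeros((n, n))
for cy in range(spacing // 2, n, spacing):        # perfect square lattice
    for cx in range(spacing // 2, n, spacing):
        image += np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

# A pixel is a column position if it equals the max of its neighborhood and
# rises clearly above the background.
local_max = (image == maximum_filter(image, size=spacing // 2)) & (image > 0.5)
columns = np.argwhere(local_max)                  # (row, col) of each atomic column
print(len(columns))                               # → 16
```

On real data the same idea is typically followed by sub-pixel refinement (e.g. fitting a 2D Gaussian around each detected maximum) before strain or composition maps are computed.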

  7. Exact solutions, finite time singularities and non-singular universe models from a variety of Λ(t) cosmologies

    Science.gov (United States)

    Pan, Supriya

    2018-01-01

    Cosmological models with a time-dependent Λ (written Λ(t)) have been investigated widely in the literature. Models whose background dynamics can be solved analytically are of special interest. Additionally, the occurrence of past or future singularities at finite cosmic time in a specific model calls for a generic test of its viability against current observations. Following this, in this work we consider a variety of Λ(t) models, focusing on their evolution and singular behavior. We find that a series of models in this class can be solved exactly when the background universe is described by a spatially flat Friedmann-Lemaître-Robertson-Walker (FLRW) line element. The solutions, expressed in terms of the scale factor of the FLRW universe, offer different universe models, such as power-law expansion, oscillating universes, and singularity-free universes. However, we also notice that a large number of the models in this series permit past or future cosmological singularities at finite cosmic time. Finally, we note that the avoidance of future singularities is possible for certain models under some specific restrictions.
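For orientation, the class of models discussed here is governed by the standard background equations for a spatially flat FLRW universe with a running vacuum term. The form below is the generic one used in the Λ(t) literature, given as a sketch (sign and factor conventions vary, and the paper's specific Λ(t) parametrizations are not reproduced):

```latex
% Background equations for a spatially flat FLRW universe with running \Lambda(t);
% \rho is the matter/radiation density with equation of state p = w\rho.
\begin{align}
  H^2 &= \frac{8\pi G}{3}\,\rho + \frac{\Lambda(t)}{3}, \\
  \dot{\rho} + 3H(1+w)\,\rho &= -\frac{\dot{\Lambda}(t)}{8\pi G}.
\end{align}
```

Exact solutions then correspond to choices of Λ(t) for which this coupled system integrates to a closed-form scale factor a(t), with H = ȧ/a; singularities at finite cosmic time show up as divergences of a, H, or their derivatives.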

  8. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing documented CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  9. Correlations of 3T DCE-MRI Quantitative Parameters with Microvessel Density in a Human-Colorectal-Cancer Xenograft Mouse Model

    International Nuclear Information System (INIS)

    Ahn, Sung Jun; An, Chan Sik; Koom, Woong Sub; Song, Ho Taek; Suh, Jin Suck

    2011-01-01

    To investigate the correlation between quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters and microvessel density (MVD) in a human-colon-cancer xenograft mouse model using 3 Tesla MRI. A human-colon-cancer xenograft model was produced by subcutaneously inoculating 1 × 10^6 DLD-1 human-colon-cancer cells into the right hind limbs of 10 mice. The tumors were allowed to grow for two weeks and then assessed using MRI. DCE-MRI was performed by tail-vein injection of 0.3 mmol/kg of gadolinium. A region of interest (ROI) was drawn at the midpoint along the z-axis of each tumor, and a Tofts model analysis was performed. The quantitative parameters (Ktrans, Kep and Ve) were calculated for both the whole transverse ROI and the hotspot ROI of the tumor. Immunohistochemical microvessel staining was performed and analyzed according to Weidner's criteria in the corresponding MRI sections. Additional hematoxylin and eosin staining was performed to evaluate tumor necrosis. The Mann-Whitney test and Spearman's rho correlation analysis were performed to test for correlations between the quantitative parameters, necrosis, and MVD. The whole transverse ROI of the tumor showed no significant relationship between MVD values and quantitative DCE-MRI parameters. In the hotspot ROI, the difference in MVD between the low and high groups of Ktrans and Kep was of marginal statistical significance (p = 0.06 and 0.07, respectively). Also, Ktrans and Kep were found to have an inverse relationship with MVD (r = -0.61, p = 0.06 for Ktrans; r = -0.60, p = 0.07 for Kep). Quantitative analysis of T1-weighted DCE-MRI using a hotspot ROI may provide a better histologic match than a whole transverse-section ROI. Within the hotspots, Ktrans and Kep tend to correlate inversely with MVD in this colon cancer mouse model.
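The Tofts analysis used in this study rests on the standard Tofts equation, in which the tissue concentration is the plasma input convolved with an exponential kernel and Ve = Ktrans/kep. A minimal forward-model sketch follows; the bi-exponential arterial input function and the parameter values are illustrative assumptions, not the study's measured data:

```python
import numpy as np

# Forward evaluation of the standard Tofts model; the AIF and parameter values
# below are illustrative stand-ins, not the paper's data.

def tofts(t, cp, ktrans, kep):
    """C_t(t) = Ktrans * integral_0^t Cp(s) exp(-kep (t - s)) ds (discretized)."""
    dt = t[1] - t[0]
    return ktrans * np.convolve(cp, np.exp(-kep * t))[: t.size] * dt

t = np.linspace(0, 5, 301)                                    # minutes
cp = 3.99 * np.exp(-0.144 * t) + 4.78 * np.exp(-0.0111 * t)   # toy bi-exp AIF
ktrans, kep = 0.25, 0.625                                     # 1/min, illustrative
ct = tofts(t, cp, ktrans, kep)
ve = ktrans / kep                                             # Ve = Ktrans / kep
print(round(ve, 3))                                           # → 0.4
```

Fitting inverts this relation: Ktrans and kep are adjusted until the modeled ct matches the measured enhancement curve in each ROI, and Ve follows from their ratio.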

  10. Correlations of 3T DCE-MRI Quantitative Parameters with Microvessel Density in a Human-Colorectal-Cancer Xenograft Mouse Model

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sung Jun; An, Chan Sik; Koom, Woong Sub; Song, Ho Taek; Suh, Jin Suck [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2011-11-15

    To investigate the correlation between quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters and microvessel density (MVD) in a human-colon-cancer xenograft mouse model using 3 Tesla MRI. A human-colon-cancer xenograft model was produced by subcutaneously inoculating 1 × 10^6 DLD-1 human-colon-cancer cells into the right hind limbs of 10 mice. The tumors were allowed to grow for two weeks and then assessed using MRI. DCE-MRI was performed by tail-vein injection of 0.3 mmol/kg of gadolinium. A region of interest (ROI) was drawn at the midpoint along the z-axis of each tumor, and a Tofts model analysis was performed. The quantitative parameters (Ktrans, Kep and Ve) were calculated for both the whole transverse ROI and the hotspot ROI of the tumor. Immunohistochemical microvessel staining was performed and analyzed according to Weidner's criteria in the corresponding MRI sections. Additional hematoxylin and eosin staining was performed to evaluate tumor necrosis. The Mann-Whitney test and Spearman's rho correlation analysis were performed to test for correlations between the quantitative parameters, necrosis, and MVD. The whole transverse ROI of the tumor showed no significant relationship between MVD values and quantitative DCE-MRI parameters. In the hotspot ROI, the difference in MVD between the low and high groups of Ktrans and Kep was of marginal statistical significance (p = 0.06 and 0.07, respectively). Also, Ktrans and Kep were found to have an inverse relationship with MVD (r = -0.61, p = 0.06 for Ktrans; r = -0.60, p = 0.07 for Kep). Quantitative analysis of T1-weighted DCE-MRI using a hotspot ROI may provide a better histologic match than a whole transverse-section ROI. Within the hotspots, Ktrans and Kep tend to correlate inversely with MVD in this colon cancer mouse model.

  11. A Universal Model for the Normative Evaluation of Internet Information.

    NARCIS (Netherlands)

    Spence, E.H.

    2009-01-01

    Beginning with the initial premise that as the Internet has a global character, the paper will argue that the normative evaluation of digital information on the Internet necessitates an evaluative model that is itself universal and global in character (I agree, therefore, with Gorniak-Kocikowska’s

  12. Stochastic dynamics of an inflationary model and initial distribution of universes

    International Nuclear Information System (INIS)

    Nambu, Yasusada.

    1989-01-01

    We investigate the stationary solution of the modified Fokker-Planck equation that governs the global dynamics of inflation. In contrast to the original Fokker-Planck equation, which describes a single Hubble-horizon-size region, we find that a normalizable stationary solution can exist for the modified Fokker-Planck equation, which describes many Hubble-horizon-size regions. For a chaotic inflationary model with the potential λψ^{2n}, we obtain the initial distribution of classical universes using this solution and discuss its physical meaning. In particular, for n = 2 this distribution obeys a power law, and the classical universes created from the Planck energy region form a fractal structure. In the other cases, n ≠ 2, the creation of large classical universes is strongly suppressed. (author)
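For reference, the Fokker-Planck equation of stochastic inflation underlying such an analysis has the standard single-region form below. This is a sketch from the general literature; the paper's modified equation for many horizon-size regions differs in detail, and factor conventions vary between authors:

```latex
% Standard stochastic-inflation Fokker--Planck equation for the probability
% distribution P(\psi, t) of the coarse-grained inflaton (conventions vary):
\begin{equation}
  \frac{\partial P}{\partial t}
  = \frac{\partial}{\partial \psi}\!\left[ \frac{V'(\psi)}{3H}\, P \right]
  + \frac{\partial^2}{\partial \psi^2}\!\left[ \frac{H^3}{8\pi^2}\, P \right],
  \qquad V(\psi) = \lambda\,\psi^{2n}.
\end{equation}
```

The first term is the classical drift down the potential and the second is the quantum diffusion of horizon-scale fluctuations; stationary solutions balance the two.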

  13. Universal model for water costs of gas exchange by animals and plants.

    Science.gov (United States)

    Woods, H Arthur; Smith, Jennifer N

    2010-05-04

    For terrestrial animals and plants, a fundamental cost of living is water vapor lost to the atmosphere during exchange of metabolic gases. Here, by bringing together previously developed models for specific taxa, we integrate properties common to all terrestrial gas exchangers into a universal model of water loss. The model predicts that water loss scales to gas exchange with an exponent of 1 and that the amount of water lost per unit of gas exchanged depends on several factors: the surface temperature of the respiratory system near the outside of the organism, the gas consumed (oxygen or carbon dioxide), the steepness of the gradients for gas and vapor, and the transport mode (convective or diffusive). Model predictions were largely confirmed by data on 202 species in five taxa--insects, birds, bird eggs, mammals, and plants--spanning nine orders of magnitude in rate of gas exchange. Discrepancies between model predictions and data seemed to arise from biologically interesting violations of model assumptions, which emphasizes how poorly we understand gas exchange in some taxa. The universal model provides a unified conceptual framework for analyzing exchange-associated water losses across taxa with radically different metabolic and exchange systems.
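A back-of-envelope version of the purely diffusive case can be written down directly from Fick's law: when water vapor and the metabolic gas share one exchange pathway, the water lost per unit gas exchanged scales as the ratio of diffusivity times partial-pressure gradient. The simple form and all numbers below are rough illustrative values, not the paper's model or data:

```python
# Back-of-envelope Fick's-law ratio of water lost per unit CO2 gained for a
# diffusive exchanger sharing one pathway; all numbers are rough illustrations.

def water_per_gas(d_water, d_gas, dp_water, dp_gas):
    """(D_w * Δp_w) / (D_g * Δp_g): mol water lost per mol gas exchanged."""
    return (d_water * dp_water) / (d_gas * dp_gas)

# Approximate diffusivities in air (~25 °C, cm^2/s) and toy gradients (kPa)
ratio = water_per_gas(d_water=0.25, d_gas=0.16,   # H2O vs CO2 in air
                      dp_water=2.0,               # humid inside, drier outside
                      dp_gas=0.04)                # atmospheric CO2 ~ 0.04 kPa
print(round(ratio, 1))  # → 78.1
```

Even this crude estimate shows why CO2-acquiring exchangers (plants) lose tens to hundreds of water molecules per molecule of gas: the CO2 gradient is tiny compared with the vapor gradient.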

  14. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  15. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    Science.gov (United States)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10^3 mJ cm^-2) and three orders of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.
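The reaction/diffusion kinetics at the heart of this problem can be caricatured by a one-dimensional monomer balance: polymerization consumes monomer where the exposure is bright, while diffusion refills the depleted fringes. The explicit scheme and every constant below are illustrative assumptions, not the characterized material:

```python
import numpy as np

# Toy 1D reaction/diffusion sketch of holographic monomer depletion under a
# sinusoidal exposure: dm/dt = D d2m/dx2 - R(x) m. All constants illustrative.

L, nx, nt = 1.0, 100, 2000            # fringe period (um), grid points, time steps
dx, dt = L / nx, 1e-4                 # step sizes (explicit scheme: D*dt/dx^2 < 0.5)
D, r0 = 0.05, 5.0                     # diffusion (um^2/s), peak polymerization rate (1/s)

x = np.arange(nx) * dx
rate = r0 * 0.5 * (1 + np.cos(2 * np.pi * x / L))   # bright fringe at x = 0
m = np.ones(nx)                                     # normalized monomer

for _ in range(nt):
    lap = (np.roll(m, 1) - 2 * m + np.roll(m, -1)) / dx**2  # periodic Laplacian
    m += dt * (D * lap - rate * m)

polymer = 1.0 - m.mean()   # fraction of monomer converted (mass-balance proxy)
print(round(polymer, 3), round(m.min(), 3))
```

The competition visible here between the writing rate r0 and the diffusive refill D sets the achievable index modulation at a given feature size, which is the trade-off the dissertation quantifies.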

  16. The Comparison of Distributed P2P Trust Models Based on Quantitative Parameters in the File Downloading Scenarios

    Directory of Open Access Journals (Sweden)

    Jingpei Wang

    2016-01-01

    Full Text Available Various P2P trust models have been proposed recently; it is necessary to develop an effective method to evaluate these trust models in order to resolve the issues of commonality (guiding newly developed trust models in theory) and individuality (assisting a decision maker in choosing an optimal trust model to implement in a specific context). A new method for analyzing and comparing P2P trust models, based on hierarchical parameter quantization in file-downloading scenarios, is proposed in this paper. Several parameters are extracted from the functional attributes and quality features of the trust relationship, as well as from the requirements of the specific network context and the evaluators. Several distributed P2P trust models are analyzed quantitatively, with the extracted parameters arranged into a hierarchical model. A fuzzy inference method is applied to the hierarchical parameter model to fuse the evaluated values of the candidate trust models, and the relatively optimal one is then selected based on the sorted overall quantitative values. Finally, analyses and simulations are performed. The results show that the proposed method is reasonable and effective compared with previous algorithms.
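The hierarchical scoring step (without the fuzzy-inference fusion) can be sketched as a weighted aggregation of quantified parameters per candidate. The candidate names are real P2P trust models, but the parameter set, weights, and scores are invented for illustration, not the paper's evaluation:

```python
# Toy sketch of hierarchical-parameter comparison: score candidate trust models
# on quantified parameters, weight them, and rank. All weights and scores are
# invented for illustration; the paper additionally fuses scores with fuzzy
# inference, which is omitted here.

WEIGHTS = {"accuracy": 0.4, "convergence": 0.2, "overhead": 0.2, "robustness": 0.2}

CANDIDATES = {   # parameter scores in [0, 1], illustrative
    "EigenTrust": {"accuracy": 0.80, "convergence": 0.7, "overhead": 0.5, "robustness": 0.60},
    "PeerTrust":  {"accuracy": 0.70, "convergence": 0.6, "overhead": 0.7, "robustness": 0.70},
    "PowerTrust": {"accuracy": 0.75, "convergence": 0.8, "overhead": 0.6, "robustness": 0.65},
}

def overall(scores):
    """Weighted sum of a candidate's parameter scores."""
    return sum(WEIGHTS[p] * v for p, v in scores.items())

ranked = sorted(CANDIDATES, key=lambda m: overall(CANDIDATES[m]), reverse=True)
print(ranked[0])  # → PowerTrust
```

Swapping the weight vector models different evaluator requirements, which is exactly the "individuality" use case the abstract describes.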

  17. Universal squash model for optical communications using linear optics and threshold detectors

    International Nuclear Information System (INIS)

    Fung, Chi-Hang Fred; Chau, H. F.; Lo, Hoi-Kwong

    2011-01-01

    Transmission of photons through open-air or optical fibers is an important primitive in quantum-information processing. Theoretical descriptions of this process often consider single photons as information carriers and thus fail to accurately describe experimental implementations where any number of photons may enter a detector. It has been a great challenge to bridge this big gap between theory and experiments. One powerful method for achieving this goal is by conceptually squashing the received multiphoton states to single-photon states. However, until now, only a few protocols admit a squash model; furthermore, a recently proven no-go theorem appears to rule out the existence of a universal squash model. Here we show that a necessary condition presumed by all existing squash models is in fact too stringent. By relaxing this condition, we find that, rather surprisingly, a universal squash model actually exists for many protocols, including quantum key distribution, quantum state tomography, Bell's inequality testing, and entanglement verification.

  18. Formation of a "child" universe in an inflationary cosmological model

    International Nuclear Information System (INIS)

    Holcomb, K.A.; Park, S.J.; Vishniac, E.T.

    1989-01-01

    The evolution of a flat, spherically symmetric cosmological model, containing radiation and an inhomogeneous scalar field, is simulated numerically to determine whether the inhomogeneity could cause a "child" universe, connected by a wormhole to the external universe, to form. The gravitational and field quantities were computed self-consistently by means of the techniques of numerical relativity. Although we were unable to follow the process to its completion, preliminary indications are that the "budding" phenomenon could occur under very general initial conditions, as long as the scalar field is sufficiently inhomogeneous that the wormhole forms before the inflation is damped by the expansion of the background spacetime.

  19. A quantitative microbial risk assessment model for Listeria monocytogenes in RTE sandwiches

    DEFF Research Database (Denmark)

    Tirloni, E.; Stella, S.; de Knegt, Leonardo

    2018-01-01

    A Quantitative Microbial Risk Assessment (QMRA) was performed to estimate the expected number of listeriosis cases due to the consumption, on the last day of shelf life, of 20 000 servings of multi-ingredient sandwiches produced by a medium-scale food producer in Italy, by different population … within each serving. Then, two dose-response models were alternatively applied: the first used a fixed r value for each of the three population groups, while the second considered a variable r value (lognormal distribution), taking into account the variability in strain virulence and the susceptibility of different host subpopulations. The stochastic model predicted zero cases for the total population for both substrates using the fixed-r approach, while 3 cases were expected when higher variability (in virulence and susceptibility) was considered in the model; the number of cases increased to 45…
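The fixed-r versus variable-r comparison in this record can be sketched as a small Monte Carlo with an exponential dose-response model, P(ill) = 1 - exp(-r × dose). The dose distribution and both r choices below are illustrative assumptions, not the paper's fitted inputs:

```python
import math
import random

# Toy QMRA sketch: sample a dose per serving, apply an exponential
# dose-response, and sum expected cases. All distributions are illustrative.

random.seed(1)
SERVINGS = 20_000
R_FIXED = 1e-12                     # illustrative fixed dose-response parameter

def sample_dose():
    """CFU of L. monocytogenes per serving at end of shelf life (toy lognormal)."""
    return random.lognormvariate(10.0, 2.0)

def expected_cases(r_sampler):
    """Sum of per-serving illness probabilities P = 1 - exp(-r * dose)."""
    total = 0.0
    for _ in range(SERVINGS):
        total += 1.0 - math.exp(-r_sampler() * sample_dose())
    return total

fixed = expected_cases(lambda: R_FIXED)
variable = expected_cases(lambda: random.lognormvariate(-27.0, 3.0))
print(fixed, variable)
```

With a heavy-tailed r, rare high-virulence draws dominate the case count, which is the qualitative effect the abstract reports when variability in virulence and susceptibility is included.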

  20. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune"; "The Temperature on Titan"; "Rocks in the Early Solar System"; "Comets Hitting Planets"; "Ages of Meteorites"; "How Flat are Saturn's Rings?"; "Tides of the Sun and Moon on the Earth"; "The Gliese 581 Solar System"; "Buckets in the Rain"; "How Hot, Bright and Big is Betelgeuse?"; "Bombs and the Sun"; "What Forms Stars?"; "Lifetimes of Cars and Stars"; "The Mass of the Milky"; "How Old is the Universe?"; and "Is the Universe Speeding up or Slowing Down?"

  1. Rapid and quantitative detection of C-reactive protein based on quantum dots and immunofiltration assay

    Directory of Open Access Journals (Sweden)

    Zhang PF

    2015-09-01

    Full Text Available Pengfei Zhang,1,* Yan Bao,1,* Mohamed Shehata Draz,2,3,* Huiqi Lu,1 Chang Liu,1 Huanxing Han1; 1Center for Translational Medicine, Changzheng Hospital, Second Military Medical University, Shanghai, People’s Republic of China; 2Zhejiang-California International Nanosystems Institute, Zhejiang University, Hangzhou, Zhejiang, People’s Republic of China; 3Faculty of Science, Tanta University, Tanta, Egypt; *These authors contributed equally to this work. Abstract: Convenient and rapid immunofiltration assays (IFAs) enable on-site “yes” or “no” determination of disease markers. However, traditional IFAs are commonly qualitative or semi-quantitative and are very limited for the efficient testing of samples in field diagnostics. Here, we overcome these limitations by developing a quantum-dot (QD)-based fluorescent IFA for the quantitative detection of C-reactive protein (CRP). CRP, a well-known diagnostic marker for acute viral and bacterial infections, was used as a model analyte to demonstrate the performance and sensitivity of our QD-based IFA. QDs capped with both polyethylene glycol (PEG) and glutathione were used as fluorescent labels. The surface PEG layer, which reduced non-specific protein interactions, in conjunction with the inherent optical properties of QDs, resulted in a lower background signal, increased sensitivity, and the ability to detect CRP down to 0.79 mg/L with only a 5 µL serum sample. In addition, the developed assay is simple and fast, and can quantitatively detect CRP over a range extending up to 200 mg/L. Clinical test results of our QD-based IFA correlate well with traditional latex-enhanced immunoagglutination. The proposed QD-based fluorescent IFA is very promising and could potentially be adopted for multiplexed immunoassays and field point-of-care testing. Keywords: C-reactive proteins, point-of-care test, glutathione-capped QDs, PEGylation

  2. THE MODEL OF LIFELONG EDUCATION IN A TECHNICAL UNIVERSITY AS A MULTILEVEL EDUCATIONAL COMPLEX

    Directory of Open Access Journals (Sweden)

    Svetlana V. Sergeyeva

    2016-06-01

    Full Text Available Introduction: the current leading trend of educational development is its continuity. Institutions of higher education, as multilevel educational complexes, nurture favourable conditions for realising the strategy of lifelong education. Today a technical university offering the training of future engineers faces the topical issue of creating a multilevel educational complex. Materials and Methods: this paper draws on modern Russian and foreign scientific literature on lifelong education. The authors used theoretical methods of scientific research: system-structural analysis, synthesis, modelling, and the analysis and generalisation of concepts. Results: the paper presents a model of lifelong education developed by the authors for a technical university as a multilevel educational complex. It is realised through a set of principles: multi-level structure and continuity, integration, conformity and quality, mobility, anticipation, openness, social partnership and feedback. In accordance with the purpose, objectives and principles, the content part of the model is formed. The syllabi following the described model are run in accordance with the training levels undertaken by a technical university as a multilevel educational complex. All syllabi are implemented gradually; in this regard, the authors highlight three phases: diagnostic; constructive and transformative; and assessing. Discussion and Conclusions: the expected result of the created model of lifelong education in a technical university as a multilevel educational complex is a graduate trained for effective professional activity, competitive, and prepared for and sought after on the regional labour market.

  3. Study of the quantitative analysis approach of maintenance by the Monte Carlo simulation method

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    2007-01-01

    This study examines the quantitative valuation of the maintenance activities of a nuclear power plant by the Monte Carlo simulation method. To this end, the concept of the quantitative valuation of maintenance developed in the Japan Society of Maintenology and the International Institute of Universality (IUU) was reviewed and organised. A basic examination of the quantitative valuation of maintenance was then carried out on a simple feed-water system by the Monte Carlo simulation method. (author)
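The abstract does not give the simulation details, but the general shape of such a study can be sketched: sample random failure and repair times for redundant components over a mission time and estimate system unavailability. The rates, repair times, and two-pump configuration below are illustrative assumptions, not values from the paper:

```python
import random

# Hypothetical two-pump feed-water system; one pump suffices for operation.
FAILURE_RATE = 1e-3   # failures per hour, per pump (assumed)
REPAIR_HOURS = 24.0   # mean repair time in hours (assumed)
MISSION = 8760.0      # mission time: one year

def simulate_pump():
    """Return the total downtime (hours) of one pump over the mission time."""
    t, down = 0.0, 0.0
    while t < MISSION:
        t += random.expovariate(FAILURE_RATE)          # time to next failure
        if t >= MISSION:
            break
        repair = random.expovariate(1.0 / REPAIR_HOURS)
        down += min(repair, MISSION - t)               # clip at end of mission
        t += repair
    return down

random.seed(0)
trials = 2000
# The system is down only when both pumps are down; approximate the overlap of
# two independent pumps by the product of their downtime fractions per trial.
unavail = sum((simulate_pump() / MISSION) * (simulate_pump() / MISSION)
              for _ in range(trials)) / trials
print(f"estimated system unavailability: {unavail:.2e}")
```

Maintenance policies (e.g. shorter repair times or preventive replacement) can then be compared by rerunning the same simulation with different parameters.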

  4. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong

    2013-01-01

    Carbonate matrix acidization extends a well's effective drainage radius by dissolving rock and forming conductive channels (wormholes) from the wellbore. Wormholing is a dynamic process that involves a balance between the acid injection rate and the reaction rate. Generally, the injection rate is well defined where injection profiles can be controlled, whereas the reaction rate can be difficult to obtain due to its complex dependency on interstitial velocity, fluid composition, rock surface properties, etc. Conventional wormhole propagation models largely ignore the impact of reaction products; when such models are implemented in a job design, this can result in significant errors in the treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  5. Compact baby universe model in ten dimension and probability function of quantum gravity

    International Nuclear Information System (INIS)

    Yan Jun; Hu Shike

    1991-01-01

    The quantum probability functions are calculated for a ten-dimensional compact baby universe model. The authors find that the probability for the Yang-Mills baby universe to undergo a spontaneous compactification down to a four-dimensional spacetime is greater than that to remain in the original homogeneous multidimensional state. Some questions about the large-wormhole catastrophe are also discussed

  6. Microscopic universality of complex matrix model correlation functions at weak non-Hermiticity

    International Nuclear Information System (INIS)

    Akemann, G.

    2002-01-01

    The microscopic correlation functions of non-chiral random matrix models with complex eigenvalues are analyzed for a wide class of non-Gaussian measures. In the large-N limit of weak non-Hermiticity, where N is the size of the complex matrices, we can prove that all k-point correlation functions including an arbitrary number of Dirac mass terms are universal close to the origin. To this aim we establish the universality of the asymptotics of orthogonal polynomials in the complex plane. The universality of the correlation functions then follows from that of the kernel of orthogonal polynomials and a mapping of massive to massless correlators

  7. Fuzzy Universal Model Approximator for Distributed Solar Collector Field Control

    KAUST Repository

    Elmetennani, Shahrazed

    2014-07-01

    This paper deals with the control of concentrating parabolic solar collectors by forcing the outlet oil temperature to track a set reference. A fuzzy universal approximate model is introduced in order to accurately reproduce the behavior of the system dynamics. The proposed model is a low order state space representation derived from the partial differential equation describing the oil temperature evolution using fuzzy transform theory. The resulting set of ordinary differential equations simplifies the system analysis and the control law design and is suitable for real time control implementation. Simulation results show good performance of the proposed model.
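The fuzzy-transform reduction the abstract describes can be illustrated in miniature: project a spatial profile onto a few overlapping triangular membership functions (a Ruspini partition), so that each fuzzy node becomes one lumped state of a low-order model. Everything below (the profile, the number of nodes, the values) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

# Normalized collector length and a stand-in outlet temperature profile (K).
x = np.linspace(0.0, 1.0, 101)
profile = 300.0 + 50.0 * np.sin(np.pi * x)

# Six triangular membership functions forming a partition of unity.
centers = np.linspace(0.0, 1.0, 6)        # 6 fuzzy nodes -> 6 lumped states
width = centers[1] - centers[0]
A = np.clip(1.0 - np.abs(x[:, None] - centers[None, :]) / width, 0.0, 1.0)

# Direct F-transform: weighted average of the profile over each basis function.
F = (A * profile[:, None]).sum(axis=0) / A.sum(axis=0)

# Inverse F-transform: low-order reconstruction from the 6 components.
reconstructed = A @ F
error = np.max(np.abs(reconstructed - profile))
print(f"6-component reconstruction, max error = {error:.2f} K")
```

In the reduced model, the PDE's dynamics are then written as ordinary differential equations for the six components F rather than for the full spatial profile.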

  8. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that contains the information flow. The uncertainty of the information is then quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload
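The information-theoretic quantification rests on Shannon entropy: the uncertainty of the stream presented to the operator can be compared with that of the stream the operator actually processes, the difference being the information reduction. The alarm sequences below are made-up toy data, not from the paper:

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits per symbol) of a sequence of observed symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical scenario: annunciator states presented to the operator vs.
# the subset the overloaded operator actually attends to.
presented = ["A", "A", "B", "C", "A", "B", "D", "A", "C", "B", "A", "A"]
acted_on  = ["A", "A", "B", "A", "B", "A", "A", "A"]

h_in = entropy(presented)    # uncertainty of the input stream
h_out = entropy(acted_on)    # uncertainty of the processed stream
print(f"input: {h_in:.2f} bits, processed: {h_out:.2f} bits, "
      f"reduction: {h_in - h_out:.2f} bits/symbol")
```

Conant's decomposition extends this idea to multi-stage systems by partitioning the total information flow between stages.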

  9. New management models in universities and in teaching work in Colombia

    Directory of Open Access Journals (Sweden)

    Omar Cabrales Salazar

    2018-03-01

    Full Text Available This article sets out the theoretical foundations of the organizational models that have been implemented in Colombian universities in recent years. The discussion is based on a contemporary perspective on the educational system, called cognitive or academic capitalism, initially introduced by Slaughter and Leslie (1997) to describe the new research dynamics that universities have incorporated from companies as a new way of valuing the capital represented by research and innovation, so that what is generated within universities can be commercialized and alienated more easily in the market.

  10. Virtual Models of European Universities

    DEFF Research Database (Denmark)

    Pedersen, Sanya Gertsen

    2003-01-01

    The study provides a detailed report on the current and possible future use of ICT by European universities for educational and organisational purposes. The report presents: • A general description of the current situation regarding the use of ICT in EU universities in both the educational and the organisational setting. • An in-depth study of selected institutions through case studies. • A future-oriented analysis. • A set of recommendations for future action.

  11. Does Environmental Sustainability Play a Role in the Adoption of Smart Card Technology at Universities in Taiwan: An Integration of TAM and TRA

    Directory of Open Access Journals (Sweden)

    Ching-Wei Ho

    2015-08-01

    Full Text Available Smart cards are able to store and protect relatively large amounts of data. When applied in universities, they can act as multi-purpose, multi-function and smart ID cards. This would avoid the waste of resources and maintain environmental sustainability. This study proposes a model that integrates Technology Acceptance Model and Theory of Reasoned Action into a framework incorporating the notion of environmental concern in order to explore the factors that affect students’ behavioral intention to use University Smart Cards. This study employs a quantitative method for primary data collection via a structured questionnaire for university students. The findings indicated that the perceived usefulness and subjective norm of university smart card systems have the most significant predictive power on potential users’ attitudes and intentions of adopting the card.

  12. Perceptions of Students towards ICT Competencies at the University

    Science.gov (United States)

    Torres-Gastelú, Carlos Arturo; Kiss, Gábor

    2016-01-01

    The purpose of this study is to identify the perceptions of university students towards their ICT Competencies from two universities, one in Mexico and the other in Hungary. The research type is quantitative and exploratory. The instrument consists of 14 questions related to three types of competencies: Basic, Application and Ethical. The sample…

  13. Virtual Attendance: Analysis of an Audiovisual over IP System for Distance Learning in the Spanish Open University (UNED)

    Directory of Open Access Journals (Sweden)

    Esteban Vázquez-Cano

    2013-07-01

    Full Text Available This article analyzes a system of virtual attendance, called “AVIP” (AudioVisual over Internet Protocol), at the Spanish Open University (UNED). UNED, the largest open university in Europe, is the pioneer of distance education in Spain. It currently has more than 300,000 students, 1,300 teachers, and 6,000 tutors in Spain and around the world. This university, like others, is redefining many of its academic processes to meet the new requirements of the European Higher Education Area (EHEA). Since its inception more than 30 years ago, the methodology chosen by UNED has been blended learning. Today, the university combines face-to-face tutorial sessions with new methodological proposals mediated by ICT. Through a quantitative methodology, the perception of students and tutors of the new model of virtual tutoring, called AVIP Classrooms, was analyzed. The results show that the new model greatly improves the orientation and teaching methodology of tutors. However, it requires training and new approaches to provide a more collaborative and participatory environment for students.

  14. Factors that influence utilisation of HIV/AIDS prevention methods among university students residing at a selected university campus.

    Science.gov (United States)

    Ndabarora, Eléazar; Mchunu, Gugu

    2014-01-01

    Various studies have reported that university students, who are mostly young people, rarely use existing HIV/AIDS preventive methods. Although studies have shown that young university students have a high degree of knowledge about HIV/AIDS and HIV modes of transmission, they are still not utilising the existing HIV prevention methods and still engage in risky sexual practices favourable to HIV. Some variables, such as awareness of existing HIV/AIDS prevention methods, have been associated with utilisation of such methods. The study aimed to explore factors that influence use of existing HIV/AIDS prevention methods among university students residing in a selected campus, using the Health Belief Model (HBM) as a theoretical framework. A quantitative research approach and an exploratory-descriptive design were used to describe perceived factors that influence utilisation by university students of HIV/AIDS prevention methods. A total of 335 students completed online and manual questionnaires. Study findings showed that the factors which influenced utilisation of HIV/AIDS prevention methods were mainly determined by awareness of the existing university-based HIV/AIDS prevention strategies. Most utilised prevention methods were voluntary counselling and testing services and free condoms. Perceived susceptibility and perceived threat of HIV/AIDS score was also found to correlate with HIV risk index score. Perceived susceptibility and perceived threat of HIV/AIDS showed correlation with self-efficacy on condoms and their utilisation. Most HBM variables were not predictors of utilisation of HIV/AIDS prevention methods among students. Intervention aiming to improve the utilisation of HIV/AIDS prevention methods among students at the selected university should focus on removing identified barriers, promoting HIV/AIDS prevention services and providing appropriate resources to implement such programmes.

  15. A quantitative reading of competences documents of Law new degrees.

    OpenAIRE

    Leví Orta, Genoveva del Carmen; Ramos Méndez, Eduardo

    2014-01-01

    Documents formulating the competences of degrees are key sources for the analysis, evaluation and comparison of the training profiles currently offered by different university degrees. This work aims to make a quantitative reading of the competences documents of the Law degree from various Spanish universities, based on the ideas of Content Analysis. The methodology has two phases. Firstly, a dictionary of concepts related to the components of competences is identified in the documentary corpus. Next, the corpus...

  16. Modelling Facebook Usage among University Students in Thailand: The Role of Emotional Attachment in an Extended Technology Acceptance Model

    Science.gov (United States)

    Teo, Timothy

    2016-01-01

    The aim of this study is to examine the factors that influenced the use of Facebook among university students. Using an extended technology acceptance model (TAM) with emotional attachment (EA) as an external variable, a sample of 498 students from a public-funded Thailand university were surveyed on their responses to five variables hypothesized…

  17. The relationship of document and quantitative literacy with learning styles and selected personal variables for aerospace technology students at Indiana State University

    Science.gov (United States)

    Martin, Royce Ann

    The purpose of this study was to determine the extent that student scores on a researcher-constructed quantitative and document literacy test, the Aviation Documents Delineator (ADD), were associated with (a) learning styles (imaginative, analytic, common sense, dynamic, and undetermined), as identified by the Learning Type Measure, (b) program curriculum (aerospace administration, professional pilot, both aerospace administration and professional pilot, other, or undeclared), (c) overall cumulative grade point average at Indiana State University, and (d) year in school (freshman, sophomore, junior, or senior). The Aviation Documents Delineator (ADD) was a three-part, 35 question survey that required students to interpret graphs, tables, and maps. Tasks assessed in the ADD included (a) locating, interpreting, and describing specific data displayed in the document, (b) determining data for a specified point on the table through interpolation, (c) comparing data for a string of variables representing one aspect of aircraft performance to another string of variables representing a different aspect of aircraft performance, (d) interpreting the documents to make decisions regarding emergency situations, and (e) performing single and/or sequential mathematical operations on a specified set of data. The Learning Type Measure (LTM) was a 15 item self-report survey developed by Bernice McCarthy (1995) to profile an individual's processing and perception tendencies in order to reveal different individual approaches to learning. The sample used in this study included 143 students enrolled in Aerospace Technology Department courses at Indiana State University in the fall of 1996. The ADD and the LTM were administered to each subject. Data collected in this investigation were analyzed using a stepwise multiple regression analysis technique. 
Results of the study revealed that the variables, year in school and GPA, were significant predictors of the criterion variables, document

  18. [Influence of Spectral Pre-Processing on PLS Quantitative Model of Detecting Cu in Navel Orange by LIBS].

    Science.gov (United States)

    Li, Wen-bing; Yao, Lin-tao; Liu, Mu-hua; Huang, Lin; Yao, Ming-yin; Chen, Tian-bing; He, Xiu-wen; Yang, Ping; Hu, Hui-qin; Nie, Jiang-hui

    2015-05-01

    Cu in navel orange was detected rapidly by laser-induced breakdown spectroscopy (LIBS) combined with partial least squares (PLS) quantitative analysis, and the effect of different spectral data pretreatment methods on the detection accuracy of the model was explored. Spectral data for the 52 Gannan navel orange samples were pretreated by different combinations of data smoothing, mean centering and standard normal variate transformation. The 319~338 nm wavelength section containing the characteristic spectral lines of Cu was then selected to build PLS models, and the main evaluation indexes of the models, namely the regression coefficient (r), the root mean square error of cross validation (RMSECV) and the root mean square error of prediction (RMSEP), were compared and analyzed. The three indicators of the PLS model built after 13-point smoothing and mean centering reached 0.9928, 3.43 and 3.4 respectively, and the average relative error of the prediction model was only 5.55%; in short, the calibration and prediction quality of this model was the best. The results show that by selecting the appropriate data pre-processing method, the prediction accuracy of PLS quantitative models for fruits and vegetables detected by LIBS can be improved effectively, providing a new method for the fast and accurate detection of fruits and vegetables by LIBS.
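The three pre-processing steps compared in the study can be sketched on synthetic data. The "spectra" below are fabricated for illustration (a Gaussian stand-in for a Cu line plus noise), and the smoothing is a plain 13-point moving average rather than whatever smoother the authors used:

```python
import numpy as np

# Synthetic stand-in spectra: a Gaussian "Cu line" at three intensities + noise.
rng = np.random.default_rng(42)
wavelengths = np.linspace(319, 338, 200)
line = np.exp(-((wavelengths - 327.4) ** 2) / 0.05)
spectra = np.vstack([a * line + rng.normal(0, 0.02, 200) for a in (0.5, 1.0, 1.5)])

def smooth(x, window=13):
    """13-point moving-average smoothing of each spectrum (boxcar kernel)."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, x)

def mean_center(x):
    """Subtract the mean spectrum (column-wise mean centering)."""
    return x - x.mean(axis=0)

def snv(x):
    """Standard normal variate: center and scale each spectrum individually."""
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

processed = mean_center(smooth(spectra))
print(processed.shape, processed.mean())
```

A PLS model (e.g. scikit-learn's `PLSRegression`) would then be fitted to `processed` against the reference Cu concentrations; the study's comparison amounts to repeating that fit for each pre-processing combination.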

  19. A Community-University Exchange Project Modeled after Europe's Science Shops

    Science.gov (United States)

    Tryon, Elizabeth; Ross, J. Ashleigh

    2012-01-01

    This article describes a pilot project of the Morgridge Center for Public Service at the University of Wisconsin-Madison for a new structure for community-based learning and research. It is based on the European-derived science shop model for democratizing campus-community partnerships using shared values of mutual respect and validation of…

  20. The University Model and Educational Change. SSEC Publication No. 130.

    Science.gov (United States)

    Ford, Richard B.

    In the sixties the crisis of the credibility and competence of schools resulted in the funding of programs to remedy school problems. The model for curriculum reform came from the university and, more particularly, from liberal arts departments having the capacity to improve curriculum content and teacher expertise. In a few instances attempts…

  1. A universal multilingual weightless neural network tagger via quantitative linguistics.

    Science.gov (United States)

    Carneiro, Hugo C C; Pedreira, Carlos E; França, Felipe M G; Lima, Priscila M V

    2017-07-01

    In the last decade, given the availability of corpora in several distinct languages, research on multilingual part-of-speech tagging started to grow. Amongst the novelties there is mWANN-Tagger (multilingual weightless artificial neural network tagger), a weightless neural part-of-speech tagger capable of being used for mostly-suffix-oriented languages. The tagger was subjected to corpora in eight languages of quite distinct natures and had a remarkable accuracy with very low sample deviation in every one of them, indicating the robustness of weightless neural systems for part-of-speech tagging tasks. However, mWANN-Tagger needed to be tuned for every new corpus, since each one required a different parameter configuration. For mWANN-Tagger to be truly multilingual, it should be usable for any new language with no need of parameter tuning. This article proposes a study that aims to find a relation between the lexical diversity of a language and the parameter configuration that would produce the best performing mWANN-Tagger instance. Preliminary analyses suggested that a single parameter configuration may be applied to the eight aforementioned languages. The mWANN-Tagger instance produced by this configuration was as accurate as the language-dependent ones obtained through tuning. Afterwards, the weightless neural tagger was further subjected to new corpora in languages that range from very isolating to polysynthetic ones. The best performing instances of mWANN-Tagger are again the ones produced by the universal parameter configuration. Hence, mWANN-Tagger can be applied to new corpora with no need of parameter tuning, making it a universal multilingual part-of-speech tagger. Further experiments with Universal Dependencies treebanks reveal that mWANN-Tagger may be extended and that it has potential to outperform most state-of-the-art part-of-speech taggers if better word representations are provided. Copyright © 2017 Elsevier Ltd. All rights reserved.
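The lexical diversity the study relates to parameter configuration can be proxied by simple corpus statistics such as the type-token ratio (TTR); whether the authors use TTR or a more refined measure is not stated in the abstract, so the sketch below is only indicative. Morphologically rich languages tend to show higher ratios than isolating ones, since each lemma surfaces in many word forms:

```python
# Type-token ratio: distinct word forms divided by total tokens.
def type_token_ratio(tokens):
    return len(set(tokens)) / len(tokens)

# Toy examples: an isolating-style English sentence vs. a list of Finnish-like
# inflected forms of one noun (illustrative, not a real corpus).
isolating = "the cat sat on the mat and the dog sat on the rug".split()
synthetic = "taloissammekin taloissamme talossa talo taloja taloissa".split()

print(f"isolating-like corpus TTR: {type_token_ratio(isolating):.2f}")
print(f"morphologically rich corpus TTR: {type_token_ratio(synthetic):.2f}")
```

A mapping from such a diversity score to tagger parameters is what would let mWANN-Tagger be configured for a new language without per-corpus tuning.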

  2. A Quantitative bgl Operon Model for E. coli Requires BglF Conformational Change for Sugar Transport

    Science.gov (United States)

    Chopra, Paras; Bender, Andreas

    The bgl operon is responsible for the metabolism of β-glucoside sugars such as salicin or arbutin in E. coli. Its regulatory system involves both positive and negative feedback mechanisms and it can be assumed to be more complex than that of the more closely studied lac and trp operons. We have developed a quantitative model for the regulation of the bgl operon which is subject to in silico experiments investigating its behavior under different hypothetical conditions. Upon administration of 5 mM salicin as an inducer our model shows 80-fold induction, which compares well with the 60-fold induction measured experimentally. Under practical conditions 5-10 mM inducer is employed, which is in line with the minimum inducer concentration of 1 mM required by our model. The necessity of BglF conformational change for sugar transport has been hypothesized previously, and in line with those hypotheses our model shows only minor induction if conformational change is not allowed. Overall, this first quantitative model for the bgl operon gives reasonable predictions that are close to experimental results (where measured). It will be further refined as values of the parameters are determined experimentally. The model was developed in Systems Biology Markup Language (SBML) and it is available from the authors and from the Biomodels repository [www.ebi.ac.uk/biomodels].
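The positive-feedback structure described here (inducer uptake drives operon expression, which in turn increases uptake) can be caricatured with a two-term ODE integrated by Euler's method. All rate constants and functional forms below are invented for illustration; the actual SBML model is far more detailed:

```python
# Hypothetical positive-feedback sketch of bgl induction (not the SBML model).
def simulate(inducer_mM, steps=20000, dt=0.01):
    expr = 0.01          # operon expression level (arbitrary units)
    for _ in range(steps):
        transport = expr * inducer_mM / (inducer_mM + 1.0)   # saturable uptake
        activation = transport / (transport + 0.05)          # antiterminator on
        d_expr = 0.5 * (0.01 + activation) - 0.05 * expr     # synthesis - decay
        expr += dt * d_expr                                  # Euler step
    return expr

basal = simulate(0.0)      # no inducer: basal expression only
induced = simulate(5.0)    # 5 mM inducer: feedback loop engaged
print(f"fold induction at 5 mM salicin: {induced / basal:.0f}x")
```

Even this crude loop reproduces the qualitative behavior: with no transport (as when BglF conformational change is blocked), the feedback never engages and expression stays near the basal level.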

  3. Universe symmetries

    International Nuclear Information System (INIS)

    Souriau, J.M.

    1984-01-01

    The uniformity of the sky can be noticed by studying the distribution of sufficiently distant objects. The description of the sky's isotropy uses spatial rotations. Elements of group theory make it possible to give the word ''symmetry'' a meaning that is at once precise and general. Universe models are reviewed which must have both of the following qualities: conformity with the known laws of physics, and rigorous symmetry under one of the permitted groups. Each of the models predicts that the evolution of the universe obeys an evolution equation. Expansion and the big-bang theory are recalled. Is the universe an open or a closed space? The universe is also electrically neutral. This leads to a working hypothesis: the existing matter is not a given datum of the universe but appeared through evolution from nothing. The problem of matter and antimatter is then raised, together with its place in the universe [fr

  4. Quantitative analysis of CT brain images: a statistical model incorporating partial volume and beam hardening effects

    International Nuclear Information System (INIS)

    McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.

    1992-01-01

    The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure-tissue and partial-volume areas, with allowance for beam hardening, was developed. The average percentage error in the estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)

  5. Reducing Math Anxiety: Findings from Incorporating Service Learning into a Quantitative Reasoning Course at Seattle University

    Directory of Open Access Journals (Sweden)

    Allison Henrich

    2011-07-01

    Full Text Available How might one teach mathematics to math-anxious students and at the same time reduce their math anxiety? This paper describes what we found when we incorporated a service learning component into a quantitative reasoning course at Seattle University in Fall 2010 (20 students and Spring 2011 (28 students. The course is taken primarily by humanities majors, many of whom would not take a course in math if they didn’t need to satisfy the university’s core requirement. For the service learning component, each student met with and tutored children at local schools for 1-2 hours per week (total about 15 service hours, kept a weekly journal reflecting on the experience, and wrote a five-page final paper on the importance and reasonable expectations of mathematics literacy. The autobiographies, self-description at the beginning of the class, focus group interviews at the end of the term, journal entries, final essays, and student evaluations indicated that the students gained confidence in their mathematical abilities, a greater interest in mathematics, and a broader sense of the importance of math literacy in modern society. One notable finding was that students discovered that the act of manufacturing enthusiasm about math as a tool for tutoring the children made them more enthusiastic about math in their own courses.

  6. The Learning University.

    Science.gov (United States)

    Patterson, Glenys

    1999-01-01

    As universities make cross-sectoral alliances, various models for integrating postsecondary education into universities arise: contract, brokerage, collaborative, validation, joint program, dual-sector institution, tertiary university, metaphoric, and federal. The integrated, comprehensive university is the learning university of the 21st century.…

  7. A Tuned Value Chain Model for University Based Public Research Organisation. Case Lut Cst.

    OpenAIRE

    Vesa Karvonen; Matti Karvonen; Andrzej Kraslawski

    2012-01-01

    Porter's value chain model was introduced for strategic business purposes. During the last decades, universities and university-based institutes have also started to adopt practices similar to private business concepts. A university-based institute is not an independent actor like a company, but there are interest groups that expect it to act as if it were. This article discusses the possibility of applying a tuned value chain to public research organisations (PROs). Also the interact...

  9. QUALITY IMPROVEMENT MODEL OF NURSING EDUCATION IN MUHAMMADIYAH UNIVERSITIES TOWARD COMPETITIVE ADVANTAGE

    Directory of Open Access Journals (Sweden)

    Abdul Aziz Alimul Hidayat

    2017-06-01

    Full Text Available Introduction: the quality of most (90.6%) nursing education programmes in East Java was still low (BAN-PT, 2012). This was because the quality improvement process in nursing education was generally conducted partially (random performance improvement). A possible solution was to identify a proper quality improvement model for nursing education oriented toward competitive advantage. Method: this research used a survey to gain the data. The research sample was 16 Muhammadiyah Universities chosen using simple random sampling. The data were collected with questionnaires of 174 questions and a documentation study. The data analysis used the Partial Least Square (PLS) analysis technique. Result: the profile of the nursing education departments of Muhammadiyah Universities in Indonesia showed roughly 10 years since establishment, accreditation level B, and on average more than three competing universities in the same city/regency. Based on the analysis of the quality improvement model of nursing education toward competitive advantage in Muhammadiyah Universities, quality was directly affected by the focus on learning and the operational process through the improvement of human resources management; the information system also had a direct effect on quality improvement, as well as on the quality process components: leadership, human resources, focus on learning and operational process. Improving human resources was directly influenced by proper strategic planning, and strategic planning was directly influenced by leadership. Thus, to improve the quality of nursing education, the leadership role of the department, a proper information system, and the improvement of human resources management must be implemented. Conclusion: the quality improvement model in nursing education was directly determined by learning and the operational process through human resources management, along with the information system, strategic planning factors, and leadership.
The research finding could be developed in quality

  10. A universal calculation model for the controlled electric transmission line

    International Nuclear Information System (INIS)

    Zivzivadze, O.; Zivzivadze, L.

    2009-01-01

    Difficulties associated with the development of calculation models are analyzed, and ways of resolving these problems are given. A version of the equivalent circuit as a six-pole network, whose parameters do not depend on the angle of shift Θ between the voltage vectors of the circuits, is offered. The interrelation between the parameters of the equivalent circuit and the transmission constants of the line was determined. A universal calculation model for the controlled electric transmission line was elaborated. The model allows calculating the stationary modes of lines of this class at any angle of shift Θ between the circuits. (author)

  11. Ethics Literacy and "Ethics University": Two Intertwined Models for Public Involvement and Empowerment in Bioethics.

    Science.gov (United States)

    Strech, Daniel; Hirschberg, Irene; Meyer, Antje; Baum, Annika; Hainz, Tobias; Neitzke, Gerald; Seidel, Gabriele; Dierks, Marie-Luise

    2015-01-01

    Informing lay citizens about complex health-related issues and their related ethical, legal, and social aspects (ELSA) is one important component of democratic health care/research governance. Public information activities may be especially valuable when they are used in multi-staged processes that also include elements of information and deliberation. This paper presents a new model for a public involvement activity on ELSA (Ethics University) and evaluation data for a pilot event. The Ethics University is structurally based on the "patient university," an already established institution in some German medical schools, and the newly developed concept of "ethics literacy." The concept of "ethics literacy" consists of three levels: information, interaction, and reflection. The pilot project consisted of two series of events (lasting 4 days each). The thematic focus of the Ethics University pilot was the ELSA of regenerative medicine. In this pilot, the concept of "ethics literacy" could be validated, as its components were clearly visible in discussions with participants at the end of the event. The participants reacted favorably to the Ethics University, stating that they felt more educated with regard to the ELSA of regenerative medicine and with regard to their own abilities in normative reasoning on this topic. The Ethics University is an innovative model for public involvement and empowerment activities on ELSA, theoretically underpinned by a concept of "ethics literacy." This model deserves further refinement, testing on other ELSA topics, and evaluation in outcome research.

  12. Interacting new agegraphic tachyon, K-essence and dilaton scalar field models of dark energy in non-flat universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K., E-mail: KKarami@uok.ac.i [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), Maragha (Iran, Islamic Republic of); Khaledian, M.S.; Felegary, F.; Azarmi, Z. [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of)

    2010-03-29

    We study the correspondence between the tachyon, K-essence and dilaton scalar field models with the interacting new agegraphic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe.

  13. Interacting new agegraphic tachyon, K-essence and dilaton scalar field models of dark energy in non-flat universe

    International Nuclear Information System (INIS)

    Karami, K.; Khaledian, M.S.; Felegary, F.; Azarmi, Z.

    2010-01-01

    We study the correspondence between the tachyon, K-essence and dilaton scalar field models with the interacting new agegraphic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe.

  14. Impact of a Novel, Anti-microbial Dressing on In Vivo, Pseudomonas aeruginosa Wound Biofilm: Quantitative Comparative Analysis using a Rabbit Ear Model

    Science.gov (United States)

    2014-12-01

    Only fragments of the abstract are recoverable from the report documentation page: "… therapies such as debridement, lavage, and antimicrobials, but with little evidence that they improve chronic wound healing in a quantitative and …"; and, from the methods: "Bacterial strains and culture: Wild-type strains of P. aeruginosa (obtained from the laboratory of Dr. Barbara H. Iglewski, University of …"

  15. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    International Nuclear Information System (INIS)

    Qiu, Zeyang; Liang, Wei; Lin, Yang; Zhang, Meng; Wang, Xue

    2017-01-01

    As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage in order to reduce the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third party damage are identified exactly; then the weights of the factors are determined via an improved AHP; finally, the importance of each factor is calculated by the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor. (paper)
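The AHP-plus-FCE pipeline the abstract describes can be sketched in outline. This is a generic implementation of the two standard techniques, not the paper's improved AHP; the random-index table is Saaty's, and the membership matrix in the usage below is illustrative.

```python
import numpy as np

def ahp_weights(m):
    """Priority weights from an AHP pairwise-comparison matrix via its
    principal eigenvector, plus the consistency ratio (CR)."""
    m = np.asarray(m, dtype=float)
    n = m.shape[0]
    vals, vecs = np.linalg.eig(m)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()
    ci = (vals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # random index (Saaty's table)
    return w, ci / ri

def fuzzy_evaluate(w, r):
    """Fuzzy comprehensive evaluation: weight vector times membership
    matrix r (factors x rating grades), normalized."""
    b = np.asarray(w) @ np.asarray(r, dtype=float)
    return b / b.sum()
```

A CR below 0.1 is the usual threshold for accepting the pairwise judgments as consistent.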

  16. Inservice trainings for Shiraz University of Medical Sciences employees: effectiveness assessment by using the CIPP model

    Directory of Open Access Journals (Sweden)

    MARYAM MOKHTARZADEGAN

    2015-04-01

    Full Text Available Introduction: Nowadays, employees' inservice training has become one of the core components in the survival and success of any organization. Unfortunately, despite the importance of training evaluation, only a small portion of resources is allocated to this matter. Among many evaluation models, the CIPP (Context, Input, Process, Product) model is a very useful approach to educational evaluation. So far, the evaluation of training courses has mostly provided information for learners, but this investigation aims at evaluating the effectiveness of the experts' training programs and identifying their pros and cons based on the 4 stages of the CIPP model. Method: In this descriptive analytical study, done in 2013, 250 employees of Shiraz University of Medical Sciences (SUMS) who had participated in inservice training courses were randomly selected. The evaluated variables were designed using the CIPP model, and a researcher-made questionnaire was used for data collection; the questionnaire was validated using expert opinion and its reliability was confirmed by Cronbach's alpha (0.89). Quantitative data were analyzed using SPSS 14, and statistical tests were done as needed. Results: In the context phase, the mean score was highest for solving work problems (4.07±0.88) and lowest for focusing on learners' learning styles in training courses (2.68±0.91). There was a statistically significant difference between the employees' education level and the product phase evaluation (p0.001), in contrast with the process and product phases, which showed a significant difference (p<0.001). Conclusion: Considering our results, although the inservice training given to SUMS employees has been effective in many ways, it has some weaknesses as well. Therefore, improving these weaknesses and reinforcing the strong points within the fields identified in this study should be taken into account by decision makers and administrators.

  17. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    Science.gov (United States)

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
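The kind of integral the authors describe, mapping latent-scale parameters to the observed scale, can be illustrated for a binary trait with a logit link. This is a numerical sketch of the general idea; QGglmm itself is an R package and this is not its code, and the parameter values in the usage below are illustrative.

```python
import numpy as np

def obs_scale_mean(mu, var, n_grid=4001, width=8.0):
    """Observation-scale population mean for a binary trait under a
    logit-link GLMM: E[logistic(l)] with latent l ~ N(mu, var),
    computed by trapezoidal integration over the latent distribution."""
    sd = np.sqrt(var)
    l = np.linspace(mu - width * sd, mu + width * sd, n_grid)
    phi = np.exp(-0.5 * ((l - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))
    p = 1.0 / (1.0 + np.exp(-l))             # inverse link function
    f = p * phi
    dl = l[1] - l[0]
    return float(np.sum((f[1:] + f[:-1]) * 0.5 * dl))
```

Note that the data-scale mean is pulled toward 0.5 relative to the inverse link of the latent mean, which is exactly why latent-scale and observed-scale parameters differ.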

  18. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. To keep pace with the chapters, you should be at an intermediate level in quantitative finance and have a reasonable knowledge of R.

  19. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and estimation of the nondisplaceable ligand concentration, are also reviewed briefly
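As an illustration of the kinetic methods mentioned above, the one-tissue compartment model, the simplest kinetic model used in tracer quantitation, can be integrated numerically. This is a generic sketch, not from the text; the rate constants and input function below are illustrative.

```python
def one_tissue_model(k1_uptake, k2_clear, cp, dt, t_end):
    """Euler integration of the one-tissue compartment model
    dCt/dt = K1*Cp(t) - k2*Ct, where Cp(t) is the plasma tracer
    concentration and Ct the tissue concentration."""
    ct = 0.0
    curve = []
    steps = int(t_end / dt)
    for i in range(steps):
        t = i * dt
        ct += dt * (k1_uptake * cp(t) - k2_clear * ct)
        curve.append(ct)
    return curve
```

With a constant plasma concentration, the tissue curve approaches the equilibrium value (K1/k2)·Cp, which is the quantity exploited by equilibrium methods.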

  20. Universal asymptotics in hyperbolicity breakdown

    International Nuclear Information System (INIS)

    Bjerklöv, Kristian; Saprykina, Maria

    2008-01-01

    We study a scenario for the disappearance of hyperbolicity of invariant tori in a class of quasi-periodic systems. In this scenario, the system loses hyperbolicity because two invariant directions come close to each other, losing their regularity. In a recent paper, based on numerical results, Haro and de la Llave (2006 Chaos 16 013120) discovered a quantitative universality in this scenario, namely, that the minimal angle between the two invariant directions has a power law dependence on the parameters and the exponents of the power law are universal. We present an analytic proof of this result

  1. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. Quantitative measurement by the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The kinetic model of a mapping tracer for central nervous receptors is discussed. Quantitative analysis using a steady-state kinetic model, the measurement of dopamine receptors by the reference method, the measurement of D 2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  2. Systematic Analysis of Quantitative Logic Model Ensembles Predicts Drug Combination Effects on Cell Signaling Networks

    Science.gov (United States)

    2016-08-27

    Only fragments of the abstract are recoverable: "… bovine serum albumin (BSA) diluted to the amount corresponding to that in the media of the stimulated cells. Phospho-JNK comprises two isoforms whose …" Supplementary information accompanies the paper on the CPT: Pharmacometrics & Systems Pharmacology website (http://www.wileyonlinelibrary.com/psp4).

  3. THE C.A.N.O.A. MODEL - A POSSIBLE IMPLEMENTATION IN ROMANIAN UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    Elena HLACIUC

    2017-06-01

    Full Text Available Globalisation, in addition to its many effects in all areas, creates, in higher education, fierce competition between universities worldwide. This competition requires, as an essential element, that in addition to the services they offer, universities also develop tools to reveal their costs. Academic and financial performance are the two measures of the management of a university. Accounting supports the management of a university through its three facets, which together form the institution's accounting information system: budget implementation accounting, financial accounting, and management accounting. However, while budget implementation and financial accounting are well represented in Romania, the same cannot be said about management accounting. In this paper we analyse a possible application of management accounting in Romanian universities, using the C.A.N.O.A. model, a method that is currently used in Spain.

  4. Description of the University of Auckland Global Mars Mesoscale Meteorological Model (GM4)

    Science.gov (United States)

    Wing, D. R.; Austin, G. L.

    2005-08-01

    The University of Auckland Global Mars Mesoscale Meteorological Model (GM4) is a numerical weather prediction model of the Martian atmosphere that has been developed through the conversion of the Penn State University / National Center for Atmospheric Research fifth generation mesoscale model (MM5). The global aspect of this model is self-consistent, overlapping, and forms a continuous domain around the entire planet, removing the need to provide boundary conditions other than at initialisation and yielding independence from the constraint of a Mars general circulation model. A brief overview of the model is given, outlining the key physical processes and the setup of the model. Comparisons between data collected by Mars Pathfinder during its 1997 mission and conditions simulated using GM4 have been performed. The diurnal temperature variation predicted by the model shows very good correspondence with the ground-truth data, to within 5 K for the majority of the diurnal cycle. Mars Viking data are also compared with the model, with good agreement. As a further means of validation for the model, various seasonal comparisons of surface and vertical atmospheric structure are conducted with the European Space Agency AOPP/LMD Mars Climate Database. Selected simulations over regions of interest will also be presented.

  5. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
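The multivariate test statistic used above can be sketched as follows. For simplicity, this sketch computes a permutation p-value instead of the approximate F distributions derived in the paper, and the simulated genotypes and traits are illustrative.

```python
import numpy as np

def pillai_trace(x, y):
    """Pillai-Bartlett trace for multivariate regression of traits y
    (n x q) on genetic variants x (n x p)."""
    xc = x - x.mean(axis=0)
    yc = y - y.mean(axis=0)
    beta, *_ = np.linalg.lstsq(xc, yc, rcond=None)
    yhat = xc @ beta
    h = yhat.T @ yhat                        # hypothesis SSCP matrix
    e = (yc - yhat).T @ (yc - yhat)          # error SSCP matrix
    return float(np.trace(h @ np.linalg.inv(h + e)))

def permutation_pvalue(x, y, n_perm=199, seed=0):
    """Permutation p-value for multi-trait association: shuffling the
    rows of y breaks any genotype-phenotype link."""
    rng = np.random.default_rng(seed)
    observed = pillai_trace(x, y)
    hits = sum(pillai_trace(x, y[rng.permutation(len(y))]) >= observed
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)
```

Testing all traits against all variants in one statistic is what gives the joint analysis its power advantage over per-trait univariate tests.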

  6. A discrete stress-strength interference model based on universal generating function

    International Nuclear Information System (INIS)

    An Zongwen; Huang Hongzhong; Liu Yu

    2008-01-01

    Continuous stress-strength interference (SSI) model regards stress and strength as continuous random variables with known probability density function. This, to some extent, results in a limitation of its application. In this paper, stress and strength are treated as discrete random variables, and a discrete SSI model is presented by using the universal generating function (UGF) method. Finally, case studies demonstrate the validity of the discrete model in a variety of circumstances, in which stress and strength can be represented by continuous random variables, discrete random variables, or two groups of experimental data
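The UGF composition for the discrete SSI model reduces to a term-by-term product over the two discrete distributions. A minimal sketch, with illustrative probability mass functions:

```python
from itertools import product

def interference_reliability(strength_pmf, stress_pmf):
    """Reliability R = P(strength > stress) for discrete stress and
    strength: compose the two u-functions (value -> probability
    dictionaries) term by term with the indicator strength > stress,
    as in the UGF method."""
    return sum(p_s * p_t
               for (s, p_s), (t, p_t) in product(strength_pmf.items(),
                                                 stress_pmf.items())
               if s > t)
```

Experimental data enter the same way: each observed value contributes a term with its empirical frequency as the probability.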

  7. Quantitative histological models suggest endothermy in plesiosaurs

    Directory of Open Access Journals (Sweden)

    Corinna V. Fleischle

    2018-06-01

    Full Text Available Background Plesiosaurs are marine reptiles that arose in the Late Triassic and survived to the Late Cretaceous. They have a unique and uniform bauplan and are known for their very long neck and hydrofoil-like flippers. Plesiosaurs are among the most successful vertebrate clades in Earth's history. Based on bone mass decrease and cosmopolitan distribution, both of which affect lifestyle, indications of parental care, and oxygen isotope analyses, evidence for endothermy in plesiosaurs has accumulated. Recent bone histological investigations also provide evidence of fast growth and elevated metabolic rates. However, quantitative estimations of metabolic rates and bone growth rates in plesiosaurs have not been attempted before. Methods Phylogenetic eigenvector maps is a method for estimating trait values from a predictor variable while taking into account phylogenetic relationships. As predictor variable, this study employs vascular density, measured in bone histological sections of fossil eosauropterygians and extant comparative taxa. We quantified vascular density as primary osteon density, thus the proportion of vascular area (including lamellar infillings of primary osteons) to total bone area. Our response variables are bone growth rate (expressed as local bone apposition rate) and resting metabolic rate (RMR). Results Our models reveal bone growth rates and RMRs for plesiosaurs that are in the range of birds, suggesting that plesiosaurs were endothermic. Even for basal eosauropterygians we estimate values in the range of mammals or higher. Discussion Our models are influenced by the availability of comparative data, which are lacking for large marine amniotes, potentially skewing our results. However, our statistically robust inference of fast growth and fast metabolism is in accordance with other evidence for plesiosaurian endothermy. Endothermy may explain the success of plesiosaurs, including their survival of the end-Triassic extinction.
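A much-simplified, PVR-style sketch of eigenvector-based phylogenetic prediction follows (the actual phylogenetic eigenvector maps method uses a weighted basis derived from the phylogeny; the distance matrix, vascular densities and trait values below are illustrative, not the study's data):

```python
import numpy as np

def phylo_eigenvectors(dist, k):
    """Leading principal-coordinate eigenvectors of a phylogenetic
    distance matrix (a simplified stand-in for the PEM basis)."""
    dist = np.asarray(dist, dtype=float)
    n = len(dist)
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (dist ** 2) @ j           # Gower double-centering
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:k]]

def fit_trait(dist, predictor, trait, k=2):
    """OLS of a trait (e.g. RMR) on a predictor (e.g. primary osteon
    density) plus k phylogenetic eigenvectors."""
    v = phylo_eigenvectors(dist, k)
    x = np.column_stack([np.ones(len(trait)), predictor, v])
    beta, *_ = np.linalg.lstsq(x, trait, rcond=None)
    return x @ beta
```

Fitting the model on taxa with known trait values and projecting onto fossils with only the predictor measured is the basic logic of the inference.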

  8. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  9. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  10. Education services quality of Kashan Medical Science University, based on SERVQUAL model in viewpoints of students

    Directory of Open Access Journals (Sweden)

    Ebrahim Kouchaki

    2017-01-01

    Full Text Available Introduction: Sustainable development of higher educational systems, as dynamic systems, requires coherent, moderate growth in both qualitative and quantitative dimensions. Since students are the major clients of higher education systems and their perspectives can play a key role in the promotion of service quality, this study was conducted based on the SERVQUAL model, aiming at the assessment of educational services quality in Kashan Medical Science University in 2016. Study Methodology: A total of 212 students of Kashan Medical Science University were selected from a population of 616 subjects through random sampling, using Morgan tables, for this descriptive-analytical research. The data collection tool was the standard SERVQUAL questionnaire, comprising a section of basic information and 28 items rated on a six-option Likert scale, measuring the current and the desired (expected) conditions of service quality. The difference between the averages of the current and desired statuses was measured as the service gap. Descriptive and inferential statistics were used to analyze the obtained data. Results: The students' mean age was 23 ± 1.8 years; 65% (138 subjects) were female and 35% (74 subjects) were male. About 72% (153 subjects) were single and 28% (59 subjects) were married. The obtained results revealed a negative gap in all dimensions of quality. The results also showed that the minimum gap was obtained for learning assist tools (physical/tangibility dimension), with an amount of −0.38, and the maximum gap for guide instructor availability when needed by the students (accountability dimension), with an amount of −2.42. The total means of the students' perceptions and expectations were 2.28 and 3.85, respectively. Conclusion: Respecting the negative gap obtained for all dimensions of educational services quality and the failure to meet the students' expectations, it is recommended to assign further resources
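The SERVQUAL gap computation itself is simple: for each quality dimension, the gap is the mean perception score minus the mean expectation score. A minimal sketch; the dimension names and scores below are illustrative, not the study's data.

```python
def servqual_gaps(perceptions, expectations):
    """SERVQUAL gap per dimension: mean perception score minus mean
    expectation score. A negative gap means expectations are unmet."""
    return {dim: sum(perceptions[dim]) / len(perceptions[dim])
                 - sum(expectations[dim]) / len(expectations[dim])
            for dim in perceptions}
```

Ranking the dimensions by gap, as in the study's −0.38 to −2.42 range, points to where resources are most needed.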

  11. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
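The scenario quantification that such a tool performs can be sketched as the product of an initiating-event frequency and the branch probabilities along each event-sequence path, followed by ranking. This is a generic sketch; the scenario names and numbers are illustrative, not QRAS's own format.

```python
def scenario_frequencies(init_freq, scenarios):
    """Quantify an event sequence diagram: each scenario's frequency is
    the initiating-event frequency times the product of the branch
    probabilities along its path; results are ranked by frequency."""
    freqs = {}
    for name, branch_probs in scenarios.items():
        f = init_freq
        for p in branch_probs:
            f *= p
        freqs[name] = f
    return dict(sorted(freqs.items(), key=lambda kv: kv[1], reverse=True))
```

Because the scenarios partition the outcomes of the initiating event, their frequencies must sum back to the initiating-event frequency, which is a useful sanity check on the branch probabilities.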

  12. Comparative assessment of university chemistry undergraduate ...

    African Journals Online (AJOL)

    A comparative analysis of the structure of the undergraduate chemistry curricula of universities in southwest Nigeria was carried out, with a view to establishing the relative proportion of the different areas of chemistry each curriculum accommodates. It is a qualitative research study, involving content analysis with a partial quantitative analysis ...

  13. Runaway universe

    Energy Technology Data Exchange (ETDEWEB)

    Davies, P

    1978-01-01

    The subject is covered in chapters entitled: the emerging universe (general introduction, history of astronomical and cosmological research, origins, the expanding universe, stars, galaxies, electromagnetic radiation); primeval fire (the big bang model, origin of the elements, properties of the elements and of sub-atomic particles); order out of chaos (galactic evolution, star formation, nuclear fusion, the solar system, origin of life on Earth); a star called Sol (properties of the sun and of other stars); life in the universe; the catastrophe principle (the rise and fall of cosmic order); stardoom (star evolution, neutron stars); black holes and superholes (gravitational collapse); technology and survival; the dying universe (second law of thermodynamics); worlds without end (cosmological models).

  14. THE NON-UNIVERSALITY OF THE LOW-MASS END OF THE IMF IS ROBUST AGAINST THE CHOICE OF SSP MODEL

    International Nuclear Information System (INIS)

    Spiniello, C.; Trager, S. C.; Koopmans, L. V. E.

    2015-01-01

    We perform a direct comparison of two state-of-the-art single stellar population (SSP) models that have been used to demonstrate the non-universality of the low-mass end of the initial mass function (IMF) slope. The two public versions of the SSP models are restricted to either solar abundance patterns or solar metallicity, too restrictive if one aims to disentangle elemental enhancements, metallicity changes, and IMF variations in massive early-type galaxies (ETGs) with star formation histories different from those in the solar neighborhood. We define response functions (to metallicity and α-abundance) to extend the parameter space for each set of models. We compare these extended models with a sample of Sloan Digital Sky Survey (SDSS) ETG spectra with varying velocity dispersions. We measure equivalent widths of optical IMF-sensitive stellar features to examine the effect of the underlying model assumptions and ingredients, such as stellar libraries or isochrones, on the inference of the IMF slope down to ∼0.1 M_⊙. We demonstrate that the steepening of the low-mass end of the IMF based on a non-degenerate set of spectroscopic optical indicators is robust against the choice of the stellar population model. Although the models agree in a relative sense (i.e., both imply more bottom-heavy IMFs for more massive systems), we find non-negligible differences in the absolute values of the IMF slope inferred at each velocity dispersion by using the two different models. In particular, we find large inconsistencies in the quantitative predictions of the IMF slope variations and abundance patterns when sodium lines are used. We investigate the possible reasons for these inconsistencies.

  15. THE NON-UNIVERSALITY OF THE LOW-MASS END OF THE IMF IS ROBUST AGAINST THE CHOICE OF SSP MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Spiniello, C. [Max-Planck Institute for Astrophysics, Karl-Schwarzschild-Strasse 1, D-8l740 Garching (Germany); Trager, S. C.; Koopmans, L. V. E. [Kapteyn Astronomical Institute, University of Groningen, P.O. Box 800, 9700 AV Groningen (Netherlands)

    2015-04-20

    We perform a direct comparison of two state-of-the-art single stellar population (SSP) models that have been used to demonstrate the non-universality of the low-mass end of the initial mass function (IMF) slope. The two public versions of the SSP models are restricted to either solar abundance patterns or solar metallicity, too restrictive if one aims to disentangle elemental enhancements, metallicity changes, and IMF variations in massive early-type galaxies (ETGs) with star formation histories different from those in the solar neighborhood. We define response functions (to metallicity and α-abundance) to extend the parameter space for each set of models. We compare these extended models with a sample of Sloan Digital Sky Survey (SDSS) ETG spectra with varying velocity dispersions. We measure equivalent widths of optical IMF-sensitive stellar features to examine the effect of the underlying model assumptions and ingredients, such as stellar libraries or isochrones, on the inference of the IMF slope down to ∼0.1 M_⊙. We demonstrate that the steepening of the low-mass end of the IMF based on a non-degenerate set of spectroscopic optical indicators is robust against the choice of the stellar population model. Although the models agree in a relative sense (i.e., both imply more bottom-heavy IMFs for more massive systems), we find non-negligible differences in the absolute values of the IMF slope inferred at each velocity dispersion by using the two different models. In particular, we find large inconsistencies in the quantitative predictions of the IMF slope variations and abundance patterns when sodium lines are used. We investigate the possible reasons for these inconsistencies.

  16. Model of Organizational Structure for University Institutes Binding with the Venezuelan Socioeconomic Reality

    Directory of Open Access Journals (Sweden)

    Rafael Pertuz Belloso

    2014-01-01

    Full Text Available The present study proposes a model of organizational structure for university institutes bound to the Venezuelan socioeconomic reality. This is a descriptive, non-experimental, cross-sectional research study. The study population included 746 professors and administrative staff from the Cabimas and Maracaibo Technological Universities. Data were collected using a questionnaire consisting of 54 items and analyzed using percentage frequency distributions. The results indicate that the sub-systems of the studied institutions are not integrated, that bureaucratic structural typologies coexist, and that implementation of the nation’s plans is clearly decontextualized, showing low relevance and weak linkage to the Venezuelan socioeconomic reality. To remedy this situation, a mixed departmental/matrix organizational structure model was designed that integrates the department into a matrix network linking teaching, research, and social action projects. The implementation of this model was proposed in three stages or phases in order to achieve the operational characteristics of the departmental model.

  17. Anisotropic, nonsingular early universe model leading to a realistic cosmology

    International Nuclear Information System (INIS)

    Dechant, Pierre-Philippe; Lasenby, Anthony N.; Hobson, Michael P.

    2009-01-01

    We present a novel cosmological model in which scalar field matter in a biaxial Bianchi IX geometry leads to a nonsingular 'pancaking' solution: the hypersurface volume goes to zero instantaneously at the 'big bang', but all physical quantities, such as curvature invariants and the matter energy density remain finite, and continue smoothly through the big bang. We demonstrate that there exist geodesics extending through the big bang, but that there are also incomplete geodesics that spiral infinitely around a topologically closed spatial dimension at the big bang, rendering it, at worst, a quasiregular singularity. The model is thus reminiscent of the Taub-NUT vacuum solution in that it has biaxial Bianchi IX geometry and its evolution exhibits a dimensionality reduction at a quasiregular singularity; the two models are, however, rather different, as we will show in a future work. Here we concentrate on the cosmological implications of our model and show how the scalar field drives both isotropization and inflation, thus raising the question of whether structure on the largest scales was laid down at a time when the universe was still oblate (as also suggested by [T. S. Pereira, C. Pitrou, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 9 (2007) 6.][C. Pitrou, T. S. Pereira, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 4 (2008) 4.][A. Guemruekcueoglu, C. Contaldi, and M. Peloso, J. Cosmol. Astropart. Phys. 11 (2007) 005.]). We also discuss the stability of our model to small perturbations around biaxiality and draw an analogy with cosmological perturbations. We conclude by presenting a separate, bouncing solution, which generalizes the known bouncing solution in closed FRW universes.

  18. Uniform relativistic universe models with pressure. Part 2. Observational tests

    International Nuclear Information System (INIS)

    Krempec, J.; Krygier, B.

    1977-01-01

    The magnitude-redshift and angular diameter-redshift relations are discussed for uniform (homogeneous and isotropic) relativistic Universe models with pressure. The inclusion of pressure in the energy-momentum tensor gives larger values of the deceleration parameter q. This increase of the deceleration parameter leads to brighter objects as well as slightly larger angular diameters. (author)

  19. Quantitative structure-activation barrier relationship modeling for Diels-Alder ligations utilizing quantum chemical structural descriptors.

    Science.gov (United States)

    Nandi, Sisir; Monesi, Alessandro; Drgan, Viktor; Merzel, Franci; Novič, Marjana

    2013-10-30

    In the present study, we show the correlation of quantum chemical structural descriptors with the activation barriers of Diels-Alder ligations. A set of 72 non-catalysed Diels-Alder reactions was subjected to quantitative structure-activation barrier relationship (QSABR) modeling under the framework of theoretical quantum chemical descriptors calculated solely from the structures of the diene and dienophile reactants. Experimental activation barrier data were obtained from the literature. Descriptors were computed using Hartree-Fock theory with the 6-31G(d) basis set as implemented in the Gaussian 09 software. Variable selection and model development were carried out by stepwise multiple linear regression. Predictive performance of the QSABR model was assessed using a training and test set and by calculating leave-one-out cross-validated Q2 and predictive R2 values. The QSABR model can explain and predict 86.5% and 80%, respectively, of the variance in the activation energy barrier data. Alternatively, a neural network model based on back propagation of errors was developed to assess the nonlinearity of the sought correlations between theoretical descriptors and experimental reaction barriers. A reasonable predictability for the activation barriers of the test set reactions was obtained, which enabled an exploration and interpretation of the significant variables responsible for the Diels-Alder interaction between dienes and dienophiles. Thus, studies in the direction of QSABR modelling that provide efficient and fast prediction of activation barriers of Diels-Alder reactions turn out to be a meaningful alternative to transition state theory based computation.
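The leave-one-out cross-validated Q² quoted above can be sketched numerically. The following is an illustrative stand-in only: synthetic descriptors and barriers, and plain ordinary least squares rather than the paper's stepwise MLR.

```python
# Hypothetical sketch: leave-one-out cross-validated Q^2 for a linear model,
# as used to validate QSABR models (illustrative synthetic data).
import numpy as np

def loo_q2(X, y):
    """Q^2 = 1 - PRESS / total sum of squares, via leave-one-out refits."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # Ordinary least squares on the remaining n-1 samples
        coef, *_ = np.linalg.lstsq(
            np.c_[np.ones(mask.sum()), X[mask]], y[mask], rcond=None)
        pred = coef[0] + X[i] @ coef[1:]
        press += (y[i] - pred) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))            # three quantum chemical descriptors
y = X @ np.array([1.5, -2.0, 0.7]) \
    + rng.normal(scale=0.1, size=20)    # synthetic activation barriers
print(round(loo_q2(X, y), 3))
```

A Q² close to 1 indicates that each held-out barrier is predicted well by a model fitted to the remaining reactions.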

  20. Universal and blocking primer mismatches limit the use of high-throughput DNA sequencing for the quantitative metabarcoding of arthropods.

    Science.gov (United States)

    Piñol, J; Mir, G; Gomez-Polo, P; Agustí, N

    2015-07-01

    The quantification of the biological diversity in environmental samples using high-throughput DNA sequencing is hindered by the PCR bias caused by variable primer-template mismatches of the individual species. In some dietary studies, there is the added problem that samples are enriched with predator DNA, so a predator-specific blocking oligonucleotide is often used to alleviate the problem. However, specific blocking oligonucleotides could co-block nontarget species to some degree. Here, we accurately estimate the extent of the PCR biases induced by universal and blocking primers on a mock community prepared with DNA of twelve species of terrestrial arthropods. We also compare universal and blocking primer biases with those induced by variable annealing temperature and number of PCR cycles. The results show that reads of all species were recovered after PCR enrichment at our control conditions (no blocking oligonucleotide, 45 °C annealing temperature and 40 cycles) and high-throughput sequencing. They also show that all four factors considered biased the final proportions of the species to some degree. Among these factors, the number of primer-template mismatches of each species had a disproportionate effect (up to five orders of magnitude) on the amplification efficiency. In particular, the number of primer-template mismatches explained most of the variation (~3/4) in the amplification efficiency of the species. The effect of blocking oligonucleotide concentration on nontarget species relative abundance was also significant, but less important (below one order of magnitude). Considering the results reported here, the quantitative potential of the technique is limited, and only qualitative results (the species list) are reliable, at least when targeting the barcoding COI region. © 2014 John Wiley & Sons Ltd.
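The disproportionate effect of primer-template mismatches follows from the exponential nature of PCR: a modest drop in per-cycle efficiency compounds over 40 cycles. A minimal sketch (the two efficiencies are invented for illustration, not the paper's estimates):

```python
# Toy model of PCR amplification bias: copies grow as N = N0 * (1 + e)^cycles,
# so small per-cycle efficiency differences compound into orders of magnitude.
def final_copies(initial, efficiency, cycles=40):
    """Copy number after `cycles` rounds of PCR at per-cycle efficiency e."""
    return initial * (1.0 + efficiency) ** cycles

# Two species equally abundant before PCR; primer-template mismatches lower
# one species' assumed efficiency from 0.95 to 0.60 per cycle.
a = final_copies(1.0, 0.95)
b = final_copies(1.0, 0.60)
print(f"fold bias after 40 cycles: {a / b:.1e}")
```

Even this moderate efficiency gap yields a roughly thousand-fold distortion of the final read proportions, consistent in spirit with the multi-order-of-magnitude biases reported above.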

  1. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    Science.gov (United States)

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C(2) have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C(2) chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C(2) in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C(3) and C(5) fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assist the further understanding of the role of C(2) and of its potential chemical interdependences with CH and other small radicals.

  2. New estimates on various critical/universal quantities of the 3d Ising model

    International Nuclear Information System (INIS)

    Hasenbusch, M.

    1998-01-01

    We present estimates for the 3D Ising model on the cubic lattice, regarding both interface and bulk properties. We have results for the interface tension, in particular the amplitude σ₀ in the critical law σ = σ₀ t^μ, and for the universal combination R₋ = σξ². Concerning the bulk properties, we estimate the specific heat universal amplitude ratio A₊/A₋, together with the exponent α, and the nonsingular background of the energy and specific heat at criticality, together with the exponent ν. There are also results for the universal combination f_s ξ³, where f_s is the singular part of the free energy. (orig.)

  3. The New Leadership Model of University Management for Innovation and Entrepreneurship

    Science.gov (United States)

    Sart, Gamze

    2014-01-01

    Problem Statement: Today's ever-changing educational environment has created a need for new leadership styles that encourage positive change and improvement. In Turkish universities, the most commonly used leadership models are the classic and/or traditional ones, which lead to stagnation in innovation and entrepreneurship. Only a limited number…

  4. Can a matter-dominated model with constant bulk viscosity drive the accelerated expansion of the universe?

    International Nuclear Information System (INIS)

    Avelino, Arturo; Nucamendi, Ulises

    2009-01-01

    We test a cosmological model in which the only component is a pressureless fluid with constant bulk viscosity as an explanation for the present accelerated expansion of the universe. We classify all the possible scenarios for the universe predicted by the model according to their past, present and future evolution, and we test its viability by performing a Bayesian statistical analysis using the SCP "Union" data set (307 SNe Ia), imposing the second law of thermodynamics on the dimensionless constant bulk viscous coefficient ζ̃ and comparing the age of the universe predicted by the model with the constraints coming from the oldest globular clusters. The best estimated values found for ζ̃ and the Hubble constant H₀ are ζ̃ = 1.922 ± 0.089 and H₀ = 69.62 ± 0.59 (km/s)/Mpc, with χ²_min = 314 (χ²/d.o.f. = 1.031). The age of the universe is found to be 14.95 ± 0.42 Gyr. The estimated values of H₀ and χ²/d.o.f. are very similar to those obtained from the ΛCDM model using the same SNe Ia data set. The estimated age of the universe is in agreement with the constraints coming from the oldest globular clusters. Moreover, the estimated value of ζ̃ is positive, in agreement with the second law of thermodynamics (SLT). On the other hand, we perform different forms of marginalization over the parameter H₀ in order to study the sensitivity of the results to how H₀ is marginalized. We find that the dependence of the best estimated values of the model's free parameters on how H₀ is marginalized is almost negligible. Therefore, this simple model might be a viable candidate to explain the present acceleration in the expansion of the universe.
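The χ² statistics quoted above come from comparing model distance moduli to supernova data. The toy sketch below minimizes a one-parameter χ² on synthetic distance-modulus data and reports the reduced χ²; the data, error bars, and stand-in model are assumptions for illustration, not the authors' analysis.

```python
# Toy one-parameter chi-square fit, in the spirit of fitting a cosmological
# model to SNe Ia distance moduli (all numbers synthetic).
import numpy as np

def chi2(model, data, sigma):
    return np.sum(((data - model) / sigma) ** 2)

rng = np.random.default_rng(1)
z = np.linspace(0.01, 1.0, 50)
true = 5 * np.log10(z) + 43                 # stand-in distance-modulus curve
data = true + rng.normal(scale=0.2, size=z.size)

offsets = np.linspace(42, 44, 201)          # crude grid over one free parameter
chis = [chi2(5 * np.log10(z) + m, data, 0.2) for m in offsets]
best = offsets[int(np.argmin(chis))]
dof = z.size - 1                            # 50 points minus 1 fitted parameter
print(best, min(chis) / dof)
```

A reduced χ² near 1, as in the abstract, indicates that the residuals are consistent with the assumed measurement errors.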

  5. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp…
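A minimal example of the kind of MCMC computation involved, under a drastically simplified model (a single variance component and no pedigree structure) rather than the paper's pedigree-based mixed model:

```python
# Random-walk Metropolis for a variance parameter: toy model y_i ~ N(0, v)
# with a flat prior on log(v). Purely illustrative of an MCMC strategy.
import math, random

random.seed(0)
y = [random.gauss(0.0, 1.0) for _ in range(100)]   # observed trait values

def log_post(v):
    # log posterior of v under N(0, v) likelihood and flat prior on log(v)
    n = len(y)
    return -0.5 * n * math.log(v) - sum(t * t for t in y) / (2 * v)

v, samples = 1.0, []
for _ in range(5000):
    prop = v * math.exp(random.gauss(0.0, 0.3))    # random walk on log scale
    # acceptance ratio includes the Jacobian term log(prop) - log(v)
    if math.log(random.random()) < (log_post(prop) - log_post(v)
                                    + math.log(prop) - math.log(v)):
        v = prop
    samples.append(v)

post_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
print(round(post_mean, 3))
```

In the genuine quantitative-genetics setting, the same accept/reject logic operates on a much higher-dimensional posterior whose structure is dictated by the pedigree.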

  6. Geometry of the Universe

    International Nuclear Information System (INIS)

    Gurevich, L.Eh.; Gliner, Eh.B.

    1978-01-01

    Problems of investigating the space-time geometry of the Universe are described at a popular level. The space-time geometries corresponding to three cosmological models are considered: the space-time geometry of the closed model is spherical Riemann geometry; that of the open model is Lobachevsky geometry; and that of the flat model is Euclidean geometry. The real geometry of the Universe in the contemporary epoch is inferred from data testifying that the Universe is infinitely expanding.

  7. Exploring the relationship between university internationalization and university autonomy

    DEFF Research Database (Denmark)

    Turcan, Romeo V.; Gullieva, Valeria

    This paper explores a research gap at the intersection of university internationalization and university autonomy. A process model of university internationalization is put forward whereby the process of university internationalization is mediated by university internationalization capacity… and moderated by target country institutional autonomy and globalization; entry modes, timing and pace, as well as the product mix of internationalization, define a university’s internationalization pattern. A systematic review is conducted to identify empirical studies at this intersection. One of the questions…

  8. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    International Nuclear Information System (INIS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-01-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines used as input variables of the regression model are determined adaptively from the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experiment results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural networks and standard support vector machines. - Highlights: • Both training and testing samples are considered for analytical line selection. • The analytical lines are auto-selected based on the built-in characteristics of spectral lines. • The new method can achieve better prediction accuracy and modeling robustness. • Model predictions are given with a confidence interval of a probabilistic distribution
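The RVM itself is not reproduced here; the stand-in below uses ordinary Bayesian linear regression, which likewise returns a predictive mean with an interval, on synthetic line intensities and concentrations (all numbers and hyperparameters are assumptions, not the authors' settings).

```python
# Sketch: Bayesian linear calibration from spectral line intensities to an
# elemental concentration, with a predictive interval (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(30, 4))               # intensities of 4 analytical lines
w_true = np.array([2.0, 0.5, 1.0, -0.3])
y = X @ w_true + rng.normal(scale=0.05, size=30)  # certified concentrations

alpha, beta = 1e-3, 1.0 / 0.05 ** 2               # prior precision, noise precision
A = alpha * np.eye(4) + beta * X.T @ X            # posterior precision of weights
mean_w = beta * np.linalg.solve(A, X.T @ y)       # posterior mean of weights

x_new = np.array([0.4, 0.2, 0.7, 0.1])            # new measured line intensities
mu = x_new @ mean_w
var = 1.0 / beta + x_new @ np.linalg.solve(A, x_new)  # predictive variance
print(f"predicted concentration {mu:.3f} +/- {1.96 * np.sqrt(var):.3f}")
```

The interval reported alongside the point prediction plays the same role as the probabilistic confidence interval the RVM provides in the method above.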

  9. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  10. The Target Model of Strategic Interaction of Kazan Federal University and the Region in the Field of Education

    Science.gov (United States)

    Gabdulchakov, Valerian F.

    2016-01-01

    The subject of the study in the article is conceptual basis of construction of the target model of interaction between University and region. Hence the topic of the article "the Target model of strategic interaction between the University and the region in the field of education." The objective was to design a target model of this…

  11. Cultural Capital, Family Background and Education: Choosing University Subjects in China

    Science.gov (United States)

    Sheng, Xiaoming

    2017-01-01

    This article employs Bourdieu's conceptual tools to unpack family influences on students' subject and university choices in China. This empirical study employed mixed research approaches, using both quantitative and qualitative methods, to examine students' choices of subjects and universities in a sample of secondary school students from the age…

  12. Understanding University Students' Thoughts and Practices about Digital Citizenship: A Mixed Methods Study

    Science.gov (United States)

    Kara, Nuri

    2018-01-01

    The purpose of this study was to investigate university students' thoughts and practices concerning digital citizenship. An explanatory mixed methods design was used, and it involved collecting qualitative data after a quantitative phase in order to follow up on the quantitative data in more depth. In the first quantitative phase of the study, a…

  13. A Structural Equation Model of Knowledge Management Based On Organizational Climate in Universities

    OpenAIRE

    F. Nazem; M. Mozaiini; A. Seifi

    2014-01-01

    The purpose of the present study was to provide a structural model of knowledge management in universities based on organizational climate. The population of the research included all employees of Islamic Azad University (IAU). The sample consisted of 1590 employees selected using stratified and cluster random sampling method. The research instruments were two questionnaires which were administered in 78 IAU branches and education centers: Sallis and Jones’s (2002) Knowledge Management Questi...

  14. STRUCTURAL AND FUNCTIONAL MODEL OF FORMING INFORMATIONAL COMPETENCE OF TECHNICAL UNIVERSITY STUDENTS

    Directory of Open Access Journals (Sweden)

    Taras Ostapchuk

    2016-11-01

    Full Text Available The article elaborates and analyses a structural and functional model of forming the information competence of technical university students. The system and the mutual relationships between its elements are revealed. The target, process, and result-evaluative blocks of the proposed model ensure its functioning and make it possible to optimize the learning process of technical students' information training. The formation of technical university students' information competence rests on motivational-value, operational-activity, cognitive, and reflexive components. The corresponding criteria (motivational, operational-activity, cognitive, reflexive), indexes, and levels (reproductive, technologized, constructive) of forming technical university students' information competence are disclosed. The expediency of a complex of organizational and pedagogical conditions at the stages of forming information competence is justified. This complex of organizational and pedagogical conditions includes: orienting the organization and implementation of class work toward technical university students' positive value attitudes; addressing the issue of forming professionalism; informatization of the educational and socio-cultural environment of higher technical educational institutions; orienting technical university students' training to the demands of European and international standards on information competence as a factor of competitiveness in the labor market; and introducing a special course into the curriculum that ensures competence formation through the use of information technology in professional activities. Forms (lecture visualization, problem lecture, combined lecture, scientific online conference, recitals, excursions, etc., tools (computer lab, multimedia projector, interactive whiteboard, multimedia technology (audio, video, Internet technologies; social networks, etc

  15. Advanced quantitative measurement methodology in physics education research

    Science.gov (United States)

    Wang, Jing

    parts. The first part involves a comparison between item response theory (IRT) and classical test theory (CTT). Both theories provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and the possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory, and that IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity, yet the currently popular measures of association fail under some extremely unbalanced conditions, and the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored.
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The
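The contrast between CTT and IRT item statistics can be made concrete: CTT summarizes an item by a single proportion correct, while the two-parameter logistic (2PL) IRT model gives a response probability as a function of ability θ. A small sketch with invented item parameters and responses:

```python
# CTT difficulty (a proportion) versus a 2PL IRT item characteristic curve.
# The discrimination a and difficulty b below are invented for illustration.
import math

def irt_2pl(theta, a, b):
    """Probability of a correct response under the two-parameter logistic model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

responses = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]        # one item, ten students
ctt_difficulty = sum(responses) / len(responses)  # CTT: a single number

# IRT: difficulty lives on the same scale as student ability theta
for theta in (-1.0, 0.0, 1.0):
    print(theta, round(irt_2pl(theta, a=1.2, b=-0.5), 3))
print(ctt_difficulty)
```

Because the 2PL curve depends on θ, the same item can look easy or hard for different ability groups, which is one way IRT is more sensitive to item features than the single CTT proportion.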

  16. Universality of correlation functions in random matrix models of QCD

    International Nuclear Information System (INIS)

    Jackson, A.D.; Sener, M.K.; Verbaarschot, J.J.M.

    1997-01-01

    We demonstrate the universality of the spectral correlation functions of a QCD inspired random matrix model that consists of a random part having the chiral structure of the QCD Dirac operator and a deterministic part which describes a schematic temperature dependence. We calculate the correlation functions analytically using the technique of Itzykson-Zuber integrals for arbitrary complex supermatrices. An alternative exact calculation for arbitrary matrix size is given for the special case of zero temperature, and we reproduce the well-known Laguerre kernel. At finite temperature, the microscopic limit of the correlation functions are calculated in the saddle-point approximation. The main result of this paper is that the microscopic universality of correlation functions is maintained even though unitary invariance is broken by the addition of a deterministic matrix to the ensemble. (orig.)

  17. Measuring our Universe from Galaxy Redshift Surveys.

    Science.gov (United States)

    Lahav, Ofer; Suto, Yasushi

    2004-01-01

    Galaxy redshift surveys have achieved significant progress over the last couple of decades. Those surveys tell us in the most straightforward way what our local Universe looks like. While the galaxy distribution traces the bright side of the Universe, detailed quantitative analyses of the data have even revealed the dark side of the Universe dominated by non-baryonic dark matter as well as more mysterious dark energy (or Einstein's cosmological constant). We describe several methodologies of using galaxy redshift surveys as cosmological probes, and then summarize the recent results from the existing surveys. Finally we present our views on the future of redshift surveys in the era of precision cosmology.

  18. Need for collection of quantitative distribution data for dosimetry and metabolic modeling

    International Nuclear Information System (INIS)

    Lathrop, K.A.

    1976-01-01

    Problems in radiation dose distribution studies in humans are discussed. Data show that the effective half-times of ⁷Be and ⁷⁵Se in the mouse, rat, monkey, dog, and human exhibit no correlation with weight, body surface, or any other readily apparent factor that could be used to equate nonhuman and human data. Another problem sometimes encountered in attempting to extrapolate animal data to humans involves equivalent doses of the radiopharmaceutical. A usual human dose of a radiopharmaceutical is 1 ml, or 0.017 ml/kg. The same solution injected into a mouse in a convenient volume of 0.1 ml results in a dose of 4 ml/kg, or 240 times that received by the human. The effect on whole-body retention produced by a dose difference of similar magnitude for selenium in the rat shows that retention is at least twice as great with the smaller amount. With the development of methods for the collection of data throughout the body representing the fractional distribution of radioactivity versus time, not only can more realistic dose estimates be made, but the tools will also be provided for the study of physiological and biochemical interrelationships in the intact subject, from which compartmental models of diagnostic significance may be made. The unique requirement for quantitative biologic data needed for the calculation of radiation absorbed doses is the same as the unique scientific contribution that nuclear medicine can make: the quantitative in vivo study of physiologic and biochemical processes. The technique involved is not the same as quantitation of a radionuclide image, but is a step beyond.
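The per-kilogram dose arithmetic above is worth checking explicitly; the 25 g mouse and 60 kg human body masses below are assumptions consistent with the quoted figures.

```python
# Quick check of the dose-scaling arithmetic: the same injected solution gives
# a far larger per-kilogram dose in a mouse than in a human.
human_dose_ml, human_mass_kg = 1.0, 60.0
mouse_dose_ml, mouse_mass_kg = 0.1, 0.025

human_per_kg = human_dose_ml / human_mass_kg    # ~0.017 ml/kg
mouse_per_kg = mouse_dose_ml / mouse_mass_kg    # 4 ml/kg
print(round(mouse_per_kg / human_per_kg))       # prints 240
```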

  19. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  20. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  1. University Community Engagement and the Strategic Planning Process

    Directory of Open Access Journals (Sweden)

    Laura Newton Miller

    2018-03-01

    Full Text Available Abstract Objectives – To understand how university libraries are engaging with the university community (students, faculty, campus partners, and administration) when working through the strategic planning process. Methods – Literature review and an exploratory open-ended survey of members of CAUL (Council of Australian University Librarians), CARL (Canadian Association of Research Libraries), CONZUL (Council of New Zealand University Librarians), and RLUK (Research Libraries UK) who are most directly involved in the strategic planning process at their library. Results – Out of a potential 113 participants from 4 countries, 31 people (27%) replied to the survey. Libraries most often mentioned the use of regularly scheduled surveys to inform their strategic planning, which helps to truncate the process for some respondents, as opposed to gathering user feedback specifically for the strategic planning process. Other quantitative methods include customer intelligence and library-produced data. Qualitative methods include the use of focus groups, interviews, and user experience/design techniques to help inform the strategic plan. Questions to users tended to fall into four areas: user-focused (with or without a library lens), library-focused, trends and vision, and feedback on the plan. Conclusions – Combining both quantitative and qualitative methods can help give a fuller picture for librarians working on a strategic plan. Having the university community join the conversation on how the library moves forward is an important but difficult endeavour. Regardless, the university library needs to be adaptive to the rapidly changing environment around it. Having a sense of how other libraries engage with the university community benefits others who are tasked with strategic planning.

  2. Quantitative Psychology : the 82nd Annual Meeting of the Psychometric Society

    CERN Document Server

    Culpepper, Steven; Janssen, Rianne; González, Jorge; Molenaar, Dylan

    2018-01-01

    This proceedings book highlights the latest research and developments in psychometrics and statistics. Featuring contributions presented at the 82nd Annual Meeting of the Psychometric Society (IMPS), organized by the University of Zurich and held in Zurich, Switzerland from July 17 to 21, 2017, its 34 chapters address a diverse range of psychometric topics including item response theory, factor analysis, causal inference, Bayesian statistics, test equating, cognitive diagnostic models and multistage adaptive testing. The IMPS is one of the largest international meetings on quantitative measurement in psychology, education and the social sciences, attracting over 500 participants and 250 paper presentations from around the world every year. This book gathers the contributions of selected presenters, which were subsequently expanded and peer-reviewed.

  3. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
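
A toy illustration of the calibration idea described above: semi-quantitative index scores are scaled so that their sum reproduces a peer-system failure rate, and point-value consequences then give a deterministic risk figure. All segment names and numbers below are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch: calibrate semi-quantitative index scores to failure
# frequencies, then compute deterministic risk = frequency x consequence.

def calibrate(scores, system_failure_rate):
    """Distribute a peer-system failure rate (failures per km-yr) across
    segments in proportion to their relative semi-quantitative scores."""
    total = sum(scores.values())
    return {seg: system_failure_rate * s / total for seg, s in scores.items()}

# Relative likelihood scores from a semi-quantitative index model (invented)
scores = {"seg_A": 12.0, "seg_B": 3.0, "seg_C": 5.0}
freqs = calibrate(scores, system_failure_rate=0.02)  # failures per km-yr

# Point-value consequences per failure (invented), giving deterministic QRA
consequences = {"seg_A": 1.5e6, "seg_B": 4.0e6, "seg_C": 0.8e6}
risk = {seg: freqs[seg] * consequences[seg] for seg in freqs}
```

The same structure accommodates the mixed QRA/semi-QRA option the abstract mentions: consequences can stay as qualitative category ranges while only the likelihood side is calibrated.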

  4. The Charlotte Action Research Project: A Model for Direct and Mutually Beneficial Community-University Engagement

    Science.gov (United States)

    Morrell, Elizabeth; Sorensen, Janni; Howarth, Joe

    2015-01-01

    This article describes the evolution of the Charlotte Action Research Project (CHARP), a community-university partnership founded in 2008 at the University of North Carolina at Charlotte, and focuses particularly on the program's unique organizational structure. Research findings of a project evaluation suggest that the CHARP model's unique…

  5. Universal model for water costs of gas exchange by animals and plants

    OpenAIRE

    Woods, H. Arthur; Smith, Jennifer N.

    2010-01-01

    For terrestrial animals and plants, a fundamental cost of living is water vapor lost to the atmosphere during exchange of metabolic gases. Here, by bringing together previously developed models for specific taxa, we integrate properties common to all terrestrial gas exchangers into a universal model of water loss. The model predicts that water loss scales to gas exchange with an exponent of 1 and that the amount of water lost per unit of gas exchanged depends on several factors: the surface t...

  6. [Modern model of organization of pedagogical process in physical education of students in universities]

    OpenAIRE

    Bashavets, N.A.

    2016-01-01

    Current studies are characterized by the active development of models of physical education for students (sectional, professionally oriented, individual, improving, traditional, etc.). The author, based on an analysis of international experience, tried to determine the most appropriate model of physical education for Ukrainian universities.

  7. Late time acceleration of the universe in f(R) gravity model

    International Nuclear Information System (INIS)

    Mukherjee, Ankan

    2014-01-01

    In this work, a new way to look at the nature of the late-time dynamics of the universe in f(R) gravity models, using the contracted Bianchi identity, has been proposed. As the Einstein field equations contain derivatives of the curvature scalar R, the contracted Bianchi identity yields a second-order nonlinear differential equation in the Hubble parameter H. This equation is studied for two particular forms of f(R), and the late-time behaviour of the model is discussed. (author)

  8. Perceptions of Nigerian university students about the influence of ...

    African Journals Online (AJOL)

    Perceptions of Nigerian university students about the influence of cigarette advertisement on smoking habit: a quantitative analysis.

  9. Nonfixed Retirement Age for University Professors: Modeling Its Effects on New Faculty Hires.

    Science.gov (United States)

    Larson, Richard C; Diaz, Mauricio Gomez

    2012-03-01

    We model the set of tenure-track faculty members at a university as a queue, where "customers" in queue are faculty members in active careers. Arrivals to the queue are usually young, untenured assistant professors, and departures from the queue are primarily those who do not pass a promotion or tenure hurdle and those who retire. There are other less-often-used ways to enter and leave the queue. Our focus is on system effects of the elimination of mandatory retirement age. In particular, we are concerned with estimating the number of assistant professor slots that annually are no longer available because of the elimination of mandatory retirement. We start with steady-state assumptions that require use of Little's Law of Queueing, and we progress to a transient model using system dynamics. We apply these simple models using available data from our home university, the Massachusetts Institute of Technology.
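
The steady-state argument above can be sketched with Little's Law, N = λW: with a fixed number of faculty slots N, a longer mean career W (no mandatory retirement) lowers the sustainable hiring rate λ = N/W. The numbers below are hypothetical, not data from the study.

```python
# Little's Law applied to the faculty queue: N = lambda * W, so the
# steady-state hiring rate is lambda = N / W. All figures are invented.

def annual_hires(total_slots, mean_career_years):
    """Steady-state arrival (hiring) rate for a fixed-size faculty queue."""
    return total_slots / mean_career_years

n = 1000  # fixed number of tenure-track faculty slots (hypothetical)
before = annual_hires(n, mean_career_years=30.0)  # with mandatory retirement
after = annual_hires(n, mean_career_years=35.0)   # retirement age eliminated

# Fewer assistant-professor openings per year once careers lengthen
lost_slots_per_year = before - after
```

This captures only the steady state; the transient effects the authors study with system dynamics (a cohort of professors delaying retirement at once) are not represented here.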

  10. Factors Affecting University Students' Intention to Use Cloud Computing in Jordan

    Science.gov (United States)

    Rababah, Khalid Ali; Khasawneh, Mohammad; Nassar, Bilal

    2017-01-01

    The aim of this study is to examine the factors affecting students' intention to use cloud computing in the Jordanian universities. To achieve this purpose, a quantitative research approach which is a survey-based was deployed. Around 400 questionnaires were distributed randomly to Information Technology (IT) students at four universities in…

  11. Modeling Cancer Metastasis using Global, Quantitative and Integrative Network Biology

    DEFF Research Database (Denmark)

    Schoof, Erwin; Erler, Janine

    … understanding of molecular processes which are fundamental to tumorigenesis. In Article 1, we propose a novel framework for how cancer mutations can be studied by taking into account their effect at the protein network level. In Article 2, we demonstrate how global, quantitative data on phosphorylation dynamics can be generated using MS, and how this can be modeled using a computational framework for deciphering kinase-substrate dynamics. This framework is described in depth in Article 3, and covers the design of KinomeXplorer, which allows the prediction of kinases responsible for modulating observed phosphorylation dynamics in a given biological sample. In Chapter III, we move into Integrative Network Biology, where, by combining two fundamental technologies (MS & NGS), we can obtain more in-depth insights into the links between cellular phenotype and genotype. Article 4 describes the proof…

  12. An Evaluation Model of Quantitative and Qualitative Fuzzy Multi-Criteria Decision-Making Approach for Location Selection of Transshipment Ports

    Directory of Open Access Journals (Sweden)

    Ji-Feng Ding

    2013-01-01

    Full Text Available The role of the container logistics centre as a home base for merchandise transportation has become increasingly important. Container carriers need to select a suitable transshipment port location to meet the requirements of container shipping logistics. In the light of this, the main purpose of this paper is to develop a fuzzy multi-criteria decision-making (MCDM) model to evaluate the best selection of transshipment ports for container carriers. First, some concepts and methods used to develop the proposed model are briefly introduced. The performance values of quantitative and qualitative subcriteria are discussed to evaluate the fuzzy ratings. Then, the ideal and anti-ideal concepts and the modified distance measure method are used in the proposed model. Finally, a step-by-step example is illustrated to study the computational process of the quantitative and qualitative fuzzy MCDM model. The proposed approach has successfully accomplished our goal. In addition, the proposed fuzzy MCDM model can be empirically employed to select the best transshipment port location for container carriers in future studies.
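
The ideal/anti-ideal distance idea can be sketched in TOPSIS style. This is a generic crisp-score version, not the paper's fuzzy formulation or its modified distance measure, and the ports, criteria, weights, and scores are invented for illustration.

```python
# TOPSIS-style sketch: rank alternatives by relative closeness to the ideal
# solution, using weighted Euclidean distances to the ideal and anti-ideal.
import math

def closeness(scores, weights):
    """Return the closeness coefficient (0..1) for each alternative."""
    n_crit = len(weights)
    ideal = [max(s[j] for s in scores.values()) for j in range(n_crit)]
    anti = [min(s[j] for s in scores.values()) for j in range(n_crit)]
    result = {}
    for port, s in scores.items():
        d_pos = math.sqrt(sum(weights[j] * (s[j] - ideal[j]) ** 2 for j in range(n_crit)))
        d_neg = math.sqrt(sum(weights[j] * (s[j] - anti[j]) ** 2 for j in range(n_crit)))
        result[port] = d_neg / (d_pos + d_neg)
    return result

# Hypothetical normalized scores on three criteria (e.g. cost, location, capacity)
scores = {"Port_A": [0.8, 0.6, 0.9], "Port_B": [0.5, 0.9, 0.7], "Port_C": [0.6, 0.5, 0.6]}
cc = closeness(scores, weights=[0.5, 0.3, 0.2])
best = max(cc, key=cc.get)  # highest closeness coefficient wins
```

In the paper's fuzzy setting the crisp scores would be replaced by fuzzy numbers aggregated from decision-makers' linguistic ratings before the distance step.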

  13. Knowledge portal: a tool to capture university requirements

    Science.gov (United States)

    Mansourvar, Marjan; Binti Mohd Yasin, Norizan

    2011-10-01

    New technologies, especially the Internet, have made a huge impact on knowledge management and information dissemination in education. The web portal as a knowledge management system is a very popular topic in many organizations, including universities. Generally, a web portal is defined as a gateway to online network-accessible resources through an intranet, extranet, or the Internet. This study develops a knowledge portal for the students in the Faculty of Computer Science and Information Technology (FCSIT), University of Malaya (UM). The goals of this portal are to provide information that helps students choose the right courses and major for their intended future jobs or careers in IT. A quantitative approach was used as the method for this research, as it provides an easy and useful way to collect data from a large sample population.

  14. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Science.gov (United States)

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software, "Kongoh", for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information from peak heights in the DNA profile and considers the effects of artifacts and allelic drop-out. Using this software, the likelihoods of 1-4 persons' contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts of or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than other software based on the quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts of or degraded DNA samples.
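
The likelihood-ratio framework that Kongoh implements can be illustrated in miniature: the LR compares the probability of the observed evidence under the hypothesis that the POI contributed against the alternative hypothesis, and per-locus LRs multiply across independent STR loci. The probabilities below are placeholders, not output of the actual software.

```python
# Minimal sketch of the likelihood-ratio (LR) framework for DNA evidence.
# All probabilities are illustrative placeholders.

def likelihood_ratio(p_evidence_given_hp, p_evidence_given_hd):
    """LR > 1 supports the hypothesis that the POI contributed (Hp);
    LR < 1 supports the hypothesis that the POI did not contribute (Hd)."""
    return p_evidence_given_hp / p_evidence_given_hd

# Per-locus LRs multiply across independent short-tandem-repeat loci
locus_lrs = [
    likelihood_ratio(0.9, 0.05),
    likelihood_ratio(0.8, 0.1),
    likelihood_ratio(0.95, 0.2),
]
combined = 1.0
for lr in locus_lrs:
    combined *= lr
# A combined LR well above 1 gives strong support for the POI's contribution
```

A continuous model like Kongoh's computes the per-locus probabilities from peak heights, drop-out, and artifact models rather than taking them as given, but the combination step is the same.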

  15. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Directory of Open Access Journals (Sweden)

    Sho Manabe

    Full Text Available In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software, "Kongoh", for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information from peak heights in the DNA profile and considers the effects of artifacts and allelic drop-out. Using this software, the likelihoods of 1-4 persons' contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts of or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than other software based on the quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts of or degraded DNA samples.

  16. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk posed by landslide hazards changes continuously in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional-scale model. To determine debris flow intensities, we used a linear relationship found between back-calibrated, physically based Flo-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and number of floors
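
The building-value and expected-loss calculation described above can be sketched as follows. The unit value, areas, probability, and vulnerability figure are illustrative only, not the study's data; in the actual analysis the vulnerability would be read from a flow-depth curve per building type.

```python
# Hypothetical sketch of a building-level quantitative risk calculation.
# All numbers are invented for illustration.

def market_value(unit_value_eur_m2, footprint_m2, n_floors):
    """Building value = land-use unit value x floor area x number of floors."""
    return unit_value_eur_m2 * footprint_m2 * n_floors

def expected_loss(p_event, vulnerability, value):
    """Risk for one scenario: probability x degree of loss (0..1) x exposed value."""
    return p_event * vulnerability * value

value = market_value(unit_value_eur_m2=1800.0, footprint_m2=120.0, n_floors=3)
# Vulnerability from a flow-depth curve: deeper flow -> higher degree of loss
loss = expected_loss(p_event=0.01, vulnerability=0.35, value=value)
```

Repeating this per intensity class and summing over scenarios and buildings yields the temporally changing risk figures the study reports.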

  17. Quantitative structure–activity relationship model for amino acids as corrosion inhibitors based on the support vector machine and molecular design

    International Nuclear Information System (INIS)

    Zhao, Hongxia; Zhang, Xiuhui; Ji, Lin; Hu, Haixiang; Li, Qianshu

    2014-01-01

    Highlights: • A nonlinear quantitative structure–activity relationship (QSAR) model was built with the support vector machine. • Descriptors for the QSAR model were selected by principal component analysis. • Binding energy was taken as one of the descriptors for the QSAR model. • The acidic solution and protonation of the inhibitor were considered. - Abstract: The inhibition performance of nineteen amino acids was studied by theoretical methods. The effects of the acidic solution and of protonation of the inhibitor were considered in molecular dynamics simulations, and the results indicated that the protonated amino group is not adsorbed on the Fe (1 1 0) surface. Additionally, a nonlinear quantitative structure–activity relationship (QSAR) model was built with the support vector machine. The correlation coefficient was 0.97, and the root mean square error between predicted and experimental inhibition efficiencies (%) was 1.48. Furthermore, five new amino acids were theoretically designed and their inhibition efficiencies were predicted by the built QSAR model
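
A minimal sketch of the two figures of merit quoted above (correlation coefficient and RMSE between predicted and experimental inhibition efficiencies). The efficiency values here are invented for illustration and are not data from the study.

```python
# Compute RMSE and the Pearson correlation coefficient between predicted
# and experimental inhibition efficiencies. Values are illustrative only.
import math

def rmse(pred, obs):
    """Root mean square error between two equal-length sequences."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def pearson_r(pred, obs):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(obs)
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (sp * so)

experimental = [55.0, 62.0, 70.0, 81.0, 90.0]  # inhibition efficiency (%), invented
predicted = [56.5, 60.0, 72.0, 80.0, 88.5]     # hypothetical QSAR predictions
model_rmse = rmse(predicted, experimental)
model_r = pearson_r(predicted, experimental)
```

In the paper these statistics are reported for the SVM model's predictions; any practical replication would substitute the model's actual outputs for the placeholder lists.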

  18. Dynamic inundation mapping of Hurricane Harvey flooding in the Houston metro area using hyper-resolution modeling and quantitative image reanalysis

    Science.gov (United States)

    Noh, S. J.; Lee, J. H.; Lee, S.; Zhang, Y.; Seo, D. J.

    2017-12-01

    Hurricane Harvey was one of the most extreme weather events in Texas history and left significant damage in Houston and the adjoining coastal areas. To better understand the relative impacts on urban flooding of the extreme amount and spatial extent of rainfall, the unique geography, land use, and storm surge, high-resolution water modeling is necessary so that natural and man-made components are fully resolved. In this presentation, we reconstruct the spatiotemporal evolution of inundation during Hurricane Harvey using hyper-resolution modeling and quantitative image reanalysis. The two-dimensional urban flood model used is based on the dynamic wave approximation and 10 m-resolution terrain data, and is forced by radar-based multisensor quantitative precipitation estimates. The model domain includes Buffalo, Brays, Greens and White Oak Bayous in Houston. The model is simulated using hybrid parallel computing. To evaluate the dynamic inundation mapping, we combine various qualitative crowdsourced images and video footage with LiDAR-based terrain data.

  19. Pro-Social Behavior Amongst Students of Tertiary Institutions: An Explorative and a Quantitative Approach

    Science.gov (United States)

    Quain, Samuel; Yidana, Xiaaba Dantallah; Ambotumah, Bernard Baba; Mensah-Livivnstone, Ike Joe Nii Annang

    2016-01-01

    The purpose of this paper was to explore antecedents of pro-social behavior amongst university students, using a private university as a case study. Following exploratory research, the study was guided by theories relating to the phenomenon, focusing on gender and location factors. A quantitative approach was used in the follow-up to the…

  20. A Qualitative and Quantitative Assay to Study DNA/Drug Interaction ...

    African Journals Online (AJOL)

    Research Article: A Qualitative and Quantitative Assay to Study DNA/Drug Interaction Based on Sequence-Selective Inhibition of Restriction Endonucleases. Syed A Hassan1*, Lata Chauhan2, Ritu Barthwal2 and Aparna Dixit3. 1 Faculty of Computing and Information Technology, King Abdul Aziz University, Rabigh-21911 ...