WorldWideScience

Sample records for rate theory model

  1. Rate Theory Modeling and Simulation of Silicide Fuel at LWR Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Yinbin [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Ye, Bei [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Hofman, Gerard [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Yacout, Abdellatif [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Gamble, Kyle [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation; Mei, Zhi-Gang [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2016-08-29

    As a promising candidate for the accident tolerant fuel (ATF) used in light water reactors (LWRs), the fuel performance of uranium silicide (U3Si2) at LWR conditions needs to be well understood. In this report, a rate theory model was developed based on existing experimental data and density functional theory (DFT) calculations in order to predict the fission gas behavior in U3Si2 at LWR conditions. The fission gas behavior of U3Si2 can be divided into three temperature regimes. During steady-state operation, the majority of the fission gas stays in intragranular bubbles, whereas the dominance of intergranular bubbles and fission gas release only occurs beyond 1000 K. The steady-state rate theory model was also used as a reference to establish a gaseous swelling correlation of U3Si2 for the BISON code. Meanwhile, an overpressurized bubble model was also developed so that the fission gas behavior during a loss-of-coolant accident (LOCA) can be simulated. LOCA simulation showed that intragranular bubbles are still dominant after a 70-second LOCA, resulting in controllable gaseous swelling. The fission gas behavior of U3Si2 at LWR conditions is benign according to the rate theory predictions at both steady-state and LOCA conditions, which provides an important reference for the qualification of U3Si2 as an LWR fuel material with excellent fuel performance and enhanced accident tolerance.
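
    As a rough illustration of what a rate-theory balance of this kind looks like in practice, the sketch below integrates a deliberately simplified two-equation model (gas generation, trapping into intragranular bubbles, re-solution, and loss to grain boundaries). The structure and every parameter value are illustrative assumptions and are not taken from the ANL/BISON model described above.

```python
# Minimal sketch of a fission-gas rate-theory balance (NOT the ANL U3Si2 model).
# Two coupled ODEs: dissolved gas c_g and gas held in intragranular bubbles c_b.
# All parameter values below are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

beta = 1.0e19       # gas atom generation rate, atoms/(m^3 s)      (assumed)
k_trap = 1.0e-3     # trapping rate of dissolved gas into bubbles, 1/s (assumed)
b_resol = 1.0e-4    # irradiation-induced re-solution rate, 1/s    (assumed)
k_release = 1.0e-6  # effective loss of dissolved gas to grain boundaries, 1/s (assumed)

def rhs(t, y):
    c_g, c_b = y
    dc_g = beta - k_trap * c_g + b_resol * c_b - k_release * c_g
    dc_b = k_trap * c_g - b_resol * c_b
    return [dc_g, dc_b]

sol = solve_ivp(rhs, (0.0, 3.15e7), [0.0, 0.0], method="LSODA")  # ~1 year of operation
c_g, c_b = sol.y[:, -1]
print(f"dissolved: {c_g:.3e}  in bubbles: {c_b:.3e}  bubble fraction: {c_b/(c_g+c_b):.2f}")
```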

  2. Rate-distortion theory and human perception.

    Science.gov (United States)

    Sims, Chris R

    2016-07-01

    The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.
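
    The abstract mentions an R package for building rate-distortion models; as a language-neutral illustration of the underlying computation, here is a minimal Python sketch of the Blahut-Arimoto algorithm for a discrete source with squared-error distortion. The source distribution, distortion matrix, and trade-off parameter beta are all assumed for the example and are unrelated to the published package.

```python
# Minimal Blahut-Arimoto sketch for a discrete rate-distortion problem
# (illustrative only; not the R package described in the abstract).
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=200):
    """p_x: source distribution; d[i, j]: distortion of reproducing x_i as y_j;
    beta: Lagrange multiplier trading rate against distortion."""
    n, m = d.shape
    q_y = np.full(m, 1.0 / m)                 # output marginal, initialized uniform
    for _ in range(n_iter):
        # conditional q(y|x) minimizing the Lagrangian for the current marginal
        w = q_y[None, :] * np.exp(-beta * d)
        q_y_given_x = w / w.sum(axis=1, keepdims=True)
        q_y = p_x @ q_y_given_x               # update the output marginal
    distortion = np.sum(p_x[:, None] * q_y_given_x * d)
    rate = np.sum(p_x[:, None] * q_y_given_x *
                  np.log2(q_y_given_x / q_y[None, :] + 1e-300))
    return rate, distortion

p_x = np.array([0.25, 0.25, 0.25, 0.25])      # uniform 4-letter source (assumed)
x = np.arange(4.0); y = np.arange(4.0)
d = (x[:, None] - y[None, :]) ** 2            # squared-error distortion
print(blahut_arimoto(p_x, d, beta=1.0))       # one (rate, distortion) point
```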

  3. Situated learning theory: adding rate and complexity effects via Kauffman's NK model.

    Science.gov (United States)

    Yuan, Yu; McKelvey, Bill

    2004-01-01

    For many firms, producing information, knowledge, and enhancing learning capability have become the primary basis of competitive advantage. A review of organizational learning theory identifies two approaches: (1) those that treat symbolic information processing as fundamental to learning, and (2) those that view the situated nature of cognition as fundamental. After noting that the former is inadequate because it focuses primarily on behavioral and cognitive aspects of individual learning, this paper argues the importance of studying learning as interactions among people in the context of their environment. It contributes to organizational learning in three ways. First, it argues that situated learning theory is to be preferred over traditional behavioral and cognitive learning theories, because it treats organizations as complex adaptive systems rather than mere information processors. Second, it adds rate and nonlinear learning effects. Third, following model-centered epistemology, it uses an agent-based computational model, in particular a "humanized" version of Kauffman's NK model, to study the situated nature of learning. Using simulation results, we test eight hypotheses extending situated learning theory in new directions. The paper ends with a discussion of possible extensions of the current study to better address key issues in situated learning.
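
    For readers unfamiliar with the NK model, the following sketch builds a generic Kauffman NK fitness landscape and runs a simple one-bit hill climber over it. It is a textbook-style illustration, not the "humanized", agent-based variant used in the paper; N, K, and the search rule are arbitrary choices.

```python
# Generic Kauffman NK landscape sketch (not the authors' "humanized" variant).
# N binary traits; each trait's fitness contribution depends on K other traits.
import itertools, random

def make_nk_landscape(N, K, seed=0):
    rng = random.Random(seed)
    # each locus i depends on itself plus K randomly chosen other loci
    deps = [tuple([i] + rng.sample([j for j in range(N) if j != i], K)) for i in range(N)]
    tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]
    def fitness(genome):
        return sum(tables[i][tuple(genome[j] for j in deps[i])] for i in range(N)) / N
    return fitness

def hill_climb(fitness, N, steps=1000, seed=1):
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(N)]
    for _ in range(steps):                    # one-bit "learning" moves
        i = rng.randrange(N)
        trial = genome[:]; trial[i] ^= 1
        if fitness(trial) >= fitness(genome):
            genome = trial
    return genome, fitness(genome)

f = make_nk_landscape(N=10, K=3)
print(hill_climb(f, N=10))                    # a local optimum and its fitness
```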

  4. Rate Theory Modeling and Simulations of Silicide Fuel at LWR Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Yinbin [Argonne National Lab. (ANL), Argonne, IL (United States); Ye, Bei [Argonne National Lab. (ANL), Argonne, IL (United States); Mei, Zhigang [Argonne National Lab. (ANL), Argonne, IL (United States); Hofman, Gerard [Argonne National Lab. (ANL), Argonne, IL (United States); Yacout, Abdellatif [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-12-10

    Uranium silicide (U3Si2) fuel has higher thermal conductivity and higher uranium density, making it a promising candidate for the accident-tolerant fuel (ATF) used in light water reactors (LWRs). However, previous studies on the fuel performance of U3Si2, both experimental and computational, have focused on the irradiation conditions in research reactors, which usually involve low operating temperatures and high fuel burnups. Thus, it is important to examine the fuel performance of U3Si2 at typical LWR conditions so as to evaluate the feasibility of replacing conventional uranium dioxide fuel with this silicide fuel material. As in-reactor irradiation experiments involve significant time and financial cost, it is appropriate at this early development stage to use modeling tools to estimate the behavior of U3Si2 in LWRs based on the available research reactor experimental references and state-of-the-art density functional theory (DFT) calculation capabilities. Hence, in this report, a comprehensive investigation of the fission gas swelling behavior of U3Si2 at LWR conditions is introduced. The modeling efforts described in this report were based on the rate theory (RT) model of fission gas bubble evolution that has been successfully applied to a variety of fuel materials at various reactor conditions. Both existing experimental data and DFT-calculated results were used to optimize the parameters adopted by the RT model. Meanwhile, the fuel-cladding interaction was captured by coupling the RT model with simplified mechanical correlations. Therefore, the swelling behavior of U3Si2 fuel and its consequent interaction with cladding in LWRs was predicted by the rate theory modeling, providing valuable information for the development of U3Si2 as an accident tolerant fuel.

  5. Comparison of rate theory based modeling calculations with the surveillance test results of Korean light water reactors

    International Nuclear Information System (INIS)

    Lee, Gyeong Geun; Lee, Yong Bok; Kim, Min Chul; Kwon, Junh Yun

    2012-01-01

    Neutron irradiation of reactor pressure vessel (RPV) steels causes a decrease in fracture toughness and an increase in yield strength while in service. It is generally accepted that the growth of point defect clusters (PDCs) and copper-rich precipitates (CRPs) drives the radiation hardening of RPV steels. A number of models have been proposed to account for the embrittlement of RPV steels. Rate theory based modeling mathematically describes the evolution of the radiation-induced microstructure of ferritic steels under neutron irradiation. In this work, we compared rate theory based modeling calculations with the surveillance test results of Korean Light Water Reactors (LWRs).

  6. Rate theory

    International Nuclear Information System (INIS)

    Maillard, S.; Skorek, R.; Maugis, P.; Dumont, M.

    2015-01-01

    This chapter presents the basic principles of cluster dynamics as a particular case of the mesoscopic rate theory models developed to investigate fuel behaviour under irradiation, such as in UO2. Because this method simulates the evolution of the concentration of every type of point or aggregated defect in a grain of material, it produces rich information that sheds light on the mechanisms involved in microstructure evolution and gas behaviour that are not accessible through conventional models, and can in turn provide improvements to those models. The cluster dynamics parameters are mainly the energetic values governing the basic evolution mechanisms of the material (diffusion, trapping and thermal re-solution). In this sense, the model has general applicability to very different operational situations (irradiation, ion-beam implantation, annealing) provided that they rely on the same basic mechanisms, without requiring additional data fitting, as is required for more empirical conventional models. This technique, when applied to krypton-implanted and annealed samples, yields a precise interpretation of the release curves and helps assess migration mechanisms and the krypton diffusion coefficient, for which data are very difficult to obtain due to the low solubility of the gas. (authors)

  7. A numerical basis for strain-gradient plasticity theory: Rate-independent and rate-dependent formulations

    DEFF Research Database (Denmark)

    Nielsen, Kim Lau; Niordson, Christian Frithiof

    2014-01-01

    A numerical model formulation of the higher order flow theory (rate-independent) by Fleck and Willis [2009. A mathematical basis for strain-gradient plasticity theory – part II: tensorial plastic multiplier. Journal of the Mechanics and Physics of Solids 57, 1045-1057], that allows for elastic–plastic loading/unloading and the interaction of multiple plastic zones, is proposed. The predicted model response is compared to the corresponding rate-dependent version of visco-plastic origin, and coinciding results are obtained in the limit of small strain-rate sensitivity. First, (i) the evolution of a single plastic zone is analyzed to illustrate the agreement with earlier published results, whereafter examples of (ii) multiple plastic zone interaction and (iii) elastic–plastic loading/unloading are presented. Here, the simple shear problem of an infinite slab constrained between rigid plates…

  8. Theory of nanolaser devices: Rate equation analysis versus microscopic theory

    DEFF Research Database (Denmark)

    Lorke, Michael; Skovgård, Troels Suhr; Gregersen, Niels

    2013-01-01

    A rate equation theory for quantum-dot-based nanolaser devices is developed. We show that these rate equations are capable of reproducing results of a microscopic semiconductor theory, making them an appropriate starting point for complex device simulations of nanolasers. The input...
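
    As a point of comparison for the "rate equation" level of description discussed above, the sketch below integrates a textbook two-variable laser rate-equation model (carrier number N and photon number S) with a spontaneous-emission coupling factor. The functional form and all parameter values are generic assumptions, not the quantum-dot model of the paper.

```python
# Textbook laser rate equations (carrier number N, photon number S), offered only
# as an illustration of the rate-equation level of description; the specific form
# and every parameter value are generic assumptions.
import numpy as np
from scipy.integrate import solve_ivp

P = 3.0e15       # pump rate, carriers/s                 (assumed)
tau_sp = 1.0e-9  # spontaneous emission lifetime, s      (assumed)
tau_p = 1.0e-11  # photon lifetime, s                    (assumed)
beta = 1.0e-2    # spontaneous-emission coupling factor  (assumed)
g0 = 1.0e5       # gain coefficient, 1/s per carrier     (assumed)
N_tr = 1.0e5     # transparency carrier number           (assumed)

def rhs(t, y):
    N, S = y
    gain = g0 * (N - N_tr)
    dN = P - N / tau_sp - gain * S
    dS = gain * S - S / tau_p + beta * N / tau_sp
    return [dN, dS]

sol = solve_ivp(rhs, (0.0, 20e-9), [0.0, 0.0], method="LSODA")
print(f"steady-state photon number ~ {sol.y[1, -1]:.3e}")
```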

  9. Combination of poroelasticity theory and constant strain rate test in modelling land subsidence due to groundwater extraction

    Science.gov (United States)

    Pham, Tien Hung; Rühaak, Wolfram; Sass, Ingo

    2017-04-01

    Extensive groundwater extraction leads to a drawdown of the groundwater table. Consequently, the effective stress in the soil increases and can cause land subsidence. Analysis of land subsidence generally requires a numerical model based on poroelasticity theory, which was first proposed by Biot (1941). In their review of regional land subsidence accompanying groundwater extraction, Galloway and Burbey (2011) stated that more research and application are needed in coupling the stress-dependent land subsidence process. In the geotechnical field, constant rate of strain (CRS) tests were first introduced in 1969 (Smith and Wahls 1969) and were standardized in 1982 through designation D4186-82 by the American Society for Testing and Materials. From the readings of CRS tests, the stress-dependent parameters of the poroelasticity model can be calculated. So far, no research has linked poroelasticity theory with CRS tests in modelling land subsidence due to groundwater extraction. One-dimensional CRS tests using a conventional compression cell and three-dimensional CRS tests using a Rowe cell were performed. The tests were also modelled using the finite element method with mixed elements. A back-analysis technique is used to find suitable values of the hydraulic conductivity and bulk modulus that depend on the stress or void ratio. Finally, the obtained results are used in land subsidence models. Biot, M. A. (1941). "General theory of three-dimensional consolidation." Journal of Applied Physics 12(2): 155-164. Galloway, D. L. and T. J. Burbey (2011). "Review: Regional land subsidence accompanying groundwater extraction." Hydrogeology Journal 19(8): 1459-1486. Smith, R. E. and H. E. Wahls (1969). "Consolidation under constant rates of strain." Journal of Soil Mechanics & Foundations Div.
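
    The authors' models are mixed finite-element implementations of Biot's theory; as a much simpler stand-in that shows how pore-pressure dissipation under increased effective stress drives settlement, the sketch below solves one-dimensional consolidation (a limiting case of Biot's theory) with an explicit finite-difference scheme. All material and discretization values are assumptions.

```python
# One-dimensional consolidation (a limiting case of Biot's theory) solved with an
# explicit finite-difference scheme -- a toy stand-in for the mixed finite-element
# models mentioned in the abstract.  All parameter values are assumptions.
import numpy as np

cv = 1.0e-7    # coefficient of consolidation, m^2/s (assumed)
H = 1.0        # drainage path length, m             (assumed)
u0 = 100.0     # initial excess pore pressure, kPa   (assumed)
nz, nt = 51, 2000
dz = H / (nz - 1)
dt = 0.4 * dz**2 / cv                       # satisfies the explicit stability limit

u = np.full(nz, u0)
u[0] = 0.0                                  # drained top boundary
for _ in range(nt):
    u[1:-1] += cv * dt / dz**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = 0.0                              # top stays drained
    u[-1] = u[-2]                           # impervious bottom (zero gradient)

degree_of_consolidation = 1.0 - u.mean() / u0
print(f"average degree of consolidation: {degree_of_consolidation:.2f}")
```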

  10. Information theory and rate distortion theory for communications and compression

    CERN Document Server

    Gibson, Jerry

    2013-01-01

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover

  11. A quantitative theory of solid tumor growth, metabolic rate and vascularization.

    Directory of Open Access Journals (Sweden)

    Alexander B Herman

    The relationships between cellular, structural and dynamical properties of tumors have traditionally been studied separately. Here, we construct a quantitative, predictive theory of solid tumor growth, metabolic rate, vascularization and necrosis that integrates the relationships between these properties. To accomplish this, we develop a comprehensive theory that describes the interface and integration of the tumor vascular network and resource supply with the cardiovascular system of the host. Our theory enables a quantitative understanding of how cells, tissues, and vascular networks act together across multiple scales by building on recent theoretical advances in modeling both healthy vasculature and the detailed processes of angiogenesis and tumor growth. The theory explicitly relates tumor vascularization and growth to metabolic rate, and yields extensive predictions for tumor properties, including growth rates, metabolic rates, degree of necrosis, blood flow rates and vessel sizes. Besides these quantitative predictions, we explain how growth rates depend on capillary density and metabolic rate, and why similar tumors grow slower and occur less frequently in larger animals, shedding light on Peto's paradox. Various implications for potential therapeutic strategies and further research are discussed.

  12. Item Response Theory Analyses of the Parent and Teacher Ratings of the DSM-IV ADHD Rating Scale

    Science.gov (United States)

    Gomez, Rapson

    2008-01-01

    The graded response model (GRM), which is based on item response theory (IRT), was used to evaluate the psychometric properties of the inattention and hyperactivity/impulsivity symptoms in an ADHD rating scale. To accomplish this, parents and teachers completed the DSM-IV ADHD Rating Scale (DARS; Gomez et al., "Journal of Child Psychology and…
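
    For reference, the graded response model named in the abstract assigns category probabilities through differences of cumulative logistic curves. The sketch below evaluates those probabilities for a single polytomous item; the discrimination and threshold values are made-up illustrative numbers, not estimates from the DARS data.

```python
# Graded response model (GRM) category probabilities for one polytomous item.
# The discrimination a and thresholds b_k below are made-up illustrative values.
import numpy as np

def grm_category_probs(theta, a, b):
    """theta: latent trait values; a: discrimination; b: ordered thresholds.
    Returns P(X = k | theta) for k = 0..len(b)."""
    theta = np.atleast_1d(theta)
    # cumulative probabilities P(X >= k), with P(X >= 0) = 1 and P(X >= K+1) = 0
    cum = [np.ones_like(theta)]
    cum += [1.0 / (1.0 + np.exp(-a * (theta - bk))) for bk in b]
    cum += [np.zeros_like(theta)]
    cum = np.vstack(cum)
    return cum[:-1] - cum[1:]               # adjacent differences give category probs

probs = grm_category_probs(theta=[-1.0, 0.0, 1.0], a=1.5, b=[-0.5, 0.5, 1.5])
print(np.round(probs, 3))                   # each column sums to 1 across categories
```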

  13. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  14. Basic Exchange Rate Theories

    NARCIS (Netherlands)

    J.G.M. van Marrewijk (Charles)

    2005-01-01

    This four-chapter overview of basic exchange rate theories discusses (i) the elasticity and absorption approach, (ii) the (long-run) implications of the monetary approach, (iii) the short-run effects of monetary and fiscal policy under various economic conditions, and (iv) the transition

  15. A dual theory of price and value in a meso-scale economic model with stochastic profit rate

    Science.gov (United States)

    Greenblatt, R. E.

    2014-12-01

    The problem of commodity price determination in a market-based, capitalist economy has a long and contentious history. Neoclassical microeconomic theories are based typically on marginal utility assumptions, while classical macroeconomic theories tend to be value-based. In the current work, I study a simplified meso-scale model of a commodity capitalist economy. The production/exchange model is represented by a network whose nodes are firms, workers, capitalists, and markets, and whose directed edges represent physical or monetary flows. A pair of multivariate linear equations with stochastic input parameters represent physical (supply/demand) and monetary (income/expense) balance. The input parameters yield a non-degenerate profit rate distribution across firms. Labor time and price are found to be eigenvector solutions to the respective balance equations. A simple relation is derived relating the expected value of commodity price to commodity labor content. Results of Monte Carlo simulations are consistent with the stochastic price/labor content relation.

  16. Locating the rate-limiting step for the interaction of hydrogen with Mg(0001) using density-functional theory calculations and rate theory

    DEFF Research Database (Denmark)

    Vegge, Tejs

    2004-01-01

    The dissociation of molecular hydrogen on a Mg(0001) surface and the subsequent diffusion of atomic hydrogen into the magnesium substrate are investigated using density functional theory (DFT) calculations and rate theory. The minimum energy path and corresponding transition states are located […] to be rate-limiting for the absorption and desorption of hydrogen, respectively. Zero-point energy contributions are found to be substantial for the diffusion of atomic hydrogen, but classical rates are still found to be within an order of magnitude at room temperature.

  17. A kinetic-theory approach for computing chemical-reaction rates in upper-atmosphere hypersonic flows.

    Science.gov (United States)

    Gallis, Michael A; Bond, Ryan B; Torczynski, John R

    2009-09-28

    Recently proposed molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction-rate information) are investigated for chemical reactions occurring in upper-atmosphere hypersonic flows. The new models are in good agreement with the measured Arrhenius rates for near-equilibrium conditions and with both measured rates and other theoretical models for far-from-equilibrium conditions. Additionally, the new models are applied to representative combustion and ionization reactions and are in good agreement with available measurements and theoretical models. Thus, molecular-level chemistry modeling provides an accurate method for predicting equilibrium and nonequilibrium chemical-reaction rates in gases.

  18. A Theory-Based Model for Understanding Faculty Intention to Use Students Ratings to Improve Teaching in a Health Sciences Institution in Puerto Rico

    Science.gov (United States)

    Collazo, Andrés A.

    2018-01-01

    A model derived from the theory of planned behavior was empirically assessed for understanding faculty intention to use student ratings for teaching improvement. A sample of 175 professors participated in the study. The model was statistically significant and had a very large explanatory power. Instrumental attitude, affective attitude, perceived…

  19. Failure and Redemption of Statistical and Nonstatistical Rate Theories in the Hydroboration of Alkenes.

    Science.gov (United States)

    Bailey, Johnathan O; Singleton, Daniel A

    2017-11-08

    Our previous work found that canonical forms of transition state theory incorrectly predict the regioselectivity of the hydroboration of propene with BH3 in solution. In response, it has been suggested that alternative statistical and nonstatistical rate theories can adequately account for the selectivity. This paper uses a combination of experimental and theoretical studies to critically evaluate the ability of these rate theories, as well as dynamic trajectories and newly developed localized statistical models, to predict quantitative selectivities and qualitative trends in hydroborations on a broader scale. The hydroboration of a series of terminally substituted alkenes with BH3 was examined experimentally, and a classically unexpected trend is that the selectivity increases as the alkyl chain is lengthened far from the reactive centers. Conventional and variational transition state theories can predict neither the selectivities nor the trends. The canonical competitive nonstatistical model makes somewhat better predictions for some alkenes but fails to predict trends, and it performs poorly with an alkene chosen to test a specific prediction of the model. Added nonstatistical corrections to this model make the predictions worse. Parametrized Rice-Ramsperger-Kassel-Marcus (RRKM)-master equation calculations correctly predict the direction of the trend in selectivity versus alkene size but overpredict its magnitude, and the selectivity with large alkenes remains unpredictable with any parametrization. Trajectory studies in explicit solvent can predict selectivities without parametrization but are impractical for predicting small changes in selectivity. From a lifetime and energy analysis of the trajectories, "localized RRKM-ME" and "competitive localized noncanonical" rate models are suggested as steps toward a general model. These provide the best predictions of the experimental observations and insight into the selectivities.

  20. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  1. Quantum theory of enhanced unimolecular reaction rates below the ergodicity threshold

    International Nuclear Information System (INIS)

    Leitner, David M.; Wolynes, Peter G.

    2006-01-01

    A variety of unimolecular reactions exhibit measured rates that exceed Rice-Ramsperger-Kassel-Marcus (RRKM) predictions. Using the local random matrix theory (LRMT) of vibrational energy flow, we show how the quantum localization of the vibrational states of a molecule, by violating the ergodicity assumption, can give rise to such an enhancement of the apparent reaction rate. We present an illustrative calculation using LRMT for a model 12-vibrational-mode organic molecule to show that below the ergodicity threshold the reaction rate may exceed the RRKM prediction many times over due to quantum localization of vibrational states

  2. RATING CREATION FOR PROFESSIONAL EDUCATIONAL ORGANIZATIONS BASED ON THE ITEM RESPONSE THEORY

    Directory of Open Access Journals (Sweden)

    N. E. Erganova

    2016-01-01

    The aim of the investigation is to theoretically justify and test a procedure for measuring the level of educational services provided, the quality of education, and the rating of vocational educational organizations. Methods. The methodological basis of the research consists of the provisions of the systems approach, research on the schematization and modelling of pedagogical objects, and the theory of measurement of latent variables. The main research methods applied are analysis, synthesis, comparative analysis, and statistical processing of the research results. Results. The paper gives a short comparative analysis of the potential of qualitative approaches and the strengths of the theory of latent variables in evaluating the quality of education and the rating of the object under study. A technique for measuring the level of educational services provided when constructing a rating of professional educational organizations is described. Scientific novelty. The pedagogical possibilities of the theory of measurement of latent variables are investigated, and the principles for constructing ratings of professional educational organizations are formulated. Practical significance. An operational construct of the latent variable "quality of education" for secondary professional education (SPE), tested in the Perm Territory, is developed; it can serve as a basis for similar constructs for building ratings of professional educational organizations in other regions.

  3. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  4. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…

  5. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  6. Martingale Regressions for a Continuous Time Model of Exchange Rates

    OpenAIRE

    Guo, Zi-Yi

    2017-01-01

    One of the daunting problems in international finance is the weak explanatory power of existing theories of the nominal exchange rates, the so-called “foreign exchange rate determination puzzle”. We propose a continuous-time model to study the impact of order flow on foreign exchange rates. The model is estimated by a newly developed econometric tool based on a time-change sampling from calendar to volatility time. The estimation results indicate that the effect of order flow on exchange rate...

  7. Photoionization cross sections and Auger rates calculated by many-body perturbation theory

    International Nuclear Information System (INIS)

    Kelly, H.P.

    1976-01-01

    Methods for applying many-body perturbation theory to atomic calculations are discussed, with particular emphasis on the calculation of photoionization cross sections and Auger rates. Topics covered include: Rayleigh–Schrödinger theory; many-body perturbation theory; calculations of photoionization cross sections; and Auger rates.

  8. Random walk theory and exchange rate dynamics in transition economies

    Directory of Open Access Journals (Sweden)

    Gradojević Nikola

    2010-01-01

    This paper investigates the validity of the random walk theory in the Euro–Serbian dinar exchange rate market. We apply Andrew Lo and Archie MacKinlay's (1988) conventional variance ratio test and Jonathan Wright's (2000) non-parametric ranks- and signs-based variance ratio tests to daily Euro/Serbian dinar exchange rate returns using data from January 2005 to December 2008. Both types of variance ratio tests overwhelmingly reject the random walk hypothesis over the data span. To assess the robustness of our findings, we examine the forecasting performance of a non-linear, non-parametric model in the spirit of Francis Diebold and James Nason (1990) and find that it is able to significantly improve upon the random walk model, thus confirming the existence of foreign exchange market imperfections in a small transition economy such as Serbia. In the last part of the paper, we conduct a comparative study on how our results relate to those of other transition economies in the region.
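
    As an illustration of the conventional test named in the abstract, the sketch below computes the Lo-MacKinlay variance ratio and its homoskedasticity-based z-statistic on synthetic returns. It does not use the authors' data set and does not implement Wright's rank- and sign-based variants; the horizon q and the simulated series are assumptions.

```python
# Lo-MacKinlay variance-ratio statistic (homoskedastic null) on synthetic returns --
# an illustration of the test family used in the abstract, not the authors' data
# or Wright's rank/sign variants.
import numpy as np

def variance_ratio(returns, q):
    r = np.asarray(returns, dtype=float)
    T = r.size
    mu = r.mean()
    var1 = np.sum((r - mu) ** 2) / T
    rq = np.convolve(r, np.ones(q), mode="valid")      # overlapping q-period sums
    varq = np.sum((rq - q * mu) ** 2) / (T * q)
    vr = varq / var1
    # asymptotic standard error of VR(q) - 1 under the iid (homoskedastic) null
    se = np.sqrt(2.0 * (2 * q - 1) * (q - 1) / (3.0 * q * T))
    return vr, (vr - 1.0) / se

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=1000)             # random-walk-like returns
print(variance_ratio(returns, q=5))                    # VR near 1, |z| small
```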

  9. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

    In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model. The learning-based R-D model is proposed to overcome the legacy "chicken-and-egg" dilemma in video coding. Second, a cooperative bargaining game based on the mixed R-D model is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra-frame QP and the inter-frame adaptive bit ratios are adjusted so that inter frames have more bit resources to maintain smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT based RC method can achieve much better R-D performance, quality smoothness, bit rate accuracy, buffer control results and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limits of the FixedQP method.

  10. Rate dependent inelastic behavior of polycrystalline solids using a dislocation model

    International Nuclear Information System (INIS)

    Werne, R.W.; Kelly, J.M.

    1980-01-01

    A rate dependent theory of polycrystalline plasticity is presented in which the solid is modeled as an isotropic continuum with internal variables. The rate of plastic deformation is shown to be a function of the deviatoric portion of the Cauchy stress tensor as well as two scalar internal variables. The scalar internal variables, which are the dislocation density and mobile fraction, are governed by rate equations which reflect the evolution of microstructural processes. The model has been incorporated into a two dimensional finite element code and several example multidimensional problems are presented which exhibit the rate dependence of the material model

  11. A CVAR scenario for a standard monetary model using theory-consistent expectations

    DEFF Research Database (Denmark)

    Juselius, Katarina

    2017-01-01

    A theory-consistent CVAR scenario describes a set of testable regularities capturing basic assumptions of the theoretical model. Using this concept, the paper considers a standard model for exchange rate determination and shows that all assumptions about the model's shock structure and steady...

  12. Testing linear growth rate formulas of non-scale endogenous growth models

    NARCIS (Netherlands)

    Ziesemer, Thomas

    2017-01-01

    Endogenous growth theory has produced formulas for steady-state growth rates of income per capita which are linear in the growth rate of the population. Depending on the details of the models, slopes and intercepts are positive, zero or negative. Empirical tests have taken over the assumption of

  13. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  14. Micromechanical modeling of rate-dependent behavior of Connective tissues.

    Science.gov (United States)

    Fallah, A; Ahmadian, M T; Firozbakhsh, K; Aghdam, M M

    2017-03-07

    In this paper, a constitutive and micromechanical model for the prediction of the rate-dependent behavior of connective tissues (CTs) is presented. Connective tissues are considered as nonlinear viscoelastic materials. The rate-dependent behavior of CTs is incorporated into the model using the well-known quasi-linear viscoelasticity (QLV) theory. A planar wavy representative volume element (RVE) is considered based on histological evidence of the tissue microstructure. The model parameters are identified based on experiments available in the literature. The presented constitutive model is implemented in ABAQUS by means of a UMAT subroutine. Results show that monotonic uniaxial test predictions of the presented model at different strain rates for rat tail tendon (RTT) and human patellar tendon (HPT) are in good agreement with experimental data. Results of incremental stress-relaxation tests are also presented to investigate both the instantaneous and the viscoelastic behavior of connective tissues. Copyright © 2017 Elsevier Ltd. All rights reserved.
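
    A common way to realize the QLV idea mentioned above is a hereditary integral of the instantaneous elastic response convolved with a reduced relaxation function. The sketch below discretizes that integral with a one-term Prony series and an exponential elastic law; the functional forms and parameter values are illustrative assumptions, not the identified tissue constants or the UMAT implementation.

```python
# Discrete quasi-linear viscoelasticity (QLV) hereditary integral with a one-term
# Prony reduced relaxation function and an exponential instantaneous elastic law.
# Parameter values are illustrative, not fitted tissue constants.
import numpy as np

A, B = 1.0, 30.0     # instantaneous response sigma_e = A*(exp(B*eps) - 1)   (assumed)
g1, tau1 = 0.4, 1.0  # Prony term: G(t) = (1 - g1) + g1*exp(-t/tau1)         (assumed)

def qlv_stress(times, strain):
    """Convolve increments of the elastic response with the relaxation function G."""
    sigma_e = A * (np.exp(B * strain) - 1.0)
    d_sigma_e = np.diff(sigma_e, prepend=sigma_e[0])
    stress = np.zeros_like(times)
    for i, t in enumerate(times):
        G = (1.0 - g1) + g1 * np.exp(-(t - times[: i + 1]) / tau1)
        stress[i] = np.sum(G * d_sigma_e[: i + 1])
    return stress

t = np.linspace(0.0, 5.0, 501)
eps = np.minimum(0.05 * t, 0.05)             # ramp to 5% strain, then hold (relaxation)
sigma = qlv_stress(t, eps)
print(f"peak stress {sigma.max():.3f}, relaxed stress {sigma[-1]:.3f}")
```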

  15. A Membrane Model from Implicit Elasticity Theory

    Science.gov (United States)

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079

  16. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Dudas, E [Orsay, LPT (France)

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  17. Death Rates in the Calorie Model

    Directory of Open Access Journals (Sweden)

    Martin Machay

    2016-01-01

    The Calorie model unifies the Classical demand and supply in the food market and hence solves the major problem of the Classical stationary state. It is, hence, a formalization of the Classical theory of population. The model does not reflect the imperfections of reality mentioned by Malthus himself. It is the aim of this brief paper to relax some of the strong assumptions of the Calorie model to make it more realistic. As the results show, the political economists were correct: death resulting from malnutrition can occur well before the stationary state itself. Moreover, progressive and retrograde movements can be easily described by the death rate derived in the paper. JEL Classification: J11, Q11, Q15, Q21, Y90.

  18. Macromolecular Rate Theory (MMRT) Provides a Thermodynamics Rationale to Underpin the Convergent Temperature Response in Plant Leaf Respiration

    Science.gov (United States)

    Liang, L. L.; Arcus, V. L.; Heskel, M.; O'Sullivan, O. S.; Weerasinghe, L. K.; Creek, D.; Egerton, J. J. G.; Tjoelker, M. G.; Atkin, O. K.; Schipper, L. A.

    2017-12-01

    Temperature is a crucial factor in determining the rates of ecosystem processes such as leaf respiration (R) – the flux of plant-respired carbon dioxide (CO2) from leaves to the atmosphere. Generally, respiration rate increases exponentially with temperature, as modelled by the Arrhenius equation, but a recent study (Heskel et al., 2016) showed a universally convergent temperature response of R using an empirical exponential/polynomial model whereby the exponent in the Arrhenius model is replaced by a quadratic function of temperature. The exponential/polynomial model has been used elsewhere to describe shoot respiration and plant respiration. What are the principles that underlie these empirical observations? Here, we demonstrate that macromolecular rate theory (MMRT), based on transition state theory for chemical kinetics, is equivalent to the exponential/polynomial model. We re-analyse the data from Heskel et al. (2016) using MMRT to show this equivalence and thus provide a thermodynamic explanation for the convergent temperature response of R. Using statistical tools, we also show the equivalent explanatory power of MMRT compared with the exponential/polynomial model and the superiority of both of these models over the Arrhenius function. Three meaningful parameters emerge from the MMRT analysis: the temperature at which the rate of respiration is maximum (the so-called optimum temperature, Topt), the temperature at which the respiration rate is most sensitive to changes in temperature (the inflection temperature, Tinf) and the overall curvature of the log(rate) versus temperature plot (the so-called change in heat capacity for the system, ΔC‡p). The latter term originates from the change in heat capacity between an enzyme-substrate complex and an enzyme transition-state complex in enzyme-catalysed metabolic reactions. From MMRT, we find the average Topt and Tinf of R are 67.0±1.2 °C and 41.4±0.7 °C across global sites. The average curvature (average
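
    To make the role of the heat-capacity term concrete, the sketch below evaluates the standard MMRT rate expression for an assumed parameter set and locates the optimum and inflection temperatures numerically. The activation enthalpy, entropy, and heat capacity values are invented for illustration and are not the fitted values from the global respiration data.

```python
# Macromolecular rate theory (MMRT) rate expression with an assumed parameter set;
# shows how T_opt and T_inf emerge from a negative heat capacity of activation.
# dH, dS, dCp below are illustrative assumptions, not the paper's fitted values.
import numpy as np

kB, h, R = 1.380649e-23, 6.62607015e-34, 8.314462618   # SI constants
T0 = 298.15                                             # reference temperature, K
dH, dS, dCp = 5.0e4, -5.0e1, -1.5e3   # J/mol, J/(mol K), J/(mol K)  (assumed)

def ln_rate(T):
    dH_T = dH + dCp * (T - T0)            # temperature-dependent activation enthalpy
    dS_T = dS + dCp * np.log(T / T0)      # temperature-dependent activation entropy
    return np.log(kB * T / h) - dH_T / (R * T) + dS_T / R

T = np.linspace(273.15, 345.15, 2000)
k = np.exp(ln_rate(T))
T_opt = T[np.argmax(k)]                    # temperature of maximum rate
T_inf = T[np.argmax(np.gradient(k, T))]    # temperature of maximum sensitivity dk/dT
print(f"T_opt ~ {T_opt - 273.15:.1f} C, T_inf ~ {T_inf - 273.15:.1f} C")
```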

  19. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  20. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Saleur, H.

    1988-01-01

    Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The determination of the central charge, critical exponents and torus partition function using renormalization group arguments is shown. The quantum group structure of the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied

  1. Extension of a Kinetic-Theory Approach for Computing Chemical-Reaction Rates to Reactions with Charged Particles

    Science.gov (United States)

    Liechty, Derek S.; Lewis, Mark J.

    2010-01-01

    Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.

  2. Rate Theory for Correlated Processes: Double Jumps in Adatom Diffusion

    DEFF Research Database (Denmark)

    Jacobsen, J.; Jacobsen, Karsten Wedel; Sethna, J.

    1997-01-01

    We study the rate of activated motion over multiple barriers, in particular the correlated double jump of an adatom diffusing on a missing-row reconstructed platinum (110) surface. We develop a transition path theory, showing that the activation energy is given by the minimum-energy trajectory which succeeds in the double jump. We explicitly calculate this trajectory within an effective-medium molecular dynamics simulation. A cusp in the acceptance region leads to a √T prefactor for the activated rate of double jumps. Theory and numerical results agree.

  3. Putting Reaction Rates and Collision Theory in the Hands of Your Students.

    Science.gov (United States)

    Evenson, Andy

    2002-01-01

    Describes a simulation that can be used to give concrete analogies of collision theory and the factors that affect reaction rates including temperature, concentration, catalyst, and molecular orientation. The simulation works best if done as an introduction to the concepts to help prevent misconceptions about reaction rates and collision theory.…

  4. Investigating dislocation motion through a field of solutes with atomistic simulations and reaction rate theory

    International Nuclear Information System (INIS)

    Saroukhani, S.; Warner, D.H.

    2017-01-01

    The rate of thermally activated dislocation motion across a field of solutes is studied using traditional and modern atomistically informed rate theories. First, the accuracy of popular variants of the Harmonic Transition State Theory, as the most common approach, is examined by comparing predictions to direct MD simulations. It is shown that HTST predictions are grossly inaccurate due to the anharmonic effect of thermal softening. Next, the utility of the Transition Interface Sampling was examined as the method was recently shown to be effective for predicting the rate of dislocation-precipitate interactions. For dislocation-solute interactions studied here, TIS is found to be accurate only when the dislocation overcomes multiple obstacles at a time, i.e. jerky motion, and it is inaccurate in the unpinning regime where the energy barrier is of diffusive nature. It is then shown that the Partial Path TIS method - designed for diffusive barriers - provides accurate predictions in the unpinning regime. The two methods are then used to study the temperature and load dependence of the rate. It is shown that Meyer-Neldel (MN) rule prediction of the entropy barrier is not as accurate as it is in the case of dislocation-precipitate interactions. In response, an alternative model is proposed that provides an accurate prediction of the entropy barrier. This model can be combined with TST to offer an attractively simple rate prediction approach. Lastly, (PP)TIS is used to predict the Strain Rate Sensitivity (SRS) factor at experimental strain rates and the predictions are compared to experimental values.

  5. Gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Witten, E.

    1989-01-01

    Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)

  6. Absence of saturation of void growth in rate theory with anisotropic diffusion

    CERN Document Server

    Hudson, T S; Sutton, A P

    2002-01-01

    We present a first attempt at solving the problem of the growth of a single void in the presence of anisotropically diffusing radiation-induced self-interstitial atom (SIA) clusters. In order to treat a distribution of voids, we perform ensemble averaging over the positions of the void centres using a mean-field approximation. In this way we are able to model physical situations in between the Standard Rate Theory (SRT) treatment of swelling (isotropic diffusion) and the purely one-dimensional diffusion of clusters in the Production Bias Model. The background absorption by dislocations is, however, treated isotropically, with a bias for interstitial cluster absorption assumed similar to that of individual SIAs. We find that for moderate anisotropy, unsaturated void growth is characteristic of this anisotropic diffusion of clusters. In addition, we obtain a higher initial void swelling rate than predicted by SRT whenever the diffusion is anisotropic.

  7. Classical nucleation theory in the phase-field crystal model.

    Science.gov (United States)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes places, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation takes place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.

  9. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available, a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures…

  10. A macro-physics model of depreciation rate in economic exchange

    Science.gov (United States)

    Marmont Lobo, Rui F.; de Sousa, Miguel Rocha

    2014-02-01

    This article aims at a new approach to a known fundamental result: barter or trade increases economic value. It successfully bridges the gap between the theory of value and the exchange process attached to the transition from endowments to the equilibrium in the core and contract curve. First, we summarise the theory of value; in Section 2, we present the Edgeworth (1881) box and an axiomatic approach; and in Section 3, we apply our pure exchange model. Finally (in Section 4), using our open econo-physics pure barter (EPB) model, we derive an improvement in value, which means that pure barter leads to a decline in the depreciation rate.

  11. Using SAS PROC MCMC for Item Response Theory Models

    Science.gov (United States)

    Ames, Allison J.; Samonte, Kelli

    2015-01-01

    Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…

  12. What determines crime rates? An empirical test of integrated economic and sociological theories of criminal behavior

    NARCIS (Netherlands)

    Engelen, Peter Jan; Lander, Michel W.; van Essen, Marc

    Research on crime has by no means reached a definitive conclusion on which factors are related to crime rates. We contribute to the crime literature by providing an integrated empirical model of economic and sociological theories of criminal behavior and by using a very comprehensive set of

  13. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  14. Calculation of variations in reaction rates using generalized perturbation theory

    International Nuclear Information System (INIS)

    Silva, F.C. da.

    1981-02-01

    A perturbation expression was developed to calculate the variations in the rates of integral parameters (such as reaction rates) of a reactor using Time-Independent Generalized Perturbation Theory. This theory makes use of the concepts of neutron generation and neutron importance with respect to a given process occurring in a system. Time-Dependent Generalized Perturbation Theory is then applied to the calculation of burnup, using the expressions derived by A. Gandini together with the perturbation expression obtained from the Time-Independent Generalized Perturbation Theory. (Author) [pt]

  15. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

    Our paper centres on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We discuss the scientific principles of institutional modelling that are increasingly postulated by the classics of institutional theory and that find their way into the basics of institutional economics. We propose scientific ideas concerning new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them to formalise our results and maximise the practical outcome of the paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institution-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  16. Generalized Rate Theory for Void and Bubble Swelling and its Application to Plutonium Metal Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Allen, P. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wolfer, W. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-16

    In the classical rate theory for void swelling, vacancies and self-interstitials are produced by radiation in equal numbers, and in addition, thermal vacancies are also generated at the sinks, primarily at edge dislocations, at voids, and at grain boundaries. In contrast, due to the high formation energy of self-interstitials in normal metals and alloys, their thermal generation is negligible, as pointed out by Bullough and Perrin. However, recent DFT calculations of the formation energy of self-interstitial atoms in bcc metals have revealed that the sum of formation and migration energies for self-interstitial atoms (SIA) is of the same order of magnitude as for vacancies. The ratio of the activation energies for thermal generation of SIA and vacancies is presented. For fcc metals, this ratio is around three, but for bcc metals it is around 1.5. Reviewing theoretical predictions of point defect properties in δ-Pu, this ratio could possibly be less than one. As a result, thermal generation of SIA in bcc metals and in plutonium must be taken into consideration when modeling the growth of voids and of helium bubbles, and the classical rate theory (CRT) for void and bubble swelling must be extended to a generalized rate theory (GRT).

  17. Warped models in string theory

    International Nuclear Information System (INIS)

    Acharya, B.S.; Benini, F.; Valandro, R.

    2006-12-01

    Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)

  18. Attaining the rate-independent limit of a rate-dependent strain gradient plasticity theory

    DEFF Research Database (Denmark)

    El-Naaman, Salim Abdallah; Nielsen, Kim Lau; Niordson, Christian Frithiof

    2016-01-01

    The existence of characteristic strain rates in rate-dependent material models, corresponding to rate-independent model behavior, is studied within a back stress based rate-dependent higher order strain gradient crystal plasticity model. Such characteristic rates have recently been observed...... for steady-state processes, and the present study aims to demonstrate that the observations in fact unearth a more widespread phenomenon. In this work, two newly proposed back stress formulations are adopted to account for the strain gradient effects in the single slip simple shear case, and characteristic...... rates for a selected quantity are identified through numerical analysis. Evidently, the concept of a characteristic rate, within the rate-dependent material models, may help unlock an otherwise inaccessible parameter space....

  19. To Save or to Consume: Linking Growth Theory with the Keynesian Model

    Science.gov (United States)

    Kwok, Yun-kwong

    2007-01-01

    In the neoclassical growth theory, higher saving rate gives rise to higher output per capita. However, in the Keynesian model, higher saving rate causes lower consumption, which may lead to a recession. Students may ask, "Should we save or should we consume?" In most of the macroeconomics textbooks, economic growth and Keynesian economics are in…

  20. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  1. On low rank classical groups in string theory, gauge theory and matrix models

    International Nuclear Information System (INIS)

    Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun

    2004-01-01

    We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature

  2. Bayes estimation of the general hazard rate model

    International Nuclear Information System (INIS)

    Sarhan, A.

    1999-01-01

    In reliability theory and life testing models, lifetime distributions are often specified by choosing a relevant hazard rate function. Here a general hazard rate function h(t) = a + b·t^(c-1), where a, b, c are constants greater than zero, is considered. The parameter c is assumed to be known. The Bayes estimators of (a,b) based on data from type II/item-censored testing without replacement are obtained. A large Monte Carlo simulation study is carried out to compare the performance of the Bayes estimators with regression estimators of (a,b). The criterion for comparison is the Bayes risk associated with the respective estimator. Also, the influence of the number of failed items on the accuracy of the estimators (Bayes and regression) is investigated. Estimates of the parameters (a,b) of the linearly increasing hazard rate model h(t) = a + b·t, where a, b are greater than zero, can be obtained as a special case by letting c = 2
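
    A minimal numerical sketch of the hazard model in this record, using illustrative (assumed) parameter values: the cumulative hazard H(t) = a·t + b·t^c/c and the survival function S(t) = exp(-H(t)) follow directly from h(t) = a + b·t^(c-1).

      import numpy as np

      def hazard(t, a, b, c):
          # h(t) = a + b * t**(c - 1), with a, b, c > 0
          return a + b * t ** (c - 1.0)

      def survival(t, a, b, c):
          # S(t) = exp(-H(t)), with cumulative hazard H(t) = a*t + b*t**c / c
          return np.exp(-(a * t + b * t ** c / c))

      # Illustrative parameters (assumed); c = 2 gives the linear hazard h(t) = a + b*t.
      a, b, c = 0.5, 0.2, 2.0
      t = np.linspace(0.0, 5.0, 6)
      print(hazard(t, a, b, c))
      print(survival(t, a, b, c))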

  3. Generalizability theory and item response theory

    OpenAIRE

    Glas, Cornelis A.W.; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item response theory is usually applied to items with a selected-response format, such as multiple choice items, whereas generalizability theory is usually applied to constructed-response tasks assessed by raters. However, in many situations, raters may use rating scales consisting of items with a selected-response format. This chapter presents a short overview of how item response theory and generalizability theory were integrated to model such assessments. Further, the precision of the esti...

  4. Mechanism of Strain Rate Effect Based on Dislocation Theory

    International Nuclear Information System (INIS)

    Kun, Qin; Shi-Sheng, Hu; Li-Ming, Yang

    2009-01-01

    Based on dislocation theory, we investigate the mechanism of the strain rate effect. The strain rate effect and dislocation motion are bridged by Orowan's relationship, and the stress dependence of dislocation velocity is taken as the dynamic relationship of dislocation motion. The mechanism of the strain rate effect is then investigated qualitatively by using these two relationships, although the kinematic relationship of dislocation motion is absent due to the complicated modes of dislocation motion. The process of the strain rate effect is interpreted and some details of the strain rate effect are discussed. The present analyses agree with the existing experimental results. Based on the analyses, we propose that strain rate criteria rather than stress criteria should be satisfied when a metal is fully yielded at a given strain rate. (condensed matter: structure, mechanical and thermal properties)
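
    The bridge between the macroscopic strain rate and dislocation motion mentioned above is Orowan's relationship; a minimal sketch with assumed (hypothetical) values:

      # Orowan's relationship: plastic shear strain rate = rho_m * b * v_bar, where rho_m is
      # the mobile dislocation density, b the Burgers vector magnitude and v_bar the mean
      # dislocation velocity (which itself depends on stress).
      rho_m = 1.0e12    # mobile dislocation density, 1/m^2 (assumed)
      b = 2.5e-10       # Burgers vector magnitude, m (typical for metals)
      v_bar = 1.0e-2    # mean dislocation velocity, m/s (assumed)

      gamma_dot = rho_m * b * v_bar
      print(f"plastic shear strain rate = {gamma_dot:.2e} 1/s")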

  5. Divided Saddle Theory: A New Idea for Rate Constant Calculation.

    Science.gov (United States)

    Daru, János; Stirling, András

    2014-03-11

    We present a theory of rare events and derive an algorithm to obtain rates from postprocessing the numerical data of a free energy calculation and the corresponding committor analysis. The formalism is based on the division of the saddle region of the free energy profile of the rare event into two adjacent segments called saddle domains. The method is built on sampling the dynamics within these regions: auxiliary rate constants are defined for the saddle domains and the absolute forward and backward rates are obtained by proper reweighting. We call our approach divided saddle theory (DST). An important advantage of our approach is that it requires only standard computational techniques which are available in most molecular dynamics codes. We demonstrate the potential of DST numerically on two examples: rearrangement of alanine-dipeptide (CH3CO-Ala-NHCH3) conformers and the intramolecular Cope reaction of the fluxional barbaralane molecule.
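
    The record does not reproduce the DST rate expression itself; for orientation only, below is a minimal sketch of a conventional one-dimensional transition-state-theory rate computed from a free-energy profile and corrected by a transmission coefficient. The profile, mass and transmission coefficient are assumptions for illustration, not the authors' data.

      import numpy as np

      kB_T = 2.494       # thermal energy at ~300 K, kJ/mol
      mass = 1.0         # effective mass along q (assumed; units taken consistent)
      beta = 1.0 / kB_T

      # Hypothetical double-well free-energy profile F(q) with the barrier at q = 0.
      q = np.linspace(-2.0, 2.0, 4001)
      dq = q[1] - q[0]
      F = 10.0 * (q ** 2 - 1.0) ** 2          # kJ/mol, wells at q = -1 and q = +1

      # One-dimensional classical TST: flux through the dividing surface at q = 0
      # divided by the reactant-side configurational integral (q < 0).
      F_barrier = F[np.argmin(np.abs(q))]
      Z_reactant = np.sum(np.exp(-beta * F[q < 0.0])) * dq
      k_tst = np.sqrt(kB_T / (2.0 * np.pi * mass)) * np.exp(-beta * F_barrier) / Z_reactant

      kappa = 0.4        # transmission coefficient from committor/reactive-flux data (assumed)
      print("k_TST =", k_tst, " corrected rate =", kappa * k_tst)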

  6. Rate theory scenarios study on fission gas behavior of U3Si2 under LOCA conditions in LWRs

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Yinbin; Gamble, Kyle A.; Andersson, David; Mei, Zhi-Gang; Yacout, Abdellatif M.

    2018-01-01

    Fission gas behavior of U3Si2 under various loss-of-coolant accident (LOCA) conditions in light water reactors (LWRs) was simulated using rate theory. A rate theory model for U3Si2 that covers both steady-state operation and power transients was developed for the GRASS-SST code based on existing research reactor/ion irradiation experimental data and theoretical predictions of density functional theory (DFT) calculations. The steady-state and LOCA condition parameters were either directly provided or inspired by BISON simulations. Due to the absence of in-pile experiment data for U3Si2's fuel performance under LWR conditions at this stage of accident tolerant fuel (ATF) development, a variety of LOCA scenarios were taken into consideration to comprehensively and conservatively evaluate the fission gas behavior of U3Si2 during a LOCA.

  7. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  8. Modeling inflation rates and exchange rates in Ghana: application of multivariate GARCH models.

    Science.gov (United States)

    Nortey, Ezekiel Nn; Ngoh, Delali D; Doku-Amponsah, Kwabena; Ofori-Boateng, Kenneth

    2015-01-01

    This paper investigates the volatility and conditional relationships among inflation rates, exchange rates and interest rates, and constructs models using the multivariate GARCH DCC and BEKK specifications with Ghanaian data from January 1990 to December 2013. The study revealed that the cumulative depreciation of the cedi against the US dollar from 1990 to 2013 was 7,010.2% and the yearly weighted depreciation of the cedi against the US dollar for the period was 20.4%. There was evidence that a stable inflation rate does not imply that exchange rates and interest rates will be stable. Rather, when the cedi performs well on the forex market, inflation rates and interest rates react positively and become stable in the long run. The BEKK model is robust for modelling and forecasting the volatility of inflation rates, exchange rates and interest rates. The DCC model is robust for modelling the conditional and unconditional correlations among inflation rates, exchange rates and interest rates. The BEKK model, which forecasted high exchange rate volatility for the year 2014, is very robust for modelling exchange rates in Ghana. The mean equation of the DCC model is also robust for forecasting inflation rates in Ghana.
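
    As a reading aid, a minimal sketch of the bivariate BEKK(1,1) conditional-covariance recursion of the kind estimated in the paper; the parameter matrices below are assumptions for illustration, not the authors' estimates.

      import numpy as np

      # Bivariate BEKK(1,1): H_t = C C' + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B
      C = np.array([[0.10, 0.00],
                    [0.02, 0.08]])     # lower-triangular constant term (assumed)
      A = np.array([[0.30, 0.05],
                    [0.05, 0.25]])     # ARCH loadings (assumed)
      B = np.array([[0.90, 0.02],
                    [0.02, 0.92]])     # GARCH loadings (assumed)

      rng = np.random.default_rng(0)
      T = 500
      H = np.cov(rng.standard_normal((2, 100)))    # initial covariance
      eps = np.zeros(2)
      returns = np.zeros((T, 2))

      for t in range(T):
          H = C @ C.T + A.T @ np.outer(eps, eps) @ A + B.T @ H @ B
          eps = np.linalg.cholesky(H) @ rng.standard_normal(2)   # shocks with covariance H_t
          returns[t] = eps

      print("sample unconditional covariance:\n", np.cov(returns.T))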

  9. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  10. Factors influencing variation in physician adenoma detection rates: a theory-based approach for performance improvement.

    Science.gov (United States)

    Atkins, Louise; Hunkeler, Enid M; Jensen, Christopher D; Michie, Susan; Lee, Jeffrey K; Doubeni, Chyke A; Zauber, Ann G; Levin, Theodore R; Quinn, Virginia P; Corley, Douglas A

    2016-03-01

    Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful, and there are few data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates by using theory-based tools for understanding behavior. We separately studied gastroenterologists and endoscopy nurses at 3 Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability by using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Nine factors potentially associated with adenoma detection rate variability were identified, including 6 related to capability (uncertainty about which types of polyps to remove, style of endoscopy team leadership, compromised ability to focus during an examination due to distractions, examination technique during withdrawal, difficulty detecting certain types of adenomas, and examiner fatigue and pain), 2 related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and 1 related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. By using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  11. A new safety assessment model for shallow land burial of LLW based on multicomponent sorption theory

    International Nuclear Information System (INIS)

    Katoh, N.; Asano, T.; Tasaka, H.

    1984-01-01

    A new model of radionuclide migration in the underground environment is developed based on "multicomponent sorption theory". The model is capable of predicting the behavior of coexisting materials in the soil-groundwater system as "multicomponent sorption phenomena" and also of predicting the radionuclide migration affected by changes in the concentrations of coexisting materials. The model is not a "statistical model" but a "chemical model" based on ion exchange theory and adsorption theory. Additionally, the model is a "kinetic model" capable of estimating the effect of the rate of sorption on the radionuclide migration. The validity of the model was checked against the results of column sorption experiments. Finally, sample calculations of radionuclide migration at a reference shallow land burial site were carried out for demonstration.

  12. Rate theory of solvent exchange and kinetics of Li(+) - BF4 (-)/PF6 (-) ion pairs in acetonitrile.

    Science.gov (United States)

    Dang, Liem X; Chang, Tsun-Mei

    2016-09-07

    In this paper, we describe our efforts to apply rate theories to studies of solvent exchange around Li(+) and the kinetics of ion pairing in lithium-ion batteries (LIBs). We report one of the first computer simulations of the exchange dynamics around solvated Li(+) in acetonitrile (ACN), which is a common solvent used in LIBs. We also provide details of the ion-pairing kinetics of Li(+)-[BF4] and Li(+)-[PF6] in ACN. Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ACN exchange process between the first and second solvation shells around Li(+). We calculate exchange rates using transition state theory and weight them with the transmission coefficients determined by the reactive flux approach, the Impey-Madden-McDonald approach, and Grote-Hynes theory. We found the relaxation times changed from 180 ps to 4600 ps and from 30 ps to 280 ps for the Li(+)-[BF4] and Li(+)-[PF6] ion pairs, respectively. These results confirm that the solvent response to the kinetics of ion pairing is significant. Our results also show that, in addition to affecting the free energy of solvation in ACN, the anion type should also significantly influence the kinetics of ion pairing. These results will increase our understanding of the thermodynamic and kinetic properties of LIB systems.

  13. Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks

    International Nuclear Information System (INIS)

    Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.

    2005-01-01

    Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)

  14. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  15. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  16. THE EVOLUTION OF CURRENCY RELATIONS IN THE LIGHT OF MAJOR EXCHANGE RATE ADJUSTMENT THEORIES

    Directory of Open Access Journals (Sweden)

    Sergiy TKACH

    2014-07-01

    This paper examines the impact of the major exchange rate adjustment theories on the global monetary system. The reasons for the collapse of earlier forms of organization of monetary relations at the global level are identified. The main achievements and failures of the major exchange rate theories are described.

  17. Modeling and Predicting the EUR/USD Exchange Rate: The Role of Nonlinear Adjustments to Purchasing Power Parity

    OpenAIRE

    Jesús Crespo Cuaresma; Anna Orthofer

    2010-01-01

    Reliable medium-term forecasts are essential for forward-looking monetary policy decision-making. Traditionally, predictions of the exchange rate tend to be linked to the equilibrium concept implied by the purchasing power parity (PPP) theory. In particular, the traditional benchmark for exchange rate models is based on a linear adjustment of the exchange rate to the level implied by PPP. In the presence of aggregation effects, transaction costs or uncertainty, however, economic theory predict...

  18. Application of decision-making theory to the regulation of muscular work rate during self-paced competitive endurance activity.

    Science.gov (United States)

    Renfree, Andrew; Martin, Louise; Micklewright, Dominic; St Clair Gibson, Alan

    2014-02-01

    Successful participation in competitive endurance activities requires continual regulation of muscular work rate in order to maximise physiological performance capacities, meaning that individuals must make numerous decisions with regards to the muscular work rate selected at any point in time. Decisions relating to the setting of appropriate goals and the overall strategic approach to be utilised are made prior to the commencement of an event, whereas tactical decisions are made during the event itself. This review examines current theories of decision-making in an attempt to explain the manner in which regulation of muscular work is achieved during athletic activity. We describe rational and heuristic theories, and relate these to current models of regulatory processes during self-paced exercise in an attempt to explain observations made in both laboratory and competitive environments. Additionally, we use rational and heuristic theories in an attempt to explain the influence of the presence of direct competitors on the quality of the decisions made during these activities. We hypothesise that although both rational and heuristic models can plausibly explain many observed behaviours in competitive endurance activities, the complexity of the environment in which such activities occur would imply that effective rational decision-making is unlikely. However, at present, many proposed models of the regulatory process share similarities with rational models. We suggest enhanced understanding of the decision-making process during self-paced activities is crucial in order to improve the ability to understand regulation of performance and performance outcomes during athletic activity.

  19. Dark matter relics and the expansion rate in scalar-tensor theories

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, Bhaskar; Jimenez, Esteban [Mitchell Institute for Fundamental Physics and Astronomy, Department of Physics and Astronomy, Texas A and M University, College Station, TX 77843 (United States); Zavala, Ivonne, E-mail: dutta@physics.tamu.edu, E-mail: este1985@physics.tamu.edu, E-mail: e.i.zavalacarrasco@swansea.ac.uk [Department of Physics, Swansea University, Singleton Park, Swansea, SA2 8PP (United Kingdom)

    2017-06-01

    We study the impact of a modified expansion rate on the dark matter relic abundance in a class of scalar-tensor theories. The scalar-tensor theories we consider are motivated by string theory constructions, which have matter conformally as well as disformally coupled to the scalar. We investigate the effects of such a conformal coupling on the dark matter relic abundance for a wide range of initial conditions, masses and cross-sections. We find that, exploiting all possible initial conditions, the annihilation cross-section required to satisfy the dark matter content can differ from the thermally averaged cross-section of the standard case. We also study the expansion rate in the disformal case and find that physically relevant solutions require a nontrivial relation between the conformal and disformal functions. We study the effects of the disformal coupling in an explicit example where the disformal function is quadratic.

  20. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic (particularly set theory and model theory) have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  1. Neutron Star Models in Alternative Theories of Gravity

    Science.gov (United States)

    Manolidis, Dimitrios

    We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two-body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.
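
    For orientation, a minimal sketch of the general-relativistic stellar-structure (Tolman-Oppenheimer-Volkoff) equations integrated for a simple polytrope in geometrized units (G = c = 1). The polytropic constants and central density are assumptions for illustration; the scalar-tensor and f(R) modifications studied in the thesis are not included.

      import numpy as np

      K, Gamma = 100.0, 2.0          # polytropic EOS p = K * rho**Gamma (assumed constants)

      def rho_of_p(p):
          return (max(p, 0.0) / K) ** (1.0 / Gamma)

      def tov_rhs(r, p, m):
          # dp/dr and dm/dr from the TOV equations in geometrized units.
          rho = rho_of_p(p)
          if r == 0.0:
              return 0.0, 0.0
          dpdr = -(rho + p) * (m + 4.0 * np.pi * r ** 3 * p) / (r * (r - 2.0 * m))
          dmdr = 4.0 * np.pi * r ** 2 * rho
          return dpdr, dmdr

      # Simple fixed-step Euler integration outward from the centre.
      dr = 1.0e-3
      r, m = 0.0, 0.0
      p = K * (1.28e-3) ** Gamma     # central density ~1.28e-3 in geometrized units (assumed)
      while p > 1.0e-12:
          dpdr, dmdr = tov_rhs(r, p, m)
          p, m, r = p + dpdr * dr, m + dmdr * dr, r + dr

      print(f"radius ~ {r:.2f}, mass ~ {m:.3f} (geometrized units)")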

  2. Gross domestic product growth rates as confined Lévy flights: Towards a unifying theory of economic growth rate fluctuations

    Science.gov (United States)

    Lera, Sandro Claudio; Sornette, Didier

    2018-01-01

    A model that combines economic growth rate fluctuations at the microscopic and macroscopic levels is presented. At the microscopic level, firms are growing at different rates while also being exposed to idiosyncratic shocks at the firm and sector levels. We describe such fluctuations as independent Lévy-stable fluctuations, varying over multiple orders of magnitude. These fluctuations are aggregated and measured at the macroscopic level in averaged economic output quantities such as GDP. A fundamental question is thereby to what extent individual firm size fluctuations can have a noticeable impact on the overall economy. We argue that this question can be answered by considering the Lévy fluctuations as embedded in a steep confining potential well, ensuring nonlinear mean-reversal behavior, without having to rely on microscopic details of the system. The steepness of the potential well directly controls the extent to which idiosyncratic shocks to firms and sectors are damped at the level of the economy. Additionally, the theory naturally accounts for business cycles, represented in terms of a bimodal economic output distribution and thus connects two so far unrelated fields in economics. By analyzing 200 years of U.S. gross domestic product growth rates, we find that the model is in good agreement with the data.
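
    A minimal simulation sketch of the mechanism described above: heavy-tailed (Lévy-stable) idiosyncratic shocks embedded in a steep confining potential that produces nonlinear mean reversion. The stability index, potential steepness and step size are all assumptions for illustration, not fitted GDP parameters.

      import numpy as np
      from scipy.stats import levy_stable

      alpha = 1.5       # Lévy stability index (assumed, 0 < alpha <= 2)
      dt = 0.01
      steps = 20000
      k = 5.0           # steepness of the confining quartic potential V(x) = k*x**4/4 (assumed)

      x = np.zeros(steps)
      jumps = levy_stable.rvs(alpha, 0.0, size=steps, random_state=42) * dt ** (1.0 / alpha)
      for t in range(1, steps):
          # Lévy jump followed by a semi-implicit relaxation step against the drift -k*x**3;
          # the implicit treatment keeps occasional very large jumps numerically stable.
          x_jump = x[t - 1] + jumps[t]
          x[t] = x_jump / (1.0 + k * x_jump ** 2 * dt)

      print("fraction of steps beyond 3 sample std:", np.mean(np.abs(x) > 3.0 * np.std(x)))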

  3. Chartist Trading in Exchange Rate Theory

    OpenAIRE

    Selander, Carina

    2006-01-01

    This thesis consists of four papers, of which papers 1 and 4 are co-written with Mikael Bask. Paper [1] implements chartist trading in a sticky-price monetary model for determining the exchange rate. It is demonstrated that chartists cause the exchange rate to "overshoot the overshooting equilibrium" of a sticky-price monetary model. Chartists base their trading on a short-long moving average. The importance of technical trading depends inversely on the time horizon in currency trade. The exc...

  4. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  5. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  6. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in a series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for stochastic performance evaluation of degraded nuclear piping systems. It was accomplished here in three distinct stages. First, a statistical analysis was conducted to characterize various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as material properties of pipe, crack morphology variables, and location of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate performance of degraded piping systems. It is based on accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, statistical characterization of various input variables, and state-of-the-art methods of modern structural reliability theory. From this model, the conditional probability of failure as a function of leak-rate detection capability of the piping systems can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. Results clearly showed that the model provides satisfactory estimates of conditional failure probability with much less computational effort when compared with those obtained from Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications.

  7. Lepton number violation in theories with a large number of standard model copies

    International Nuclear Information System (INIS)

    Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich

    2011-01-01

    We examine lepton number violation (LNV) in theories with a saturated black hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, violation of lepton number can be a potential phenomenological problem of this N-copy extension of the standard model, since, due to the low quantum gravity scale, black holes may induce TeV-scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U(1)_(B-L). Then, due to the existence of a specific compensation mechanism between contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.

  8. On the algebraic theory of kink sectors: Application to quantum field theory models and collision theory

    International Nuclear Information System (INIS)

    Schlingemann, D.

    1996-10-01

    Several two-dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon and the φ^4_2 model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which sufficient conditions must a pair of vacuum states fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)_2 models. We identify a large class of vacuum states, including the vacua of the P(φ)_2 models, the Yukawa_2-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)

  9. Analytical Modeling of the High Strain Rate Deformation of Polymer Matrix Composites

    Science.gov (United States)

    Goldberg, Robert K.; Roberts, Gary D.; Gilat, Amos

    2003-01-01

    The results presented here are part of an ongoing research program to develop strain rate dependent deformation and failure models for the analysis of polymer matrix composites subject to high strain rate impact loads. State variable constitutive equations originally developed for metals have been modified in order to model the nonlinear, strain rate dependent deformation of polymeric matrix materials. To account for the effects of hydrostatic stresses, which are significant in polymers, the classical plasticity theory definitions of effective stress and effective plastic strain are modified by applying variations of the Drucker-Prager yield criterion. To verify the revised formulation, the shear and tensile deformation of a representative toughened epoxy is analyzed across a wide range of strain rates (from quasi-static to high strain rates) and the results are compared to experimentally obtained values. For the analyzed polymers, both the tensile and shear stress-strain curves computed using the analytical model correlate well with values obtained through experimental tests. The polymer constitutive equations are implemented within a strength-of-materials-based micromechanics method to predict the nonlinear, strain rate dependent deformation of polymer matrix composites. In the micromechanics, the unit cell is divided into a number of independently analyzed slices, and laminate theory is then applied to obtain the effective deformation of the unit cell. The composite mechanics are verified by analyzing the deformation of a representative polymer matrix composite (composed using the representative polymer analyzed for the correlation of the polymer constitutive equations) for several fiber orientation angles across a variety of strain rates. The computed values compare favorably to experimentally obtained results.

  10. A harmonic transition state theory model for defect initiation in crystals

    International Nuclear Information System (INIS)

    Delph, T J; Cao, P; Park, H S; Zimmerman, J A

    2013-01-01

    We outline here a model for the initiation of defects in crystals based upon harmonic transition state theory (hTST). This model combines a previously developed model for zero-temperature defect initiation with a multi-dimensional hTST model that is capable of accurately predicting the effects of temperature and loading rate upon defect initiation. The model has several features that set it apart from previous efforts along these lines, most notably a straightforward method of determining the energy barrier between adjacent equilibrium states that does not depend upon a priori information concerning the nature of the defect. We apply the model to two examples, triaxial stretching of a perfect fcc crystal and nanoindentation of a gold substrate. Very good agreement is found between the predictions of the model and independent molecular dynamics (MD) simulations. Among other things, the model predicts a strong dependence of the defect initiation behavior upon the loading parameter. A very attractive feature of this model is that it is valid for arbitrarily slow loading rates, in particular loading rates achievable in the laboratory, and suffers from none of the limitations in this regard inherent in MD simulations. (paper)
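
    A minimal sketch of the harmonic-TST ingredient such a model builds on: an Arrhenius-type initiation rate with a load-dependent energy barrier, integrated over a constant loading rate to give the load at which defect initiation becomes likely. The barrier law and all numerical values are assumptions, not the authors' fitted parameters.

      import numpy as np

      kB = 8.617e-5          # eV/K
      nu0 = 1.0e12           # attempt frequency, 1/s (typical phonon-scale value)
      T = 300.0              # K

      def barrier(load, load_crit=10.0, dE0=1.5, p=1.5):
          # Hypothetical barrier law (eV): vanishes as the load approaches the athermal limit.
          return dE0 * max(1.0 - load / load_crit, 0.0) ** p

      def median_initiation_load(load_rate, load_max=10.0, n=5000):
          # Survival P(L) = exp(-(1/load_rate) * integral_0^L nu0*exp(-dE(l)/kB T) dl);
          # return the load at which P drops below one half.
          loads = np.linspace(0.0, load_max, n)
          rates = nu0 * np.exp(-np.array([barrier(l) for l in loads]) / (kB * T))
          cum = np.concatenate(([0.0], np.cumsum(0.5 * (rates[1:] + rates[:-1]) * np.diff(loads))))
          survival = np.exp(-cum / load_rate)
          return loads[np.argmax(survival < 0.5)]

      for rate in (1.0e-3, 1.0, 1.0e3):   # loading rates in (load units)/s
          print(rate, median_initiation_load(rate))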

  11. Effects of turbulence on the geometric collision rate of sedimenting droplets. Part 2. Theory and parameterization

    International Nuclear Information System (INIS)

    Ayala, Orlando; Rosa, Bogdan; Wang Lianping

    2008-01-01

    The effect of air turbulence on the geometric collision kernel of cloud droplets can be predicted if the effects of air turbulence on two kinematic pair statistics can be modeled. The first is the average radial relative velocity and the second is the radial distribution function (RDF). A survey of the literature shows that no theory is available for predicting the radial relative velocity of finite-inertia sedimenting droplets in a turbulent flow. In this paper, a theory for the radial relative velocity is developed, using a statistical approach assuming that gravitational sedimentation dominates the relative motion of droplets before collision. In the weak-inertia limit, the theory reveals a new term making a positive contribution to the radial relative velocity resulting from a coupling between sedimentation and air turbulence on the motion of finite-inertia droplets. The theory is compared to the direct numerical simulations (DNS) results in part 1, showing a reasonable agreement with the DNS data for bidisperse cloud droplets. For droplets larger than 30 μm in radius, a nonlinear drag (NLD) can also be included in the theory in terms of an effective inertial response time and an effective terminal velocity. In addition, an empirical model is developed to quantify the RDF. This, together with the theory for radial relative velocity, provides a parameterization for the turbulent geometric collision kernel. Using this integrated model, we find that turbulence could triple the geometric collision kernel, relative to the stagnant air case, for a droplet pair of 10 and 20 μm sedimenting through a cumulus cloud at R_λ = 2×10^4 and ε = 600 cm^2 s^-3. For the self-collisions of 20 μm droplets, the collision kernel depends sensitively on the flow dissipation rate
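
    For reference, the geometric collision kernel referred to above is commonly written as the product of the collision cross-section, the mean radial relative velocity at contact and the radial distribution function at contact; a minimal sketch with assumed values for the pair statistics (not the DNS results of the paper):

      import numpy as np

      def geometric_collision_kernel(a1, a2, mean_radial_relative_velocity, rdf_at_contact):
          # K = 2*pi*R**2 * <|w_r|> * g(R), with geometric collision radius R = a1 + a2.
          R = a1 + a2
          return 2.0 * np.pi * R ** 2 * mean_radial_relative_velocity * rdf_at_contact

      # 10 um and 20 um droplets; <|w_r|> and g(R) are illustrative assumed values.
      a1, a2 = 10.0e-6, 20.0e-6     # droplet radii, m
      w_r = 5.0e-2                  # mean radial relative velocity at contact, m/s (assumed)
      g_R = 1.8                     # radial distribution function at contact (assumed)
      print(geometric_collision_kernel(a1, a2, w_r, g_R), "m^3/s")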

  12. The single-process biochemical reaction of Rubisco: a unified theory and model with the effects of irradiance, CO₂ and rate-limiting step on the kinetics of C₃ and C₄ photosynthesis from gas exchange.

    Science.gov (United States)

    Farazdaghi, Hadi

    2011-02-01

    Photosynthesis is the origin of oxygenic life on the planet, and its models are the core of all models of plant biology, agriculture, environmental quality and global climate change. A theory is presented here, based on single process biochemical reactions of Rubisco, recognizing that: In the light, Rubisco activase helps separate Rubisco from the stored ribulose-1,5-bisphosphate (RuBP), activates Rubisco with carbamylation and addition of Mg²⁺, and then produces two products, in two steps: (Step 1) Reaction of Rubisco with RuBP produces a Rubisco-enediol complex, which is the carboxylase-oxygenase enzyme (Enco) and (Step 2) Enco captures CO₂ and/or O₂ and produces intermediate products leading to production and release of 3-phosphoglycerate (PGA) and Rubisco. PGA interactively controls (1) the carboxylation-oxygenation, (2) electron transport, and (3) triosephosphate pathway of the Calvin-Benson cycle that leads to the release of glucose and regeneration of RuBP. Initially, the total enzyme participates in the two steps of the reaction transitionally and its rate follows Michaelis-Menten kinetics. But, for a continuous steady state, Rubisco must be divided into two concurrently active segments for the two steps. This causes a deviation of the steady state from the transitional rate. Kinetic models are developed that integrate the transitional and the steady state reactions. They are tested and successfully validated with verifiable experimental data. The single-process theory is compared to the widely used two-process theory of Farquhar et al. (1980. Planta 149, 78-90), which assumes that the carboxylation rate is either Rubisco-limited at low CO₂ levels such as CO₂ compensation point, or RuBP regeneration-limited at high CO₂. Since the photosynthesis rate cannot increase beyond the two-process theory's Rubisco limit at the CO₂ compensation point, net photosynthesis cannot increase above zero in daylight, and since there is always respiration at
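
    As background for the kinetic contrast drawn above, a minimal sketch of the two rate laws being compared: a single-process Michaelis-Menten rate and a Farquhar-type Rubisco-limited carboxylation rate with competitive O2 inhibition. All parameter values are rough textbook-style assumptions for illustration, not the paper's fitted constants.

      def michaelis_menten(S, Vmax, Km):
          # v = Vmax * S / (Km + S)
          return Vmax * S / (Km + S)

      def rubisco_limited(C, O, Vcmax, Kc, Ko):
          # Farquhar-type Rubisco-limited carboxylation with competitive O2 inhibition:
          # Wc = Vcmax * C / (C + Kc * (1 + O / Ko))
          return Vcmax * C / (C + Kc * (1.0 + O / Ko))

      # Illustrative values (assumed): partial pressures in ubar, rates in umol m^-2 s^-1.
      Vcmax, Kc, Ko, O = 60.0, 270.0, 165000.0, 210000.0
      for C in (50.0, 230.0, 700.0):
          print(C, michaelis_menten(C, Vcmax, Kc), rubisco_limited(C, O, Vcmax, Kc, Ko))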

  13. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp

    2011-01-01

    We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  14. Non-linear σ-models and string theories

    International Nuclear Information System (INIS)

    Sen, A.

    1986-10-01

    The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs

  15. Tantalum strength model incorporating temperature, strain rate and pressure

    Science.gov (United States)

    Lim, Hojun; Battaile, Corbett; Brown, Justin; Lane, Matt

    Tantalum is a body-centered-cubic (BCC) refractory metal that is widely used in many applications in high temperature, strain rate and pressure environments. In this work, we propose a physically-based strength model for tantalum that incorporates effects of temperature, strain rate and pressure. A constitutive model for single crystal tantalum is developed based on dislocation kink-pair theory, and calibrated to measurements on single crystal specimens. The model is then used to predict deformations of single- and polycrystalline tantalum. In addition, the proposed strength model is implemented into Sandia's ALEGRA solid dynamics code to predict plastic deformations of tantalum in engineering-scale applications at extreme conditions, e.g. Taylor impact tests and Z machine's high pressure ramp compression tests, and the results are compared with available experimental data. Sandia National Laboratories is a multi program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
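
    A minimal sketch of the kink-pair-type thermal-activation law that strength models of this kind typically build on (a Kocks-Mecking-Argon barrier form); the parameters below are generic assumptions, not the calibrated tantalum values of this work.

      import numpy as np

      kB = 1.380649e-23      # J/K

      def plastic_strain_rate(stress, T, dH0=1.6e-19, tau0=1.0e9, p=0.5, q=1.5, rate0=1.0e7):
          # Thermally activated (kink-pair-type) flow:
          #   eps_dot = rate0 * exp(-dH0 * (1 - (stress/tau0)**p)**q / (kB * T))
          s = np.clip(stress / tau0, 0.0, 1.0)
          return rate0 * np.exp(-dH0 * (1.0 - s ** p) ** q / (kB * T))

      for T in (300.0, 600.0, 900.0):            # temperature, K
          print(T, [plastic_strain_rate(s, T) for s in (2.0e8, 5.0e8, 8.0e8)])   # stress, Pa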

  16. Adapting the Theory of Visual Attention (TVA) to model auditory attention

    DEFF Research Database (Denmark)

    Roberts, Katherine L.; Andersen, Tobias; Kyllingsbæk, Søren

    Mathematical and computational models have provided useful insights into normal and impaired visual attention, but less progress has been made in modelling auditory attention. We are developing a Theory of Auditory Attention (TAA), based on an influential visual model, the Theory of Visual Attention (TVA). We report that TVA provides a good fit to auditory data when the stimuli are closely matched to those used in visual studies. In the basic visual TVA task, participants view a brief display of letters and are asked to report either all of the letters (whole report) or a subset of letters (e... the auditory data, producing good estimates of the rate at which information is encoded (C), the minimum exposure duration required for processing to begin (t0), and the relative attentional weight to targets versus distractors (α). Future work will address the issue of target-distractor confusion, and extend...
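
    A minimal sketch of the TVA-style race equations behind the parameters mentioned above (C, t0 and the distractor weight ratio α): each item receives a share of the total processing capacity C in proportion to its attentional weight, and is encoded with an exponential race over the effective exposure duration. The short-term memory capacity limit K is ignored here, and all values are assumptions for illustration.

      import numpy as np

      def encoding_probabilities(weights, C, exposure, t0):
          # v_x = C * w_x / sum(w); P(encoded) = 1 - exp(-v_x * (exposure - t0)) for exposure > t0.
          weights = np.asarray(weights, dtype=float)
          v = C * weights / weights.sum()
          tau = max(exposure - t0, 0.0)
          return 1.0 - np.exp(-v * tau)

      # Four targets and two distractors; distractor weights scaled by alpha (assumed values).
      alpha = 0.4
      weights = [1.0, 1.0, 1.0, 1.0, alpha, alpha]
      print(encoding_probabilities(weights, C=40.0, exposure=0.08, t0=0.02))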

  17. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd

    2001-01-01

    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  18. On the theory of interest rate policy

    Directory of Open Access Journals (Sweden)

    Heinz-Peter Spahn

    2001-12-01

    A new consensus in the theory of monetary policy has been reached pointing to the pivotal role of interest rates that are set in accordance with central banks' reaction functions. The decisive criterion of assessing the Taylor rule, inflation and monetary targeting is not the macrotheoretic foundation of these concepts. They serve as "languages" coordinating heterogeneous beliefs among policy makers and private agents, and should also allow rule-based discretionary policies when markets are in need of leadership. Contrary to the ECB dogma, the Fed is right to have an eye on the risks of inflation and unemployment.

  19. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  20. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  1. Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model

    International Nuclear Information System (INIS)

    Szabo, Richard J; Tierz, Miguel

    2010-01-01

    We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S². We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S³, and hence to q-deformed Yang-Mills theory on S². In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.

  2. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections...... to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types....

  3. Item Response Theory Modeling and Categorical Regression Analyses of the Five-Factor Model Rating Form: A Study on Italian Community-Dwelling Adolescent Participants and Adult Participants.

    Science.gov (United States)

    Fossati, Andrea; Widiger, Thomas A; Borroni, Serena; Maffei, Cesare; Somma, Antonella

    2017-06-01

    To extend the evidence on the reliability and construct validity of the Five-Factor Model Rating Form (FFMRF) in its self-report version, two independent samples of Italian participants, which were composed of 510 adolescent high school students and 457 community-dwelling adults, respectively, were administered the FFMRF in its Italian translation. Adolescent participants were also administered the Italian translation of the Borderline Personality Features Scale for Children-11 (BPFSC-11), whereas adult participants were administered the Italian translation of the Triarchic Psychopathy Measure (TriPM). Cronbach α values were consistent with previous findings; in both samples, average interitem r values indicated acceptable internal consistency for all FFMRF scales. A multidimensional graded item response theory model indicated that the majority of FFMRF items had adequate discrimination parameters; information indices supported the reliability of the FFMRF scales. Both categorical (i.e., item-level) and scale-level regression analyses suggested that the FFMRF scores may predict a nonnegligible amount of variance in the BPFSC-11 total score in adolescent participants, and in the TriPM scale scores in adult participants.
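
    As a rough illustration of the graded item response modeling mentioned above, the sketch below computes category response probabilities for a single item under a unidimensional Samejima-style graded response model; the discrimination and threshold values are invented and are not the FFMRF estimates.

        import numpy as np

        def graded_response_probs(theta, a, thresholds):
            # Cumulative probabilities P(X >= k), with P(X >= 0) = 1 and P(X >= K) = 0.
            cum = np.array([1.0]
                           + [1.0 / (1.0 + np.exp(-a * (theta - b))) for b in thresholds]
                           + [0.0])
            return cum[:-1] - cum[1:]   # adjacent differences give the category probabilities

        # Hypothetical 4-category item evaluated at latent trait level theta = 0.5.
        print(graded_response_probs(theta=0.5, a=1.2, thresholds=[-1.0, 0.0, 1.5]))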

  4. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  5. Narrative theories as computational models: reader-oriented theory and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, P.

    1983-12-01

    In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory share many common interests and problems and both studies might benefit from an exchange of ideas. 11 references.

  6. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  7. Prediction on corrosion rate of pipe in nuclear power system based on optimized grey theory

    International Nuclear Information System (INIS)

    Chen Yonghong; Zhang Dafa; Chen Dengke; Jiang Wei

    2007-01-01

    For the prediction of the corrosion rate of pipe in a nuclear power system, the prediction error from the standard grey theory is relatively large, so a new method, optimized grey theory, was presented in this paper. A comparison among predicted results from the present and other methods was carried out, and it is seen that optimized grey theory is correct and effective for the prediction of the corrosion rate of pipe in a nuclear power system, and it provides a fundamental basis for the maintenance of pipe in nuclear power systems. (authors)
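
    For orientation, the following is a minimal sketch of the classical GM(1,1) grey prediction model (not the paper's optimized variant); the short corrosion-rate series is invented.

        import numpy as np

        def gm11_forecast(x0, steps=1):
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                                 # accumulated generating sequence
            z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean generating sequence
            B = np.column_stack([-z1, np.ones_like(z1)])
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey development coefficients
            k = np.arange(len(x0) + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # whitened-equation solution
            x0_hat = np.empty_like(x1_hat)
            x0_hat[0] = x1_hat[0]
            x0_hat[1:] = np.diff(x1_hat)                       # back to the original series
            return x0_hat[len(x0):]

        # Invented corrosion-rate history (mm/yr); forecast the next two values.
        print(gm11_forecast([0.021, 0.024, 0.026, 0.029, 0.033], steps=2))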

  8. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm

  9. An analytical model of nonproportional scintillator light yield in terms of recombination rates

    International Nuclear Information System (INIS)

    Bizarri, G.; Moses, W. W.; Singh, J.; Vasil'ev, A. N.; Williams, R. T.

    2009-01-01

    Analytical expressions for the local light yield as a function of the local deposited energy (-dE/dx) and total scintillation yield integrated over the track of an electron of initial energy E are derived from radiative and/or nonradiative rates of first through third order in density of electronic excitations. The model is formulated in terms of rate constants, some of which can be determined independently from time-resolved spectroscopy and others estimated from measured light yield efficiency as a constraint assumed to apply in each kinetic order. The rates and parameters are used in the theory to calculate scintillation yield versus primary electron energy for comparison to published experimental results on four scintillators. Influence of the track radius on the yield is also discussed. Results are found to be qualitatively consistent with the observed scintillation light yield. The theory can be applied to any scintillator if the rates of the radiative and nonradiative processes are known
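
    A schematic version of the competition the abstract describes, first-order radiative decay against second- and third-order nonradiative quenching, can be written down directly; the rate constants and the dE/dx-to-excitation-density conversion below are placeholders rather than fitted values.

        import numpy as np

        def local_yield(n, k1r=0.6e6, k1nr=0.4e6, k2=1.0e-12, k3=1.0e-30):
            # Fraction of excitations (density n) that decay radiatively when first-order
            # radiative/nonradiative, second-order, and third-order channels compete.
            total = (k1r + k1nr) * n + k2 * n**2 + k3 * n**3
            return k1r * n / total

        dEdx = np.logspace(-1, 2, 5)        # arbitrary -dE/dx grid
        n = 1.0e18 * dEdx                   # assumed proportionality to excitation density
        print(np.round(local_yield(n), 3))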

  10. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  11. From Landau's hydrodynamical model to field theory models of multiparticle production: a tribute to Peter

    International Nuclear Information System (INIS)

    Cooper, F.

    1996-01-01

    We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations

  12. Rate theory of ion pairing at the water liquid-vapor interface: A case of sodium iodide

    Science.gov (United States)

    Dang, Liem X.; Schenter, Gregory K.

    2018-06-01

    Studies on ion pairing at interfaces have been intensified recently because of their importance in many chemical reactive phenomena, such as ion-ion interactions that are affected by interfaces and their influence on kinetic processes. In this study, we performed simulations to examine the thermodynamics and kinetics of small polarizable sodium iodide ions in the bulk and near the water liquid-vapor interface. Using classical transition state theory, we calculated the dissociation rates and corrected them with transmission coefficients obtained from the reactive flux formalism and Grote-Hynes theory. Our results show that in addition to affecting the free energy of ions in solution, the interfacial environments significantly influence the kinetics of ion pairing. The results on the relaxation time obtained using the reactive flux formalism and Grote-Hynes theory present an unequivocal picture that the interface suppresses ion dissociation. The effects of the use of molecular models on the ion interactions as well as the ion-pair configurations at the interface are also quantified and discussed.
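
    To make the workflow concrete, the fragment below evaluates a one-dimensional transition-state-theory dissociation rate from a potential of mean force along the ion-pair separation and scales it by a transmission coefficient of the kind reactive-flux or Grote-Hynes calculations would supply; the PMF, the units, and the coefficient are all stand-ins rather than the simulated NaI values.

        import numpy as np

        beta = 1.0                                         # 1/kT in reduced units
        r = np.linspace(1.0, 6.0, 2000)
        W = 5.0 * np.exp(-((r - 3.5) / 0.4) ** 2) - 8.0 * np.exp(-((r - 1.5) / 0.5) ** 2)

        i_barrier = np.argmax(W)                           # dividing surface at the PMF maximum
        well = r <= r[i_barrier]                           # contact-pair region
        dr = r[1] - r[0]
        Z_well = np.sum(np.exp(-beta * W[well])) * dr      # configurational integral over the well

        v_mean = np.sqrt(1.0 / (2.0 * np.pi * beta))       # sqrt(kT / 2 pi mu) with mu = 1
        k_tst = v_mean * np.exp(-beta * W[i_barrier]) / Z_well

        kappa = 0.4                                        # placeholder transmission coefficient
        print("corrected dissociation rate:", kappa * k_tst)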

  13. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  14. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  15. M-theory model-building and proton stability

    International Nuclear Information System (INIS)

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory

  16. General Theory of Decoy-State Quantum Cryptography with Dark Count Rate Fluctuation

    International Nuclear Information System (INIS)

    Xiang, Gao; Shi-Hai, Sun; Lin-Mei, Liang

    2009-01-01

    The existing theory of decoy-state quantum cryptography assumes that the dark count rate is a constant, but in practice there exists fluctuation. We develop a new scheme of the decoy state, achieve a more practical key generation rate in the presence of fluctuation of the dark count rate, and compare the result with the result of the decoy-state without fluctuation. It is found that the key generation rate and maximal secure distance will be decreased under the influence of the fluctuation of the dark count rate

  17. Extended Nambu models: Their relation to gauge theories

    Science.gov (United States)

    Escobar, C. A.; Urrutia, L. F.

    2017-05-01

    Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss laws constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.

  18. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  19. "Depletion": A Game with Natural Rules for Teaching Reaction Rate Theory.

    Science.gov (United States)

    Olbris, Donald J.; Herzfeld, Judith

    2002-01-01

    Depletion is a game that reinforces central concepts of reaction rate theory through simulation. Presents the game with a set of follow-up questions suitable for either a quiz or discussion. Also describes student reaction to the game. (MM)

  20. Kinetic aspects of the embedded clusters: Reaction - Rate Theory

    International Nuclear Information System (INIS)

    Despa, F.; Apostol, M.

    1995-07-01

    The main stages of the cluster growth process are reviewed using Reaction-Rate Theory. The precipitation stage is shown as a relaxation of the solute towards a cluster state characterized by a higher stability. The kinetics of the late stage of phase separation, the coarsening process, is analyzed by an off-centre diffusion mechanism. The theoretical results are compared to the experimental ones. (author). 37 refs, 6 figs

  1. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  2. Interest Rates and Coupon Bonds in Quantum Finance

    Science.gov (United States)

    Baaquie, Belal E.

    2009-09-01

    1. Synopsis; 2. Interest rates and coupon bonds; 3. Options and option theory; 4. Interest rate and coupon bond options; 5. Quantum field theory of bond forward interest rates; 6. Libor Market Model of interest rates; 7. Empirical analysis of forward interest rates; 8. Libor Market Model of interest rate options; 9. Numeraires for bond forward interest rates; 10. Empirical analysis of interest rate caps; 11. Coupon bond European and Asian options; 12. Empirical analysis of interest rate swaptions; 13. Correlation of coupon bond options; 14. Hedging interest rate options; 15. Interest rate Hamiltonian and option theory; 16. American options for coupon bonds and interest rates; 17. Hamiltonian derivation of coupon bond options; Appendixes; Glossaries; List of symbols; Reference; Index.

  3. An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    International Nuclear Information System (INIS)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-01-01

    We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  4. Matrix models as non-commutative field theories on R3

    International Nuclear Information System (INIS)

    Livine, Etera R

    2009-01-01

    In the context of spin foam models for quantum gravity, group field theories are a useful tool allowing on the one hand a non-perturbative formulation of the partition function and on the other hand admitting an interpretation as generalized matrix models. Focusing on 2d group field theories, we review their explicit relation to matrix models and show their link to a class of non-commutative field theories invariant under a quantum-deformed 3d Poincare symmetry. This provides a simple relation between matrix models and non-commutative geometry. Moreover, we review the derivation of effective 2d group field theories with non-trivial propagators from Boulatov's group field theory for 3d quantum gravity. Besides the fact that this gives a simple and direct derivation of non-commutative field theories for the matter dynamics coupled to (3d) quantum gravity, these effective field theories can be expressed as multi-matrix models with a non-trivial coupling between matrices of different sizes. It should be interesting to analyze this new class of theories, both from the point of view of matrix models as integrable systems and for the study of non-commutative field theories.

  5. Modeling of mitochondria bioenergetics using a composable chemiosmotic energy transduction rate law: theory and experimental validation.

    Directory of Open Access Journals (Sweden)

    Ivan Chang

    Full Text Available Mitochondrial bioenergetic processes are central to the production of cellular energy, and a decrease in the expression or activity of enzyme complexes responsible for these processes can result in energetic deficit that correlates with many metabolic diseases and aging. Unfortunately, existing computational models of mitochondrial bioenergetics either lack relevant kinetic descriptions of the enzyme complexes, or incorporate mechanisms too specific to a particular mitochondrial system and are thus incapable of capturing the heterogeneity associated with these complexes across different systems and system states. Here we introduce a new composable rate equation, the chemiosmotic rate law, that expresses the flux of a prototypical energy transduction complex as a function of: the saturation kinetics of the electron donor and acceptor substrates; the redox transfer potential between the complex and the substrates; and the steady-state thermodynamic force-to-flux relationship of the overall electro-chemical reaction. Modeling of bioenergetics with this rate law has several advantages: (1) it minimizes the use of arbitrary free parameters while featuring biochemically relevant parameters that can be obtained through progress curves of common enzyme kinetics protocols; (2) it is modular and can adapt to various enzyme complex arrangements for both in vivo and in vitro systems via transformation of its rate and equilibrium constants; (3) it provides a clear association between the sensitivity of the parameters of the individual complexes and the sensitivity of the system's steady-state. To validate our approach, we conduct in vitro measurements of ETC complex I, III, and IV activities using rat heart homogenates, and construct an estimation procedure for the parameter values directly from these measurements. In addition, we show the theoretical connections of our approach to the existing models, and compare the predictive accuracy of the rate law with

  6. Modeling of mitochondria bioenergetics using a composable chemiosmotic energy transduction rate law: theory and experimental validation.

    Science.gov (United States)

    Chang, Ivan; Heiske, Margit; Letellier, Thierry; Wallace, Douglas; Baldi, Pierre

    2011-01-01

    Mitochondrial bioenergetic processes are central to the production of cellular energy, and a decrease in the expression or activity of enzyme complexes responsible for these processes can result in energetic deficit that correlates with many metabolic diseases and aging. Unfortunately, existing computational models of mitochondrial bioenergetics either lack relevant kinetic descriptions of the enzyme complexes, or incorporate mechanisms too specific to a particular mitochondrial system and are thus incapable of capturing the heterogeneity associated with these complexes across different systems and system states. Here we introduce a new composable rate equation, the chemiosmotic rate law, that expresses the flux of a prototypical energy transduction complex as a function of: the saturation kinetics of the electron donor and acceptor substrates; the redox transfer potential between the complex and the substrates; and the steady-state thermodynamic force-to-flux relationship of the overall electro-chemical reaction. Modeling of bioenergetics with this rate law has several advantages: (1) it minimizes the use of arbitrary free parameters while featuring biochemically relevant parameters that can be obtained through progress curves of common enzyme kinetics protocols; (2) it is modular and can adapt to various enzyme complex arrangements for both in vivo and in vitro systems via transformation of its rate and equilibrium constants; (3) it provides a clear association between the sensitivity of the parameters of the individual complexes and the sensitivity of the system's steady-state. To validate our approach, we conduct in vitro measurements of ETC complex I, III, and IV activities using rat heart homogenates, and construct an estimation procedure for the parameter values directly from these measurements. In addition, we show the theoretical connections of our approach to the existing models, and compare the predictive accuracy of the rate law with our experimentally
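
    The general shape of such a composable flux law, donor/acceptor saturation kinetics multiplied by a thermodynamic factor that vanishes at equilibrium, can be sketched as follows; the functional form and constants are illustrative assumptions, not the published chemiosmotic rate law.

        def transduction_flux(donor, acceptor, product_d, product_a,
                              vmax=10.0, km_d=0.5, km_a=0.2, keq=50.0):
            # Saturation kinetics in both substrates ...
            saturation = vmax * (donor / (km_d + donor)) * (acceptor / (km_a + acceptor))
            # ... scaled by a force-to-flux factor that is zero at equilibrium (Gamma = Keq).
            mass_action_ratio = (product_d * product_a) / (donor * acceptor)
            return saturation * (1.0 - mass_action_ratio / keq)

        print(transduction_flux(donor=1.0, acceptor=0.5, product_d=0.1, product_a=0.05))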

  7. Aggressive behavior: an alternative model of resting heart rate and sensation seeking.

    Science.gov (United States)

    Wilson, Laura C; Scarpa, Angela

    2014-01-01

    Low resting heart rate is a well-replicated biological correlate of aggression, and sensation seeking is frequently cited as the underlying causal explanation. However, little empirical evidence supports this mediating relationship. Furthermore, the biosocial model of violence and social push theory suggest sensation seeking may moderate the relationship between heart rate and aggression. In a sample of 128 college students (82.0% White; 73.4% female), the current study tested a moderation model as an alternative relationship between resting heart rate and sensation seeking in regard to aggression. Overall, the findings partially supported an interaction effect, whereby the relationship between heart rate and aggression was moderated by sensation seeking. Specifically, the oft-noted relationship between low resting heart rate and increased aggression was found, but only for individuals with low levels of sensation seeking. If replication supports this finding, the results may better inform prevention and intervention work. © 2013 Wiley Periodicals, Inc.
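
    The moderation model being tested amounts to a regression with a product term; a minimal sketch on simulated data (all coefficients invented) looks like this.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 128
        hr = rng.normal(70, 10, n)                    # resting heart rate
        ss = rng.normal(0, 1, n)                      # sensation seeking (standardized)
        aggression = 5 - 0.05 * hr + 0.3 * ss + 0.04 * hr * ss + rng.normal(0, 1, n)

        hr_c, ss_c = hr - hr.mean(), ss - ss.mean()   # center predictors before forming the product
        X = np.column_stack([np.ones(n), hr_c, ss_c, hr_c * ss_c])
        beta, *_ = np.linalg.lstsq(X, aggression, rcond=None)
        print(dict(zip(["intercept", "heart_rate", "sensation", "interaction"], np.round(beta, 3))))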

  8. Derivation of the chemical-equilibrium rate coefficient using scattering theory

    Science.gov (United States)

    Mickens, R. E.

    1977-01-01

    Scattering theory is applied to derive the equilibrium rate coefficient for a general homogeneous chemical reaction involving ideal gases. The reaction rate is expressed in terms of the product of a number of normalized momentum distribution functions, the product of the number of molecules with a given internal energy state, and the spin-averaged T-matrix elements. An expression for momentum distribution at equilibrium for an arbitrary molecule is presented, and the number of molecules with a given internal-energy state is represented by an expression which includes the partition function.

  9. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  10. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x{t}, within the larger set of m+k candidate variables, (x{t},w{t}), then selection over the second set by their statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w{t} are relevant.

  11. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit

  12. Polyacetylene and relativistic field-theory models

    International Nuclear Information System (INIS)

    Bishop, A.R.; Campbell, D.K.; Fesser, K.

    1981-01-01

    Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the phi^4 model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed

  13. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
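
    One concrete reading of this setup is sketched below (a static rather than dynamically updated chain): estimate a first-order Markov chain over player actions and score its prediction error on later interaction as an average negative log-likelihood; the action alphabet and sequences are invented.

        import numpy as np

        def fit_markov(seq, n_states, alpha=1.0):
            counts = np.full((n_states, n_states), alpha)    # Laplace-smoothed transition counts
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        def bits_per_action(P, seq):
            # Average surprise of the observed transitions under the trained chain.
            return -np.mean([np.log2(P[a, b]) for a, b in zip(seq[:-1], seq[1:])])

        train = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1]   # highly routinized behaviour
        test = [0, 1, 2, 0, 1, 0, 2, 1, 0, 1, 2, 0]
        P = fit_markov(train, n_states=3)
        print("prediction error:", round(bits_per_action(P, test), 3), "bits/action")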

  14. Physically based multiscale-viscoplastic model for metals and steel alloys: Theory and computation

    Science.gov (United States)

    Abed, Farid H.

    The main requirement of large deformation problems such as high-speed machining, impact, and various metal-forming processes, is to develop constitutive relations which are widely applicable and capable of accounting for complex paths of deformation. Achieving such desirable goals for materials like metals and steel alloys involves a comprehensive study of their microstructures and experimental observations under different loading conditions. In general, metal structures display a strong rate- and temperature-dependence when deformed non-uniformly into the inelastic range. This effect has important implications for an increasing number of applications in structural and engineering mechanics. The mechanical behavior of these applications cannot be characterized by classical (rate-independent) continuum theories because they incorporate no 'material length scales'. It is therefore necessary to develop a rate-dependent (viscoplasticity) continuum theory bridging the gap between the classical continuum theories and the microstructure simulations. Physically based viscoplasticity models for different types of metals (body centered cubic, face centered cubic and hexagonal close-packed) and steel alloys are derived in this work for this purpose. We adopt a multi-scale, hierarchical, thermodynamically consistent framework to construct the material constitutive relations for the rate-dependent behavior. The concept of thermal activation energy, dislocation interaction mechanisms and the role of dislocation dynamics in crystals are used in the derivation process, taking into consideration the contribution of the plastic strain evolution of dislocation density to the flow stress of polycrystalline metals. Material length scales are implicitly introduced into the governing equations through material rate-dependency (viscosity). The proposed framework is implemented into the commercially well-known finite element software ABAQUS. The finite element simulations of material...

  15. Relaxed Poisson cure rate models.

    Science.gov (United States)

    Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N

    2016-03-01

    The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov, ) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin, ). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos, ) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models allowing for superdispersion. So, the relaxed cure rate model developed here can be considered as a natural and less restrictive extension of the popular Poisson cure rate model at the cost of an additional parameter, but a competitor to negative-binomial cure rate models (Rodrigues et al., ). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
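
    Under the reading that the relaxed model replaces the exponential of the promotion model with a one-parameter Mittag-Leffler function, the two population survival curves can be compared with a short script; the series evaluation is only adequate for moderate arguments, the promotion-time distribution is an assumed exponential, and every parameter value below is invented.

        import numpy as np
        from scipy.special import gamma

        def mittag_leffler(z, alpha, n_terms=80):
            k = np.arange(n_terms)
            return float(np.sum(z ** k / gamma(alpha * k + 1)))

        theta, lam, alpha = 1.5, 0.4, 0.7
        for t in [0.5, 1.0, 2.0, 5.0]:
            F = 1.0 - np.exp(-lam * t)                        # assumed promotion-time CDF
            s_promotion = np.exp(-theta * F)                  # classical Poisson cure rate survival
            s_relaxed = mittag_leffler(-theta * F, alpha)     # relaxed (Mittag-Leffler) survival
            print(t, round(s_promotion, 4), round(s_relaxed, 4))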

  16. Tissue Acoustoelectric Effect Modeling From Solid Mechanics Theory.

    Science.gov (United States)

    Song, Xizi; Qin, Yexian; Xu, Yanbin; Ingram, Pier; Witte, Russell S; Dong, Feng

    2017-10-01

    The acoustoelectric (AE) effect is a basic physical phenomenon, which underlies the changes made in the conductivity of a medium by the application of focused ultrasound. Recently, based on the AE effect, several biomedical imaging techniques have been widely studied, such as ultrasound-modulated electrical impedance tomography and ultrasound current source density imaging. To further investigate the mechanism of the AE effect in tissue and to provide guidance for such techniques, we have modeled the tissue AE effect using the theory of solid mechanics. Both bulk compression and thermal expansion of tissue are considered and discussed. Computation simulation shows that the muscle AE effect result, conductivity change rate, is 3.26×10^-3 with 4.3-MPa peak pressure, satisfying the theoretical value. Bulk compression plays the main role for muscle AE effect, while thermal expansion makes almost no contribution to it. In addition, the AE signals of porcine muscle are measured at different focal positions. With the same magnitude order and the same change trend, the experiment result confirms that the simulation result is effective. Both simulation and experimental results validate that tissue AE effect modeling using solid mechanics theory is feasible, which is of significance for the further development of related biomedical imaging techniques.

  17. Using Omega and NIF to Advance Theories of High-Pressure, High-Strain-Rate Tantalum Plastic Flow

    Science.gov (United States)

    Rudd, R. E.; Arsenlis, A.; Barton, N. R.; Cavallo, R. M.; Huntington, C. M.; McNaney, J. M.; Orlikowski, D. A.; Park, H.-S.; Prisbrey, S. T.; Remington, B. A.; Wehrenberg, C. E.

    2015-11-01

    Precisely controlled plasmas are playing an important role as both pump and probe in experiments to understand the strength of solid metals at high energy density (HED) conditions. In concert with theory, these experiments have enabled a predictive capability to model material strength at Mbar pressures and high strain rates. Here we describe multiscale strength models developed for tantalum and vanadium starting with atomic bonding and extending up through the mobility of individual dislocations, the evolution of dislocation networks and so on up to full scale. High-energy laser platforms such as the NIF and the Omega laser probe ramp-compressed strength to 1-5 Mbar. The predictions of the multiscale model agree well with the 1 Mbar experiments without tuning. The combination of experiment and theory has shown that solid metals can behave significantly differently at HED conditions; for example, the familiar strengthening of metals as the grain size is reduced has been shown not to occur in the high pressure experiments. Work performed under the auspices of the U.S. Dept. of Energy by Lawrence Livermore National Lab under contract DE-AC52-07NA273.

  18. Rate theory of solvent exchange and kinetics of Li+ − BF4−/PF6− ion pairs in acetonitrile

    International Nuclear Information System (INIS)

    Dang, Liem X.; Chang, Tsun-Mei

    2016-01-01

    In this paper, we describe our efforts to apply rate theories in studies of solvent exchange around Li+ and the kinetics of ion pairings in lithium-ion batteries (LIBs). We report one of the first computer simulations of the exchange dynamics around solvated Li+ in acetonitrile (ACN), which is a common solvent used in LIBs. We also provide details of the ion-pairing kinetics of Li+-[BF4]- and Li+-[PF6]- in ACN. Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ACN exchange process between the first and second solvation shells around Li+. We calculate exchange rates using transition state theory and weight them with the transmission coefficients determined by the reactive flux, Impey, Madden, and McDonald approaches, and Grote-Hynes theory. We found the relaxation times changed from 180 ps to 4600 ps and from 30 ps to 280 ps for Li+-[BF4]- and Li+-[PF6]- ion pairs, respectively. These results confirm that the solvent response to the kinetics of ion pairing is significant. Our results also show that, in addition to affecting the free energy of solvation into ACN, the anion type also should significantly influence the kinetics of ion pairing. These results will increase our understanding of the thermodynamic and kinetic properties of LIB systems.

  19. Modeling Atmospheric Turbulence via Rapid Distortion Theory: Spectral Tensor of Velocity and Buoyancy

    DEFF Research Database (Denmark)

    Chougule, Abhijit S.; Mann, Jakob; Kelly, Mark C.

    2017-01-01

    A spectral tensor model is presented for turbulent fluctuations of wind velocity components and temperature, assuming uniform vertical gradients in mean temperature and mean wind speed. The model is built upon rapid distortion theory (RDT) following studies by Mann and by Hanazaki and Hunt, using...... the eddy lifetime parameterization of Mann to make the model stationary. The buoyant spectral tensor model is driven via five parameters: the viscous dissipation rate epsilon, length scale of energy-containing eddies L, a turbulence anisotropy parameter Gamma, gradient Richardson number (Ri) representing...

  20. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  1. Security-Based Mechanism for Proactive Routing Schema Using Game Theory Model

    Directory of Open Access Journals (Sweden)

    Hicham Amraoui

    2016-01-01

    Full Text Available Game theory may offer a useful mechanism to address many problems in mobile ad hoc networks (MANETs). One of the key concepts in the research field of such networks with Optimized Link State Routing Protocol (OLSR) is the security problem. Relying on applying game theory to study this problem, we consider two strategies during this suggested model: cooperate and not-cooperate. However, in such networks, it is not easy to identify different actions of players. In this paper, we have essentially been inspired from recent advances provided in game theory to propose a new model for security in MANETs. Our proposal presents a powerful tool with a large number of players where interactions are played multiple times. Moreover, each node keeps a cooperation rate (CR) record of other nodes to cope with the behaviors and mitigate aggregate effect of other malicious devices. Additionally, our suggested security mechanism does not only take into consideration security requirements, but also take into account system resources and network performances. The simulation results using Network Simulator 3 are presented to illustrate the effectiveness of the proposal.

  2. Universal Rate Model Selector: A Method to Quickly Find the Best-Fit Kinetic Rate Model for an Experimental Rate Profile

    Science.gov (United States)

    2017-08-01

    Kinetic rate models range from pure chemical reactions to mass transfer... The rate model that best fits the experimental data is a first-order or homogeneous catalytic reaction... Avrami (7), and intraparticle diffusion (6) rate equations, to name a few. A single fitting algorithm (kinetic rate model) for a reaction does not...
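
    In the same spirit, a toy "selector" can fit a few standard integrated rate laws to a single conversion-versus-time profile and rank them by residual error; the candidate forms and the data below are illustrative only and are not taken from the report.

        import numpy as np
        from scipy.optimize import curve_fit

        def first_order(t, k):
            return 1.0 - np.exp(-k * t)

        def avrami(t, k, n):
            return 1.0 - np.exp(-(k * t) ** n)

        def intraparticle(t, k):
            return np.clip(k * np.sqrt(t), 0.0, 1.0)

        t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
        x = np.array([0.10, 0.18, 0.33, 0.55, 0.80, 0.96, 0.99])   # fractional conversion

        candidates = {"first order": (first_order, [0.1]),
                      "Avrami": (avrami, [0.1, 1.0]),
                      "intraparticle diffusion": (intraparticle, [0.1])}

        for name, (f, p0) in candidates.items():
            p, _ = curve_fit(f, t, x, p0=p0, bounds=(0, np.inf))
            sse = np.sum((x - f(t, *p)) ** 2)
            print(f"{name:25s} SSE = {sse:.4f}  params = {np.round(p, 3)}")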

  3. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its stru... It is also demonstrated how other controversial hypotheses such as Rational Expectations can be formulated directly as restrictions on the CVAR-parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well... Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.

  4. Toda theories, W-algebras, and minimal models

    International Nuclear Information System (INIS)

    Mansfield, P.; Spence, B.

    1991-01-01

    We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)

  5. A model for hot electron phenomena: Theory and general results

    International Nuclear Information System (INIS)

    Carrillo, J.L.; Rodriquez, M.A.

    1988-10-01

    We propose a model for the description of hot electron phenomena in semiconductors. Based on this model we are able to reproduce accurately the main characteristics observed in experiments on electric field transport, optical absorption, steady-state photoluminescence and relaxation processes. Our theory contains neither free nor adjustable parameters, it is computationally very fast, and it incorporates the main collision mechanisms including screening and phonon heating effects. Our description is based on a set of nonlinear rate equations in which the interactions are represented by coupling coefficients or effective frequencies. We calculate three coefficients from the characteristic constants and the band structure of the material. (author). 22 refs, 5 figs, 1 tab
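
    A toy pair of coupled rate equations of this general kind, a carrier "temperature" driven by the field and relaxing to the phonon bath through an effective coupling frequency, with phonon heating fed back, is sketched below; every coefficient is invented and unrelated to the ones calculated in the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        def rates(t, y, power_in=5.0, nu_ep=2.0, nu_ph=0.5):
            T_e, T_ph = y                                # carrier and phonon effective temperatures
            dT_e = power_in - nu_ep * (T_e - T_ph)       # field heating vs. phonon emission
            dT_ph = 0.3 * nu_ep * (T_e - T_ph) - nu_ph * (T_ph - 1.0)   # phonon heating and decay
            return [dT_e, dT_ph]

        sol = solve_ivp(rates, (0.0, 10.0), [1.0, 1.0], t_eval=np.linspace(0.0, 10.0, 6))
        print(np.round(sol.y, 3))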

  6. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations.

  7. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    Science.gov (United States)

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  8. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. The extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A program of rational theory classification is described, linking rational conformal theories and integrable spin models, and interesting relations between Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights

  9. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  10. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Full Text Available Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  11. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  12. Lapse rate modeling

    DEFF Research Database (Denmark)

    De Giovanni, Domenico

    2010-01-01

    prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...

  13. Lapse Rate Modeling

    DEFF Research Database (Denmark)

    De Giovanni, Domenico

    prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...

  14. Irreducible integrable theories form tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  15. A Theory of Immersion Freezing

    Science.gov (United States)

    Barahona, Donifan

    2017-01-01

    Immersion freezing is likely involved in the initiation of precipitation and determines to a large extent the phase partitioning in convective clouds. Theoretical models commonly used to describe immersion freezing in atmospheric models are based on classical nucleation theory, which, however, neglects important interactions near the immersed particle that may affect nucleation rates. This work introduces a new theory of immersion freezing based on two premises. First, immersion ice nucleation is mediated by the modification of the properties of water near the particle-liquid interface, rather than by the geometry of the ice germ. Second, the same mechanism that leads to the decrease in the work of germ formation also decreases the mobility of water molecules near the immersed particle. These two premises allow establishing general thermodynamic constraints on the ice nucleation rate. Analysis of the new theory shows that active sites likely trigger ice nucleation, but they control neither the overall nucleation rate nor the probability of freezing. It also suggests that materials with different ice nucleation efficiency may exhibit similar freezing temperatures under similar conditions but differ in their sensitivity to particle surface area and cooling rate. Predicted nucleation rates show good agreement with observations for a diverse set of materials including dust, black carbon and bacterial ice nucleating particles. The application of the new theory within the NASA Global Earth System Model (GEOS-5) is also discussed.

  16. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  17. Modelling plastic deformation of metals over a wide range of strain rates using irreversible thermodynamics

    International Nuclear Information System (INIS)

    Huang Mingxin; Rivera-Diaz-del-Castillo, Pedro E J; Zwaag, Sybrand van der; Bouaziz, Olivier

    2009-01-01

    Based on the theory of irreversible thermodynamics, the present work proposes a dislocation-based model to describe the plastic deformation of FCC metals over wide ranges of strain rates. The stress-strain behaviour and the evolution of the average dislocation density are derived. It is found that there is a transitional strain rate (∼10^4 s^-1) over which the phonon drag effects appear, resulting in a significant increase in the flow stress and the average dislocation density. The model is applied to pure Cu deformed at room temperature and at strain rates ranging from 10^-5 to 10^6 s^-1, showing good agreement with experimental results.

  18. Models and theories of prescribing decisions: A review and suggested a new model.

    Science.gov (United States)

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. Drug prescribing by doctors is a complex phenomenon influenced by various factors, and most existing studies of drug prescription explain physicians' decision-making through an exploratory rather than a theoretical approach. This review therefore suggests a conceptual model that explains the theoretical linkages between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, it draws on several perspectives, such as persuasion theory (the elaboration likelihood model), the stimulus-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing the conceptual framework. Based on the combination of existing methods and previous models, the paper suggests a new conceptual model of the physician decision-making process with potential for use in further research.

  19. Demographic trade-offs in a neutral model explain death-rate--abundance-rank relationship.

    Science.gov (United States)

    Lin, Kui; Zhang, Da-Yong; He, Fangliang

    2009-01-01

    The neutral theory of biodiversity has been criticized for its neglect of species differences. Yet it is much less heeded that S. P. Hubbell's definition of neutrality allows species to differ in their birth and death rates as long as they have an equal per capita fitness. Using the lottery model of competition we find that fitness equalization through birth-death trade-offs can make species coexist longer than expected for demographically identical species, whereas the probability of monodominance for a species under zero-sum neutral dynamics is equal to its initial relative abundance. Furthermore, if newly arising species in a community survive preferentially they are more likely to slip through the quagmire of rareness, thus creating a strong selective bias favoring their community membership. On the other hand, high-mortality species, once having gained a footing in the community, are more likely to become abundant due to their compensatory high birth rates. This unexpected result explains why a positive association between species abundance and per capita death rate can be seen in tropical-forest communities. An explicit incorporation of interspecific trade-offs between birth and death into the neutral theory increases the theory's realism as well as its predictive power.
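
    As a toy illustration of the fitness-equalization idea, the sketch below simulates two demographically different but fitness-equalized species under a zero-sum lottery update: deaths occur at species-specific per-capita rates and vacancies are refilled in proportion to survivors weighted by species-specific birth rates. The update rule, species parameters, and community size are illustrative assumptions, not the model or values of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lottery_step(counts, birth, death, rng):
    """One zero-sum update: deaths at species-specific per-capita rates,
    vacancies refilled by a lottery weighted by survivors' birth rates."""
    deaths = rng.binomial(counts, death)           # per-species deaths
    survivors = counts - deaths
    vacancies = deaths.sum()
    weights = survivors * birth                    # lottery weights
    if weights.sum() == 0:
        return survivors
    recruits = rng.multinomial(vacancies, weights / weights.sum())
    return survivors + recruits

# Two species with a birth-death trade-off: species 2 turns over twice as fast,
# but its birth and death rates are scaled together (equalized fitness).
birth = np.array([1.0, 2.0])
death = np.array([0.05, 0.10])
counts = np.array([500, 500])

for t in range(20000):
    counts = lottery_step(counts, birth, death, rng)
    if counts.min() == 0:                          # monodominance reached
        break
print(t, counts)
```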

  20. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the six social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition, and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  1. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.

  2. Item response theory and structural equation modelling for ordinal data: Describing the relationship between KIDSCREEN and Life-H.

    Science.gov (United States)

    Titman, Andrew C; Lancaster, Gillian A; Colver, Allan F

    2016-10-01

    Both item response theory and structural equation models are useful in the analysis of ordered categorical responses from health assessment questionnaires. We highlight the advantages and disadvantages of the item response theory and structural equation modelling approaches to modelling ordinal data, from within a community health setting. Using data from the SPARCLE project focussing on children with cerebral palsy, this paper investigates the relationship between two ordinal rating scales, the KIDSCREEN, which measures quality-of-life, and Life-H, which measures participation. Practical issues relating to fitting models, such as non-positive definite observed or fitted correlation matrices, and approaches to assessing model fit are discussed. Item response theory models allow properties such as the conditional independence of particular domains of a measurement instrument to be assessed. When, as with the SPARCLE data, the latent traits are multidimensional, structural equation models generally provide a much more convenient modelling framework. © The Author(s) 2013.

  3. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  4. Explaining transgression in respiratory rate observation methods in the emergency department: A classic grounded theory analysis.

    Science.gov (United States)

    Flenady, Tracy; Dwyer, Trudy; Applegarth, Judith

    2017-09-01

    Abnormal respiratory rates are one of the first indicators of clinical deterioration in emergency department (ED) patients. Despite the importance of respiratory rate observations, this vital sign is often inaccurately recorded on ED observation charts, compromising patient safety. Concurrently, there is a paucity of research reporting why this phenomenon occurs. To develop a substantive theory explaining ED registered nurses' reasoning when they miss or misreport respiratory rate observations. This research project employed a classic grounded theory analysis of qualitative data. Seventy-nine registered nurses currently working in EDs within Australia. Data collected included detailed responses from individual interviews and open-ended responses from an online questionnaire. Classic grounded theory (CGT) research methods were utilised; therefore, coding was central to the abstraction of data and its reintegration as theory. Constant comparison, synonymous with CGT methods, was employed to code data. This approach facilitated the identification of the main concern of the participants and aided in the generation of theory explaining how the participants processed this issue. The main concern identified is that ED registered nurses do not believe that collecting an accurate respiratory rate for ALL patients at EVERY round of observations is a requirement, and yet organizational requirements often dictate that a value for the respiratory rate be included each time vital signs are collected. The theory, 'Rationalising Transgression', explains how participants continually resolve this problem. The study found that despite feeling professionally conflicted, nurses often erroneously record respiratory rate observations, and then rationalise this behaviour by employing strategies that adjust the significance of the organisational requirement. These strategies include: compensating, when nurses believe they are compensating for errant behaviour by enhancing the patient's outcome

  5. Rating Movies and Rating the Raters Who Rate Them.

    Science.gov (United States)

    Zhou, Hua; Lange, Kenneth

    2009-11-01

    The movie distribution company Netflix has generated considerable buzz in the statistics community by offering a million dollar prize for improvements to its movie rating system. Among the statisticians and computer scientists who have disclosed their techniques, the emphasis has been on machine learning approaches. This article has the modest goal of discussing a simple model for movie rating and other forms of democratic rating. Because the model involves a large number of parameters, it is nontrivial to carry out maximum likelihood estimation. Here we derive a straightforward EM algorithm from the perspective of the more general MM algorithm. The algorithm is capable of finding the global maximum on a likelihood landscape littered with inferior modes. We apply two variants of the model to a dataset from the MovieLens archive and compare their results. Our model identifies quirky raters, redefines the raw rankings, and permits imputation of missing ratings. The model is intended to stimulate discussion and development of better theory rather than to win the prize. It has the added benefit of introducing readers to some of the issues connected with analyzing high-dimensional data.

  6. A 'theory of everything'? [Extending the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate "theory of everything". In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  7. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is discussed. When time permits, we will address more practical questions that arise in the evaluation of quantum corrections.

  8. Variational transition state theory

    International Nuclear Information System (INIS)

    Truhlar, D.G.

    1986-01-01

    This project is concerned with the development and applications of generalized transition state theory and multidimensional tunneling approximations to chemical reaction rates. They have developed and implemented several practical versions of variational transition state theory (VTST), namely canonical variational theory (CVT), improved canonical variational theory (ICVT), and microcanonical variational theory (μVT). They have also developed and implemented several accurate multidimensional semiclassical tunneling approximations, the most accurate of which are the small-curvature semiclassical adiabatic (SCSA), large-curvature version-3 (LC3), and least-action (LA) approximations. They have applied the methods to thermal rate constants, using transmission coefficients based on ground-state tunneling, and they have also presented and applied adiabatic and diabatic extensions to calculated rate constants for vibrationally excited reactants. Their general goal is to develop accurate methods for calculating chemical reaction rate constants that remain practical even for reasonably complicated molecules. The approximations mentioned above yield rate constants for systems whose potential energy surface is known or assumed. Thus a second, equally important aspect of their work is the determination or modeling, semi-empirically and/or from electronic structure calculations, of potential energy surfaces
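
    For orientation, the conventional transition state theory expression that VTST refines by variationally locating the dividing surface is k(T) = (k_B T / h) (Q_TS / Q_R) exp(-V / k_B T). Below is a minimal sketch of that conventional expression with a purely illustrative partition-function ratio and barrier height; it is not code or data from this project.

```python
import numpy as np

KB = 1.380649e-23     # Boltzmann constant, J/K
H  = 6.62607015e-34   # Planck constant, J*s

def tst_rate(T, q_ts_over_q_r, barrier_J):
    """Conventional TST: k(T) = (kB*T/h) * (Q_TS/Q_R) * exp(-V / kB*T).
    VTST instead minimises this one-way flux over trial dividing surfaces."""
    return (KB * T / H) * q_ts_over_q_r * np.exp(-barrier_J / (KB * T))

# Illustrative numbers only: a 40 kJ/mol barrier and a partition-function
# ratio of 1e-2 (units of the reactant partition function suppressed here).
barrier = 40e3 / 6.02214076e23            # J per molecule
for T in (300.0, 500.0, 1000.0):
    print(T, tst_rate(T, 1e-2, barrier))
```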

  9. Mean field theory for a balanced hypercolumn model of orientation selectivity in primary visual cortex

    DEFF Research Database (Denmark)

    Lerchner, Alexander; Sterner, G.; Hertz, J.

    2006-01-01

    We present a complete mean field theory for a balanced state of a simple model of an orientation hypercolumn, with a numerical procedure for solving the mean-field equations quantitatively. With our treatment, one can determine self-consistently both the firing rates and the firing correlations...

  10. Effective field theory and the quark model

    International Nuclear Information System (INIS)

    Durand, Loyal; Ha, Phuoc; Jaczko, Gregory

    2001-01-01

    We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections.

  11. An Application of Durkheim's Theory of Suicide to Prison Suicide Rates in the United States

    Science.gov (United States)

    Tartaro, Christine; Lester, David

    2005-01-01

    E. Durkheim (1897) suggested that the societal rate of suicide might be explained by societal factors, such as marriage, divorce, and birth rates. The current study examined male prison suicide rates and suicide rates for men in the total population in the United States and found that variables based on Durkheim's theory of suicide explained…

  12. Minisuperspace models in histories theory

    International Nuclear Information System (INIS)

    Anastopoulos, Charis; Savvidou, Ntina

    2005-01-01

    We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context

  13. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about quantum link models and quantum simulation of gauge theories. The lecture consists of four parts. The first part gives a brief history of computing, introduces pioneers of quantum computing, and discusses quantum simulations of quantum spin systems. The second part covers high-temperature superconductors versus QCD, Wilson’s lattice QCD, and Abelian quantum link models. The third part deals with quantum simulators for Abelian lattice gauge theories and non-Abelian quantum link models. The last part discusses quantum simulators mimicking ‘nuclear’ physics and the continuum limit of D-theory models. (nowak)

  14. Matrix model as a mirror of Chern-Simons theory

    International Nuclear Information System (INIS)

    Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun

    2004-01-01

    Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)

  15. A QCD Model Using Generalized Yang-Mills Theory

    International Nuclear Information System (INIS)

    Wang Dianfu; Song Heshan; Kou Lina

    2007-01-01

    Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.

  16. A new lattice hydrodynamic model based on control method considering the flux change rate and delay feedback signal

    Science.gov (United States)

    Qin, Shunda; Ge, Hongxia; Cheng, Rongjun

    2018-02-01

    In this paper, a new lattice hydrodynamic model is proposed that takes delay feedback and the flux change rate effect into account in a single lane. The linear stability condition of the new model is derived using control theory. Using nonlinear analysis, the mKdV equation near the critical point is derived to describe traffic congestion. Numerical simulations demonstrate the advantage of the new model in suppressing traffic jams when the flux change rate effect is included in the delay feedback model.

  17. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed for further use of the items and the models that best fit ...

  18. Efficiency and credit ratings: a permutation-information-theory analysis

    International Nuclear Information System (INIS)

    Bariviera, Aurelio Fernandez; Martinez, Lisana B; Zunino, Luciano; Belén Guercio, M; Rosso, Osvaldo A

    2013-01-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)
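
    The complexity–entropy causality plane is built from the Bandt–Pompe permutation entropy of the return series. Below is a minimal sketch of the normalized permutation entropy; the embedding dimension, delay, and test series are illustrative choices, not the bond data analysed in the paper.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, d=4, tau=1):
    """Normalized Bandt-Pompe permutation entropy of series x,
    with embedding dimension d and delay tau (returns a value in [0, 1])."""
    patterns = {p: 0 for p in permutations(range(d))}
    n = len(x) - (d - 1) * tau
    for i in range(n):
        window = x[i:i + d * tau:tau]
        key = tuple(int(v) for v in np.argsort(window))   # ordinal pattern
        patterns[key] += 1
    probs = np.array([c for c in patterns.values() if c > 0], dtype=float) / n
    return float(-(probs * np.log(probs)).sum() / log(factorial(d)))

rng = np.random.default_rng(1)
print(permutation_entropy(rng.standard_normal(5000)))        # ~1: informationally efficient/random
print(permutation_entropy(np.sin(0.05 * np.arange(5000))))   # << 1: highly ordered series
```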

  19. Li+ solvation and kinetics of Li+-BF4-/PF6- ion pairs in ethylene carbonate. A molecular dynamics study with classical rate theories

    Science.gov (United States)

    Chang, Tsun-Mei; Dang, Liem X.

    2017-10-01

    Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the ion pairs Li+-[BF4]- and Li+-[PF6]- in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux, Impey, Madden, and McDonald approaches, and Grote-Hynes theory. We found that the residence times of EC around Li+ ions varied from 60 to 450 ps, depending on the correction method used. We found that the relaxation times changed significantly from Li+-[BF4]- to Li+-[PF6]- ion pairs in EC. Our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type also significantly influences the dissociation kinetics of ion pairing.

  20. USE OF ROUGH SETS AND SPECTRAL DATA FOR BUILDING PREDICTIVE MODELS OF REACTION RATE CONSTANTS

    Science.gov (United States)

    A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase mid-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters ...

  1. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and exogeneity in the economic model corresponds to exogeneity in the CVAR. The parameters of the CVAR are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.

  2. Reconstructing bidimensional scalar field theory models

    International Nuclear Information System (INIS)

    Flores, Gabriel H.; Svaiter, N.F.

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ^2 ln^2(θ^2) model and the U(θ) = θ^2 cos^2(ln(θ^2)) model, respectively. (author)

  3. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Science.gov (United States)

    Oizumi, Ryo; Kuniya, Toshikazu; Enatsu, Yoichi

    2016-01-01

    Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  4. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Directory of Open Access Journals (Sweden)

    Ryo Oizumi

    Full Text Available Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  5. Revisiting a model of ontogenetic growth: estimating model parameters from theory and data.

    Science.gov (United States)

    Moses, Melanie E; Hou, Chen; Woodruff, William H; West, Geoffrey B; Nekola, Jeffery C; Zuo, Wenyun; Brown, James H

    2008-05-01

    The ontogenetic growth model (OGM) of West et al. provides a general description of how metabolic energy is allocated between production of new biomass and maintenance of existing biomass during ontogeny. Here, we reexamine the OGM, make some minor modifications and corrections, and further evaluate its ability to account for empirical variation on rates of metabolism and biomass in vertebrates both during ontogeny and across species of varying adult body size. We show that the updated version of the model is internally consistent and is consistent with other predictions of metabolic scaling theory and empirical data. The OGM predicts not only the near universal sigmoidal form of growth curves but also the M^(1/4) scaling of the characteristic times of ontogenetic stages in addition to the curvilinear decline in growth efficiency described by Brody. Additionally, the OGM relates the M^(3/4) scaling across adults of different species to the scaling of metabolic rate across ontogeny within species. In providing a simple, quantitative description of how energy is allocated to growth, the OGM calls attention to unexplained variation, unanswered questions, and opportunities for future research.
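
    A minimal sketch of the canonical ontogenetic growth equation of West et al., dm/dt = a m^(3/4) [1 - (m/M)^(1/4)], which generates the sigmoidal trajectories and M^(1/4) characteristic times referred to above; the parameter values below are illustrative, not fitted values from the paper.

```python
import numpy as np

def ogm_trajectory(m0, M, a, t_end, dt=0.1):
    """Integrate dm/dt = a*m**0.75 * (1 - (m/M)**0.25) with forward Euler."""
    ts = np.arange(0.0, t_end, dt)
    ms = np.empty_like(ts)
    m = m0
    for i in range(len(ts)):
        ms[i] = m
        m += dt * a * m**0.75 * (1.0 - (m / M) ** 0.25)
    return ts, ms

# Illustrative parameters: birth mass 5 g, asymptotic adult mass 500 g.
ts, ms = ogm_trajectory(m0=5.0, M=500.0, a=0.3, t_end=400.0)
print(ms[0], ms[len(ms) // 2], ms[-1])   # sigmoidal approach to M
```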

  6. Benchmark calculations of thermal reaction rates. I - Quantal scattering theory

    Science.gov (United States)

    Chatfield, David C.; Truhlar, Donald G.; Schwenke, David W.

    1991-01-01

    The thermal rate coefficient for the prototype reaction H + H2 yields H2 + H with zero total angular momentum is calculated by summing, averaging, and numerically integrating state-to-state reaction probabilities calculated by time-independent quantum-mechanical scattering theory. The results are very carefully converged with respect to all numerical parameters in order to provide high-precision benchmark results for confirming the accuracy of new methods and testing their efficiency.
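
    The quantity converged in such benchmarks is, schematically, the Boltzmann average of the cumulative reaction probability, k(T) = [h Q_r(T)]^(-1) ∫ N(E) exp(-E / k_B T) dE. Below is a minimal sketch of that average, with a smoothed-step N(E) standing in for the converged sums over state-to-state probabilities; all numbers are illustrative, not the benchmark values of the paper.

```python
import numpy as np

KB_H = 3.166811563e-6     # Boltzmann constant, hartree/K
H_AU = 2.0 * np.pi        # Planck constant in atomic units (hbar = 1)

def thermal_rate(T, E, N, Q_reactants):
    """k(T) = (1 / (h * Q_r)) * integral N(E) * exp(-E / kB T) dE  (atomic units),
    evaluated with a simple trapezoidal rule."""
    f = N * np.exp(-E / (KB_H * T))
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))
    return integral / (H_AU * Q_reactants)

# Schematic cumulative reaction probability: a smoothed step at a 0.015 hartree barrier.
E = np.linspace(0.0, 0.1, 2000)
N = 1.0 / (1.0 + np.exp(-(E - 0.015) / 0.001))

for T in (200.0, 300.0, 600.0):
    print(T, thermal_rate(T, E, N, Q_reactants=1.0e3))
```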

  7. Students' motivational processes and their relationship to teacher ratings in school physical education: a self-determination theory approach.

    Science.gov (United States)

    Standage, Martyn; Duda, Joan L; Ntoumanis, Nikos

    2006-03-01

    In the present study, we used a model of motivation grounded in self-determination theory (Deci & Ryan, 1985, 1991; Ryan & Deci, 2000a, 2000b, 2002) to examine the relationship between physical education (PE) students' motivational processes and ratings of their effort and persistence as provided by their PE teacher. Data were obtained from 394 British secondary school students (204 boys, 189 girls, 1 gender not specified; M age = 11.97 years; SD = .89; range = 11-14 years) who responded to a multisection inventory (tapping autonomy-support, autonomy, competence, relatedness, and self-determined motivation). The students' respective PE teachers subsequently provided ratings reflecting the effort and persistence each student exhibited in their PE classes. The hypothesized relationships among the study variables were examined via structural equation modeling analysis using latent factors. Results of maximum likelihood analysis using the bootstrapping method revealed that the proposed model demonstrated a good fit to the data (chi-squared(292) = 632.68). Student-reported levels of self-determined motivation positively predicted teacher ratings of effort and persistence in PE. The findings are discussed with regard to enhancing student motivation in PE settings.

  8. Two-matrix models and c =1 string theory

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong Chuansheng

    1994-05-01

    We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)

  9. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  10. A cluster randomized theory-guided oral hygiene trial in adolescents-A latent growth model.

    Science.gov (United States)

    Aleksejūnienė, J; Brukienė, V

    2018-05-01

    (i) To test whether theory-guided interventions are more effective than conventional dental instruction (CDI) for changing oral hygiene in adolescents and (ii) to examine whether such interventions equally benefit both genders and different socio-economic (SES) groups. A total of 244 adolescents were recruited from three schools, and cluster randomization allocated adolescents to one of the three types of interventions: two were theory-based interventions (Precaution Adoption Process Model or Authoritative Parenting Model) and CDI served as an active control. Oral hygiene levels (OH, %) were assessed at baseline, after 3 months and after 12 months. A complete data set was available for 166 adolescents (total follow-up rate: 69%). There were no significant differences in baseline OH between those who participated throughout the study and those who dropped out. Bivariate and multivariate analyses showed that theory-guided interventions produced significant improvements in oral hygiene and that there were no significant gender or socio-economic differences. Theory-guided interventions produced more positive changes in OH than CDI, and these changes did not differ between gender and SES groups. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory

    International Nuclear Information System (INIS)

    Chung, S.; Tye, S.H.

    1993-01-01

    The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L × G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L × (G/H_R)_R coset models in conformal field theory.

  12. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Sissay, Adonay [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J. [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Lopata, Kenneth, E-mail: klopata@lsu.edu [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.
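
    The core numerical idea, absorbing outgoing amplitude with a complex potential and reading the ionization yield off the decay of the wavefunction norm, can be illustrated with a 1D grid toy model (soft-core atom, split-operator propagation). This is a generic sketch, not the GTO basis or TDDFT density-matrix machinery of the paper; field parameters, grid, and absorber strength are all illustrative.

```python
import numpy as np

# 1D grid, atomic units
N, L = 2048, 400.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
dt = 0.05

V0 = -1.0 / np.sqrt(x**2 + 2.0)                                # soft-core "atom"
cap = -1j * 0.01 * np.clip(np.abs(x) - 150.0, 0, None) ** 2    # absorber near the edges

# Crude ground state by first-order imaginary-time propagation.
psi = np.exp(-x**2)
for _ in range(2000):
    psi = psi * np.exp(-V0 * dt)
    psi = np.fft.ifft(np.exp(-0.5 * k**2 * dt) * np.fft.fft(psi))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Real-time split-operator propagation in a few-cycle sin^2-envelope laser field.
E0, omega = 0.08, 0.057
T = 4 * 2 * np.pi / omega
norm = []
for step in range(int(T / dt)):
    t = step * dt
    Et = E0 * np.sin(np.pi * t / T) ** 2 * np.sin(omega * t)
    V = V0 + Et * x + cap                                      # length gauge + CAP
    psi = psi * np.exp(-1j * V * dt / 2)
    psi = np.fft.ifft(np.exp(-1j * 0.5 * k**2 * dt) * np.fft.fft(psi))
    psi = psi * np.exp(-1j * V * dt / 2)
    norm.append(np.sum(np.abs(psi)**2) * dx)

print("ionization yield:", 1.0 - norm[-1])   # fraction absorbed by the CAP
```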

  13. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    International Nuclear Information System (INIS)

    Sissay, Adonay; Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J.; Lopata, Kenneth

    2016-01-01

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  14. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  15. Crisis in Context Theory: An Ecological Model

    Science.gov (United States)

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  16. Using NIF to Test Theories of High-Pressure, High-Rate Plastic Flow in Metals

    Science.gov (United States)

    Rudd, Robert E.; Arsenlis, A.; Cavallo, R. M.; Huntington, C. M.; McNaney, J. M.; Park, H. S.; Powell, P.; Prisbrey, S. T.; Remington, B. A.; Swift, D.; Wehrenberg, C. E.; Yang, L.

    2017-10-01

    Precisely controlled plasmas are playing key roles both as pump and probe in experiments to understand the strength of solid metals at high energy density (HED) conditions. In concert with theoretical advances, these experiments have enabled a predictive capability to model material strength at Mbar pressures and high strain rates. Here we describe multiscale strength models developed for tantalum starting with atomic bonding and extending up through the mobility of individual dislocations, the evolution of dislocation networks and so on until the ultimate material response at the scale of an experiment. Experiments at the National Ignition Facility (NIF) probe strength in metals ramp compressed to 1-8 Mbar. The model is able to predict 1 Mbar experiments without adjustable parameters. The combination of experiment and theory has shown that solid metals can behave significantly differently at HED conditions. We also describe recent studies of lead compressed to 3-5 Mbar. Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA273.

  17. Chern-Simons Theory, Matrix Models, and Topological Strings

    International Nuclear Information System (INIS)

    Walcher, J

    2006-01-01

    This book is a find. Marino meets the challenge of filling in less than 200 pages the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U (∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may

  18. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    Science.gov (United States)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes, which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and have to ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored

  19. Rate turnover in mechano-catalytic coupling: A model and its microscopic origin

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Mahua; Grazioli, Gianmarc; Andricioaei, Ioan, E-mail: andricio@uci.edu [Department of Chemistry, University of California, Irvine, California 92697 (United States)

    2015-07-28

    A novel aspect in the area of mechano-chemistry concerns the effect of external forces on enzyme activity, i.e., the existence of mechano-catalytic coupling. Recent experiments on enzyme-catalyzed disulphide bond reduction in proteins under the effect of a force applied on the termini of the protein substrate reveal an unexpected biphasic force dependence for the bond cleavage rate. Here, using atomistic molecular dynamics simulations combined with Smoluchowski theory, we propose a model for this behavior. For a broad range of forces and systems, the model reproduces the experimentally observed rates by solving a reaction-diffusion equation for a “protein coordinate” diffusing in a force-dependent effective potential. The atomistic simulations are used to compute, from first principles, the parameters of the model via a quasiharmonic analysis. Additionally, the simulations are also used to provide details about the microscopic degrees of freedom that are important for the underlying mechano-catalysis.
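
    In the overdamped (Smoluchowski) picture, the rate can be read off as the inverse mean first-passage time of a coordinate diffusing in a force-tilted effective potential, tau(F) = D^(-1) ∫_a^b dy exp(beta U_F(y)) ∫_a^y dx exp(-beta U_F(x)), with U_F(q) = U(q) - F q. Below is a minimal sketch using an illustrative bistable U(q); it is not the effective potential or parameters constructed in the paper.

```python
import numpy as np

def mfpt(U, a, b, D=1.0, beta=1.0, n=4000):
    """Mean first-passage time from a (reflecting) to b (absorbing) for
    overdamped diffusion in potential U, via the standard double integral."""
    y = np.linspace(a, b, n)
    dy = y[1] - y[0]
    inner = np.cumsum(np.exp(-beta * U(y))) * dy          # int_a^y exp(-beta U) dx
    return np.sum(np.exp(beta * U(y)) * inner) * dy / D

# Illustrative tilted double well: U_F(q) = q^4 - 2 q^2 - F q (reactant well near q = -1).
for F in (0.0, 0.5, 1.0, 2.0):
    UF = lambda q, F=F: q**4 - 2.0 * q**2 - F * q
    tau = mfpt(UF, a=-2.0, b=1.0, beta=4.0)
    print(F, 1.0 / tau)   # a simple tilt gives a monotonic rate increase with force;
                          # the biphasic behaviour discussed above requires the
                          # effective potential itself to depend on force.
```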

  20. Rate turnover in mechano-catalytic coupling: A model and its microscopic origin

    International Nuclear Information System (INIS)

    Roy, Mahua; Grazioli, Gianmarc; Andricioaei, Ioan

    2015-01-01

    A novel aspect in the area of mechano-chemistry concerns the effect of external forces on enzyme activity, i.e., the existence of mechano-catalytic coupling. Recent experiments on enzyme-catalyzed disulphide bond reduction in proteins under the effect of a force applied on the termini of the protein substrate reveal an unexpected biphasic force dependence for the bond cleavage rate. Here, using atomistic molecular dynamics simulations combined with Smoluchowski theory, we propose a model for this behavior. For a broad range of forces and systems, the model reproduces the experimentally observed rates by solving a reaction-diffusion equation for a “protein coordinate” diffusing in a force-dependent effective potential. The atomistic simulations are used to compute, from first principles, the parameters of the model via a quasiharmonic analysis. Additionally, the simulations are also used to provide details about the microscopic degrees of freedom that are important for the underlying mechano-catalysis

  1. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  2. A mathematical framework for yield (vs. rate) optimization in constraint-based modeling and applications in metabolic engineering.

    Science.gov (United States)

    Klamt, Steffen; Müller, Stefan; Regensburger, Georg; Zanghellini, Jürgen

    2018-02-07

    The optimization of metabolic rates (as linear objective functions) represents the methodical core of flux-balance analysis techniques which have become a standard tool for the study of genome-scale metabolic models. Besides (growth and synthesis) rates, metabolic yields are key parameters for the characterization of biochemical transformation processes, especially in the context of biotechnological applications. However, yields are ratios of rates, and hence the optimization of yields (as nonlinear objective functions) under arbitrary linear constraints is not possible with current flux-balance analysis techniques. Despite the fundamental importance of yields in constraint-based modeling, a comprehensive mathematical framework for yield optimization is still missing. We present a mathematical theory that allows one to systematically compute and analyze yield-optimal solutions of metabolic models under arbitrary linear constraints. In particular, we formulate yield optimization as a linear-fractional program. For practical computations, we transform the linear-fractional yield optimization problem to a (higher-dimensional) linear problem. Its solutions determine the solutions of the original problem and can be used to predict yield-optimal flux distributions in genome-scale metabolic models. For the theoretical analysis, we consider the linear-fractional problem directly. Most importantly, we show that the yield-optimal solution set (like the rate-optimal solution set) is determined by (yield-optimal) elementary flux vectors of the underlying metabolic model. However, yield- and rate-optimal solutions may differ from each other, and hence optimal (biomass or product) yields are not necessarily obtained at solutions with optimal (growth or synthesis) rates. Moreover, we discuss phase planes/production envelopes and yield spaces, in particular, we prove that yield spaces are convex and provide algorithms for their computation. We illustrate our findings by a small
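
    The computational step described above can be made concrete with a Charnes–Cooper-style substitution that turns the fractional objective into a linear program. Below is a minimal sketch for maximizing a yield (c·v)/(d·v) subject to S v = 0 and flux bounds, using scipy.optimize.linprog; the toy stoichiometry, bounds, and variable names are illustrative assumptions, not the formulation or data of the paper.

```python
import numpy as np
from scipy.optimize import linprog

def max_yield(S, lb, ub, c, d):
    """Maximize (c.v)/(d.v) s.t. S v = 0, lb <= v <= ub, d.v > 0,
    via the substitution w = t*v, t = 1/(d.v):
        max c.w  s.t.  S w = 0,  d.w = 1,  lb*t <= w <= ub*t,  t >= 0."""
    m, n = S.shape
    # Variables z = [w (n entries), t (1 entry)].
    A_eq = np.zeros((m + 1, n + 1))
    A_eq[:m, :n] = S                           # S w = 0
    A_eq[m, :n] = d                            # d.w = 1
    b_eq = np.zeros(m + 1); b_eq[m] = 1.0
    # Flux bounds as inequalities: w - ub*t <= 0 and lb*t - w <= 0.
    A_ub = np.zeros((2 * n, n + 1))
    A_ub[:n, :n] = np.eye(n);  A_ub[:n, n] = -ub
    A_ub[n:, :n] = -np.eye(n); A_ub[n:, n] = lb
    b_ub = np.zeros(2 * n)
    res = linprog(c=-np.append(c, 0.0), A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] * n + [(0, None)])
    w, t = res.x[:n], res.x[n]
    return w / t, -res.fun                     # yield-optimal flux vector, optimal yield

# Toy network: substrate uptake v0, product secretion v1, byproduct v2,
# with a single internal balance v0 = v1 + v2 and a forced minimal byproduct flux.
S  = np.array([[1.0, -1.0, -1.0]])
lb = np.array([0.0, 0.0, 0.2])
ub = np.array([10.0, 10.0, 10.0])
c  = np.array([0.0, 1.0, 0.0])                 # product rate
d  = np.array([1.0, 0.0, 0.0])                 # substrate uptake rate
v_opt, y_opt = max_yield(S, lb, ub, c, d)
print(v_opt, y_opt)
```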

  3. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and exogeneity in the economic model corresponds to exogeneity in the CVAR. The parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.

  4. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  5. Origins of Discrepancies Between Kinetic Rate Law Theory and Experiments in the Na2O-B2O3-SiO2 System

    International Nuclear Information System (INIS)

    McGrail, B. Peter; Icenhower, Jonathan P.; Rodriguez, Elsa A.; McGrail, B.P.; Cragnolino, G.A.

    2002-01-01

    Discrepancies between classical kinetic rate law theory and experiment were quantitatively assessed and found to correlate with macromolecular amorphous separation in the sodium borosilicate glass system. A quantitative reinterpretation of static corrosion data and new SPFT data shows that a recently advanced protective surface layer theory fails to describe the observed dissolution behavior of simple and complex silicate glasses under carefully controlled experimental conditions. The hypothesis is shown to be self-inconsistent in contrast with a phase separation model that is in quantitative agreement with experiments

  6. Inflation Rate Modelling in Indonesia

    Directory of Open Access Journals (Sweden)

    Rezzy Eko Caraka

    2016-10-01

    Full Text Available The purposes of this research were to analyse: (i) modelling the inflation rate in Indonesia with parametric regression; (ii) modelling the inflation rate in Indonesia using non-parametric multivariable spline regression; (iii) determining the best model of the inflation rate in Indonesia; and (iv) explaining the relationship between the parametric and the non-parametric multivariable spline regression models of inflation. Based on the analysis using the two methods mentioned, the coefficient of determination (R²) in parametric regression was 65.1%, while in non-parametric regression it amounted to 99.39%. The factors of money supply (money stock), crude oil prices and the rupiah exchange rate against the dollar are significant for the rate of inflation. The stability of inflation is essential to support sustainable economic development and improve people's welfare. In conclusion, unstable inflation complicates the planning of business activities, both in production and investment activities as well as in the pricing of goods and services produced. DOI: 10.15408/etk.v15i2.3260

  7. Predictive transport modelling of type I ELMy H-mode dynamics using a theory-motivated combined ballooning-peeling model

    International Nuclear Information System (INIS)

    Loennroth, J-S; Parail, V; Dnestrovskij, A; Figarella, C; Garbet, X; Wilson, H

    2004-01-01

    This paper discusses predictive transport simulations of the type I ELMy high confinement mode (H-mode) with a theory-motivated edge localized mode (ELM) model based on linear ballooning and peeling mode stability theory. In the model, a total mode amplitude is calculated as a sum of the individual mode amplitudes given by two separate linear differential equations for the ballooning and peeling mode amplitudes. The ballooning and peeling mode growth rates are represented by mutually analogous terms, which differ from zero upon the violation of a critical pressure gradient and an analytical peeling mode stability criterion, respectively. The damping of the modes due to non-ideal magnetohydrodynamic effects is controlled by a term driving the mode amplitude towards the level of background fluctuations. Coupled to simulations with the JETTO transport code, the model qualitatively reproduces the experimental dynamics of type I ELMy H-mode, including an ELM frequency that increases with the external heating power. The dynamics of individual ELM cycles is studied. Each ELM is usually triggered by a ballooning mode instability. The ballooning phase of the ELM reduces the pressure gradient enough to make the plasma peeling unstable, whereby the ELM continues driven by the peeling mode instability, until the edge current density has been depleted to a stable level. Simulations with current ramp-up and ramp-down are studied as examples of situations in which pure peeling and pure ballooning mode ELMs, respectively, can be obtained. The sensitivity with respect to the ballooning and peeling mode growth rates is investigated. Some consideration is also given to an alternative formulation of the model as well as to a pure peeling model

  8. Eye growth and myopia development: Unifying theory and Matlab model.

    Science.gov (United States)

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal

    2016-03-01

    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1) unregulated eye growth: blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2) regulated eye growth: blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3) repeated near-far viewing: simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4) neurochemical bulk flow and diffusion: release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of the neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5) Simulink model: model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. These model simulation programs

  9. Constitutive law for seismicity rate based on rate and state friction: Dieterich 1994 revisited.

    Science.gov (United States)

    Heimisson, E. R.; Segall, P.

    2017-12-01

    Dieterich [1994] derived a constitutive law for seismicity rate based on rate and state friction, which has been applied widely to aftershocks, earthquake triggering, and induced seismicity in various geological settings. Here, this influential work is revisited, and re-derived in a more straightforward manner. By virtue of this new derivation the model is generalized to include changes in effective normal stress associated with background seismicity. Furthermore, the general case when seismicity rate is not constant under constant stressing rate is formulated. The new derivation directly provides practical integral expressions for the cumulative number of events and the rate of seismicity for an arbitrary stressing history. Arguably, the most prominent limitation of Dieterich's 1994 theory is the assumption that seismic sources do not interact. Here we derive a constitutive relationship that considers source interactions between sub-volumes of the crust, where the stress in each sub-volume is assumed constant. Interactions are considered both under constant stressing rate conditions and for arbitrary stressing history. This theory can be used to model seismicity rate due to stress changes, or to estimate stress changes using observed seismicity from triggered earthquake swarms where earthquake interactions and magnitudes are taken into account. We identify special conditions under which the influence of interactions cancels and the predictions reduce to those of Dieterich 1994. This remarkable result may explain the apparent success of the model when applied to observations of triggered seismicity. This approach has application to understanding and modeling induced and triggered seismicity, and the quantitative interpretation of geodetic and seismic data. It enables simultaneous modeling of geodetic and seismic data in a self-consistent framework. To date physics-based modeling of seismicity with or without geodetic data has been found to give insight into various processes
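
    For orientation, the widely quoted special case of the 1994 constitutive law, namely the seismicity-rate response to a sudden stress step applied against a constant background stressing rate at constant effective normal stress, is usually written as (sign conventions vary between authors):

    $$
    R(t) \;=\; \frac{r}{\left[\exp\!\left(-\dfrac{\Delta\tau}{A\sigma}\right)-1\right]\exp\!\left(-\dfrac{t}{t_a}\right)+1},
    \qquad t_a \;=\; \frac{A\sigma}{\dot{\tau}_r},
    $$

    where r is the background rate, Δτ the stress step, A the rate-and-state constitutive parameter, σ the effective normal stress, and t_a the characteristic aftershock duration. The generalizations in the abstract relax the constant-σ and non-interacting-source assumptions that underlie this expression.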

  10. A Leadership Identity Development Model: Applications from a Grounded Theory

    Science.gov (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  11. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  12. Beta-decay rate and beta-delayed neutron emission probability of improved gross theory

    Science.gov (United States)

    Koura, Hiroyuki

    2014-09-01

    A theoretical study has been carried out on the beta-decay rate and the beta-delayed neutron emission probability. The gross theory of beta decay is based on the idea of a sum rule for the beta-decay strength function, and has succeeded in describing beta-decay half-lives of nuclei over the whole nuclear mass region. The gross theory includes not only the allowed transitions (Fermi and Gamow-Teller) but also the first-forbidden transitions. In this work, some improvements are introduced, namely a nuclear shell correction on the nuclear level densities and nuclear deformation in the nuclear strength functions; these effects were not included in the original gross theory. The shell energy and the nuclear deformation for unmeasured nuclei are adopted from the KTUY nuclear mass formula, which is based on the spherical-basis method. Considering the properties of the integrated Fermi function, we can roughly categorize the excitation-energy range of the daughter nucleus into three regions: a highly excited region, which fully affects the delayed neutron probability; a middle region, which is estimated to contribute to the decay heat; and a region neighbouring the ground state, which determines the beta-decay rate. Some results will be given in the presentation.

  13. Theoretical and expert system approach to photoionization theories

    Directory of Open Access Journals (Sweden)

    Petrović Ivan D.

    2016-01-01

    Full Text Available The influence of the ponderomotive and the Stark shifts on the tunneling transition rate was observed, for a non-relativistic linearly polarized laser field and alkali atoms, with three different theoretical models: the Keldysh theory, the Perelomov, Popov, Terent'ev (PPT) theory, and the Ammosov, Delone, Krainov (ADK) theory. We showed that the aforementioned shifts affect the transition rate differently for the different approaches. Finally, we presented a simple expert system for the analysis of photoionization theories.

  14. Topos models for physics and topos theory

    International Nuclear Information System (INIS)

    Wolters, Sander

    2014-01-01

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos

  15. Finite Unification: Theory, Models and Predictions

    CERN Document Server

    Heinemeyer, S; Zoupanos, G

    2011-01-01

    All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...

  16. Developing and exploring a theory for the lateral erosion of bedrock channels for use in landscape evolution models

    Directory of Open Access Journals (Sweden)

    A. L. Langston

    2018-01-01

    Full Text Available Understanding how a bedrock river erodes its banks laterally is a frontier in geomorphology. Theories for the vertical incision of bedrock channels are widely implemented in the current generation of landscape evolution models. However, in general existing models do not seek to implement the lateral migration of bedrock channel walls. This is problematic, as modeling geomorphic processes such as terrace formation and hillslope–channel coupling depends on the accurate simulation of valley widening. We have developed and implemented a theory for the lateral migration of bedrock channel walls in a catchment-scale landscape evolution model. Two model formulations are presented, one representing the slow process of widening a bedrock canyon and the other representing undercutting, slumping, and rapid downstream sediment transport that occurs in softer bedrock. Model experiments were run with a range of values for bedrock erodibility and tendency towards transport- or detachment-limited behavior and varying magnitudes of sediment flux and water discharge in order to determine the role that each plays in the development of wide bedrock valleys. The results show that this simple, physics-based theory for the lateral erosion of bedrock channels produces bedrock valleys that are many times wider than the grid discretization scale. This theory for the lateral erosion of bedrock channel walls and the numerical implementation of the theory in a catchment-scale landscape evolution model is a significant first step towards understanding the factors that control the rates and spatial extent of wide bedrock valleys.

  17. Developing and exploring a theory for the lateral erosion of bedrock channels for use in landscape evolution models

    Science.gov (United States)

    Langston, Abigail L.; Tucker, Gregory E.

    2018-01-01

    Understanding how a bedrock river erodes its banks laterally is a frontier in geomorphology. Theories for the vertical incision of bedrock channels are widely implemented in the current generation of landscape evolution models. However, in general existing models do not seek to implement the lateral migration of bedrock channel walls. This is problematic, as modeling geomorphic processes such as terrace formation and hillslope-channel coupling depends on the accurate simulation of valley widening. We have developed and implemented a theory for the lateral migration of bedrock channel walls in a catchment-scale landscape evolution model. Two model formulations are presented, one representing the slow process of widening a bedrock canyon and the other representing undercutting, slumping, and rapid downstream sediment transport that occurs in softer bedrock. Model experiments were run with a range of values for bedrock erodibility and tendency towards transport- or detachment-limited behavior and varying magnitudes of sediment flux and water discharge in order to determine the role that each plays in the development of wide bedrock valleys. The results show that this simple, physics-based theory for the lateral erosion of bedrock channels produces bedrock valleys that are many times wider than the grid discretization scale. This theory for the lateral erosion of bedrock channel walls and the numerical implementation of the theory in a catchment-scale landscape evolution model is a significant first step towards understanding the factors that control the rates and spatial extent of wide bedrock valleys.

  18. Scattering and short-distance properties in field theory models

    International Nuclear Information System (INIS)

    Iagolnitzer, D.

    1987-01-01

    The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap, such as typically P(φ) in dimension 2, φ⁴ in dimension 3 and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non-renormalizable) theories that might be defined in a similar way via phase-space analysis

  19. Time-dependent perturbation theory for nonequilibrium lattice models

    International Nuclear Information System (INIS)

    Jensen, I.; Dickman, R.

    1993-01-01

    The authors develop a time-dependent perturbation theory for nonequilibrium interacting particle systems. They focus on models such as the contact process which evolve via destruction and autocatalytic creation of particles. At a critical value of the destruction rate there is a continuous phase transition between an active steady state and the vacuum state, which is absorbing. They present several methods for deriving series for the evolution starting from a single seed particle, including expansions for the ultimate survival probability in the super- and subcritical regions, expansions for the average number of particles in the subcritical region, and short-time expansions. Algorithms for computer generation of the various expansions are presented. Rather long series (24 terms or more) and precise estimates of critical parameters are presented. 45 refs., 4 figs., 9 tabs

  20. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  1. Diffusive epidemic process: theory and simulation

    International Nuclear Information System (INIS)

    Maia, Daniel Souza; Dickman, Ronald

    2007-01-01

    We study the continuous absorbing-state phase transition in the one-dimensional diffusive epidemic process via mean-field theory and Monte Carlo simulation. In this model, particles of two species (A and B) hop on a lattice and undergo reactions B → A and A+B → 2B; the total particle number is conserved. We formulate the model as a continuous-time Markov process described by a master equation. A phase transition between the (absorbing) B-free state and an active state is observed as the parameters (reaction and diffusion rates, and total particle density) are varied. Mean-field theory reveals a surprising, nonmonotonic dependence of the critical recovery rate on the diffusion rate of B particles. A computational realization of the process that is faithful to the transition rates defining the model is devised, allowing for direct comparison with theory. Using the quasi-stationary simulation method we determine the order parameter and the survival time in systems of up to 4000 sites. Due to strong finite-size effects, the results converge only for large system sizes. We find no evidence for a discontinuous transition. Our results are consistent with the existence of three distinct universality classes, depending on whether A particles diffuse more rapidly, less rapidly or at the same rate as B particles. We also perform quasi-stationary simulations of the triplet creation model, which yield results consistent with a discontinuous transition at high diffusion rates
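
    As a rough orientation (not taken from the paper), the simplest homogeneous mean-field equations for the reactions B → A at rate μ and A + B → 2B at rate λ read

    $$
    \frac{d\rho_B}{dt} \;=\; \lambda\,\rho_A\,\rho_B \;-\; \mu\,\rho_B,
    \qquad
    \frac{d\rho_A}{dt} \;=\; -\,\lambda\,\rho_A\,\rho_B \;+\; \mu\,\rho_B,
    $$

    with conserved total density ρ = ρ_A + ρ_B, so the absorbing (B-free) state loses stability at a critical recovery rate μ_c = λρ. Diffusion drops out at this crudest level; the dependence on the B diffusion rate reported in the abstract presumably appears only once spatial structure and unequal hopping rates are retained in the mean-field treatment.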

  2. Experimental Study and Modelling of Poly (Methyl Methacrylate) and Polycarbonate Compressive Behavior from Low to High Strain Rates

    Science.gov (United States)

    El-Qoubaa, Z.; Colard, L.; Matadi Boumbimba, R.; Rusinek, A.

    2018-03-01

    This paper concerns an experimental investigation of polycarbonate and poly(methyl methacrylate) compressive behavior from low to high strain rates. Experiments were conducted from 0.001/s to ≈ 5000/s for PC and from 0.001/s to ≈ 2000/s for PMMA. The true stress-strain behavior is established and analyzed at various strain rates. As expected, the mechanical behavior of both PC and PMMA is strain-rate and temperature dependent. The DSGZ model is selected for modelling the stress-strain curves, while the yield stress is reproduced using the cooperative model and a modified Eyring equation based on Eyring's first-process theory. All three models' predictions are in agreement with the experiments performed on PC and PMMA.
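
    For orientation, the single-process Eyring relation that underlies both the cooperative model and a modified Eyring equation of the kind used here is commonly written (numerical factors vary between conventions) as

    $$
    \dot{\varepsilon} \;=\; \dot{\varepsilon}_0\,\exp\!\left(-\frac{\Delta H}{kT}\right)\sinh\!\left(\frac{\sigma_y V^{*}}{2kT}\right)
    \quad\Longleftrightarrow\quad
    \sigma_y \;=\; \frac{2kT}{V^{*}}\,\sinh^{-1}\!\left[\frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\exp\!\left(\frac{\Delta H}{kT}\right)\right],
    $$

    where V* is an activation volume and ΔH an activation enthalpy. At high stresses the inverse hyperbolic sine reduces to a logarithm, giving the familiar linear growth of σ_y/T with the logarithm of strain rate; the increase in slope observed at high rates is commonly captured either by adding a second process or, as in studies of this kind, by modifying the single-process form.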

  3. Robust global identifiability theory using potentials--Application to compartmental models.

    Science.gov (United States)

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

    This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3% which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens, to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Mathematical finance theory review and exercises from binomial model to risk measures

    CERN Document Server

    Gianin, Emanuela Rosazza

    2013-01-01

    The book collects over 120 exercises on different subjects of Mathematical Finance, including Option Pricing, Risk Theory, and Interest Rate Models. Many of the exercises are solved, while others are only proposed. Every chapter contains an introductory section illustrating the main theoretical results necessary to solve the exercises. The book is intended as an exercise textbook to accompany graduate courses in mathematical finance offered at many universities as part of degree programs in Applied and Industrial Mathematics, Mathematical Engineering, and Quantitative Finance.
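
    As an example of the kind of exercise such a text covers (our own illustrative sketch, not taken from the book; all parameter values are made up), a European call can be priced by backward induction on a Cox-Ross-Rubinstein binomial tree:

```python
# Minimal Cox-Ross-Rubinstein binomial pricing sketch (illustrative only).
import math

def crr_european_call(S0, K, r, sigma, T, n):
    """Price a European call on an n-step binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    q = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs, indexed by the number of up moves j
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # backward induction through the tree
    for step in range(n, 0, -1):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(step)]
    return values[0]

print(crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))
```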

  5. Application of activated barrier hopping theory to viscoplastic modeling of glassy polymers

    Science.gov (United States)

    Sweeney, J.; Spencer, P. E.; Vgenopoulos, D.; Babenko, M.; Boutenel, F.; Caton-Rose, P.; Coates, P. D.

    2017-10-01

    An established statistical mechanical theory of amorphous polymer deformation has been incorporated as a plastic mechanism into a constitutive model and applied to a range of polymer mechanical deformations. The temperature and rate dependence of the tensile yield of PVC, as reported in early studies, has been modeled to high levels of accuracy. Tensile experiments on PET reported here are analyzed similarly and good accuracy is also achieved. The frequently observed increase in the gradient of the plot of yield stress against logarithm of strain rate is an inherent feature of the constitutive model. The form of temperature dependence of the yield that is predicted by the model is found to give an accurate representation. The constitutive model is developed in two-dimensional form and implemented as a user-defined subroutine in the finite element package ABAQUS. This analysis is applied to the tensile experiments on PET, in some of which strain is localized in the form of shear bands and necks. These deformations are modeled with partial success, though adiabatic heating of the instability causes inaccuracies for this isothermal implementation of the model. The plastic mechanism has advantages over the Eyring process, is equally tractable, and presents no particular difficulties in implementation with finite elements.

  6. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  7. Modelling non-ignorable missing data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cornelis A.W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  8. Modelling non-ignorable missing-data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cees A. W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  9. Nematic elastomers: from a microscopic model to macroscopic elasticity theory.

    Science.gov (United States)

    Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette

    2008-05-01

    A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.

  10. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game-theoretical tools, highlighting situations where a location decision is faced by several decision makers and leading to a game-theoretical framework in non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored, and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  11. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simple, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  13. An anthology of theories and models of design philosophy, approaches and empirical explorations

    CERN Document Server

    Blessing, Lucienne

    2014-01-01

    While investigations into both theories and models have remained a major strand of engineering design research, current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models, and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; current models of design, from a function behavior structure model to an integrated model; important empirical research findings from studies into design; and philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...

  14. Time-dependent shell-model theory of dissipative heavy-ion collisions

    International Nuclear Information System (INIS)

    Ayik, S.; Noerenberg, W.

    1982-01-01

    A transport theory is formulated within a time-dependent shell-model approach. Time averaging of the equations for macroscopic quantities lead to irreversibility and justifies weak-coupling limit and Markov approximation for the (energy-conserving) one- and two-body collision terms. Two coupled equations for the occupation probabilities of dynamical single-particle states and for the collective variable are derived and explicit formulas for transition rates, dynamical forces, mass parameters and friction coefficients are given. The applicability of the formulation in terms of characteristic quantities of nuclear systems is considered in detail and some peculiarities due to memory effects in the initial equilibration process of heavy-ion collisions are discussed. (orig.)

  15. Spin foam model for pure gauge theory coupled to quantum gravity

    International Nuclear Information System (INIS)

    Oriti, Daniele; Pfeiffer, Hendryk

    2002-01-01

    We propose a spin foam model for pure gauge fields coupled to Riemannian quantum gravity in four dimensions. The model is formulated for the triangulation of a four-manifold which is given merely combinatorially. The Riemannian Barrett-Crane model provides the gravity sector of our model and dynamically assigns geometric data to the given combinatorial triangulation. The gauge theory sector is a lattice gauge theory living on the same triangulation and obtains from the gravity sector the geometric information which is required to calculate the Yang-Mills action. The model is designed so that one obtains a continuum approximation of the gauge theory sector at an effective level, similarly to the continuum limit of lattice gauge theory, when the typical length scale of gravity is much smaller than the Yang-Mills scale

  16. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  17. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  18. Mechanical strength model for plastic bonded granular materials at high strain rates and large strains

    International Nuclear Information System (INIS)

    Browning, R.V.; Scammon, R.J.

    1998-01-01

    Modeling impact events on systems containing plastic bonded explosive materials requires accurate models for stress evolution at high strain rates out to large strains. For example, in the Steven test geometry reactions occur after strains of 0.5 or more are reached for PBX-9501. The morphology of this class of materials and properties of the constituents are briefly described. We then review the viscoelastic behavior observed at small strains for this class of material, and evaluate large strain models used for granular materials such as cap models. Dilatation under shearing deformations of the PBX is experimentally observed and is one of the key features modeled in cap style plasticity theories, together with bulk plastic flow at high pressures. We propose a model that combines viscoelastic behavior at small strains but adds intergranular stresses at larger strains. A procedure using numerical simulations and comparisons with results from flyer plate tests and low rate uniaxial stress tests is used to develop a rough set of constants for PBX-9501. Comparisons with the high rate flyer plate tests demonstrate that the observed characteristic behavior is captured by this viscoelastic based model. copyright 1998 American Institute of Physics

  19. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problem are closely related subjects since any computing method devised for ... AJOL African Journals Online.

  20. Revalidating the Arabic Scale for Teachers' Ratings of Basic Education Gifted Students' Characteristics Using Rasch Modeling

    Directory of Open Access Journals (Sweden)

    Salah Eldin Farah Atallah Bakheit

    2013-12-01

    Full Text Available The Arabic scale for teachers' ratings of basic education gifted students' characteristics is one of the most common Arabic measures used for initial identification of gifted students in some Arabic countries. One of the shortcomings of this scale is that it is based on the classical theory of measurement. This study sought to revalidate the scale in the light of Rasch modeling, which rests upon the modern theory of measurement, and to develop different criteria for interpreting the levels of individuals' traits. The scale was administered to 830 Basic Education students in Khartoum (ages ranged from 7 to 12 years). Two groups of students participated in the study: a calibration sample (N = 250) and a standardization sample (N = 580). The statistical treatments were performed using the PASW 18 and RUMM 2020 programs according to Rasch's unidimensional model. Six of the scale's items were deleted for not conforming to Rasch modeling. This left the scale with 31 items. Besides, new criteria for the scale were developed by obtaining the t-scores and special education scores that match the various ratings of the individuals' ability.
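
    For reference (shown here in its dichotomous form as a standard result; the teacher-rating items are polytomous, but the rating-scale extension follows the same logic), the Rasch model specifies the probability of a positive response as

    $$
    P\left(X_{pi}=1 \mid \theta_p, b_i\right) \;=\; \frac{\exp\!\left(\theta_p-b_i\right)}{1+\exp\!\left(\theta_p-b_i\right)},
    $$

    where θ_p is the latent trait level of student p and b_i the difficulty (endorsability) of item i. Items are retained only when their observed responses are consistent with this single-parameter logistic form, which is the kind of fit criterion under which the six misfitting items were removed here.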

  1. sigma model approach to the heterotic string theory

    International Nuclear Information System (INIS)

    Sen, A.

    1985-09-01

    Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs

  2. The Prediction of Exchange Rates with the Use of Auto-Regressive Integrated Moving-Average Models

    Directory of Open Access Journals (Sweden)

    Daniela Spiesová

    2014-10-01

    Full Text Available The currency market is currently the largest market in the world, and over its existence many theories have been advanced for predicting the development of exchange rates based on macroeconomic, microeconomic, statistical and other models. The aim of this paper is to identify an adequate model for the prediction of non-stationary time series of exchange rates and then use this model to predict the trend of the development of European currencies against the euro. The uniqueness of this paper lies in the fact that there are many expert studies dealing with the prediction of the rates of currency pairs involving the American dollar, but only a limited number of scientific studies concerned with the long-term prediction of European currencies with the help of integrated ARMA models, even though the development of exchange rates has a crucial impact on all levels of the economy and its prediction is an important indicator for individual countries, banks, companies and businessmen as well as for investors. The results of this study confirm that to predict the conditional variance and then to estimate the future values of exchange rates, it is adequate to use the ARIMA(1,1,1) model without a constant, or the ARIMA[(1,7),1,(1,7)] model, where in the long term the square root of the conditional variance inclines towards a stable value.
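
    A minimal sketch of fitting the model class used in the study with statsmodels (our own illustration; the file name, column name, and series are placeholders, not the paper's data):

```python
# Illustrative sketch only: fit ARIMA(1,1,1) without a constant to an
# exchange-rate series and produce 12-step-ahead forecasts.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rates = pd.read_csv("eur_czk_monthly.csv", index_col=0, parse_dates=True)["rate"]

model = ARIMA(rates, order=(1, 1, 1), trend="n")   # trend="n": no constant term
fit = model.fit()
print(fit.summary())

forecast = fit.get_forecast(steps=12)
print(forecast.predicted_mean)      # point forecasts
print(forecast.conf_int())          # forecast intervals
```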

  3. Modeling the Volatility of Exchange Rates: GARCH Models

    Directory of Open Access Journals (Sweden)

    Fahima Charef

    2017-03-01

    Full Text Available Modeling the dynamics of exchange rates has long remained a central topic of financial and economic research. In our research we tried to study the relationship between the evolution of exchange rates and macroeconomic fundamentals. Our empirical study is based on a series of exchange rates of the Tunisian dinar against three currencies of major trading partners (dollar, euro, yen) and on fundamentals (the terms of trade, the inflation rate, the interest rate differential), using monthly data from January 2000 to December 2014, for the case of Tunisia. We have adopted models of conditional heteroscedasticity (ARCH, GARCH, EGARCH, TGARCH). The results indicate that there is a partial relationship between the evolution of the Tunisian dinar exchange rates and the macroeconomic variables.
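
    A minimal sketch of fitting one of the models discussed, a GARCH(1,1), to an exchange-rate return series (the use of the arch package, the file name, and the column name are our own assumptions, not specified by the paper):

```python
# Illustrative sketch only: GARCH(1,1) on dinar/euro log-returns.
import numpy as np
import pandas as pd
from arch import arch_model

fx = pd.read_csv("tnd_eur_monthly.csv", index_col=0, parse_dates=True)["rate"]
returns = 100 * np.log(fx).diff().dropna()          # percent log-returns

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
print(res.summary())
print(res.conditional_volatility.tail())            # fitted conditional sigma_t
```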

  4. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of aximoatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  5. Planar N = 4 gauge theory and the Hubbard model

    International Nuclear Information System (INIS)

    Rej, Adam; Serban, Didina; Staudacher, Matthias

    2006-01-01

    Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model

  6. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

    The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should better start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it contains also some unphysical terms linear in the gauge field. Advantageously we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as gauge field propagator. Furthermore we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations showing the expected behavior. The next step is to show renormalizability of the model, and some hints in this direction will also be given. (author) [de

  7. Integrable lambda models and Chern-Simons theories

    International Nuclear Information System (INIS)

    Schmidtt, David M.

    2017-01-01

    In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ, and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  8. The contagious nature of imprisonment: an agent-based model to explain racial disparities in incarceration rates.

    Science.gov (United States)

    Lum, Kristian; Swarup, Samarth; Eubank, Stephen; Hawdon, James

    2014-09-06

    We build an agent-based model of incarceration based on the susceptible-infected-susceptible (SIS) model of infectious disease propagation. Our central hypothesis is that the observed racial disparities in incarceration rates between Black and White Americans can be explained as the result of differential sentencing between the two demographic groups. We demonstrate that if incarceration can be spread through a social influence network, then even relatively small differences in sentencing can result in large disparities in incarceration rates. Controlling for effects of transmissibility, susceptibility and influence network structure, our model reproduces the observed large disparities in incarceration rates given the differences in sentence lengths for White and Black drug offenders in the USA without extensive parameter tuning. We further establish the suitability of the SIS model as applied to incarceration by demonstrating that the observed structural patterns of recidivism are an emergent property of the model. In fact, our model shows a remarkably close correspondence with California incarceration data. This work advances efforts to combine the theories and methods of epidemiology and criminology.
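
    The mechanism described above can be sketched in a few lines. This is a schematic reimplementation under stated assumptions, not the authors' code: the small-world network, transmission probability, baseline rate, and group sentence lengths are all illustrative, and the only difference between the two groups is the mean sentence length.

```python
# Schematic SIS-style model: incarceration "spreads" via incarcerated contacts,
# "recovery" is release after a sentence whose mean length differs by group.
import random

import networkx as nx

def simulate(n=2000, k=6, beta=0.02, base=0.001, sentences=(1.0, 3.0),
             steps=200, seed=1):
    rng = random.Random(seed)
    g = nx.watts_strogatz_graph(n, k, 0.1, seed=seed)
    group = {v: v % 2 for v in g}             # two equal-sized demographic groups
    incarcerated = {v: False for v in g}
    release = {v: 0.0 for v in g}
    history = []
    for t in range(steps):
        for v in g:                           # simple sweep over all agents
            if incarcerated[v]:
                if t >= release[v]:
                    incarcerated[v] = False   # release ("recovery")
            else:
                exposed = sum(incarcerated[u] for u in g[v])
                p = 1 - (1 - base) * (1 - beta) ** exposed
                if rng.random() < p:          # incarceration ("infection")
                    incarcerated[v] = True
                    release[v] = t + rng.expovariate(1.0 / sentences[group[v]])
        history.append(tuple(sum(1 for v in g if incarcerated[v] and group[v] == s)
                             for s in (0, 1)))
    return history

print(simulate()[-1])   # incarcerated counts in each group at the final step
```

    Even with identical network positions and identical transmission parameters, the group with the longer mean sentence spends more time in the "infectious" state, which in this kind of model is what drives the disparity in steady-state incarceration counts.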

  9. Stochastic quantization of field theories on the lattice and supersymmetrical models

    International Nuclear Information System (INIS)

    Aldazabal, Gerardo.

    1984-01-01

    Several aspects of the stochastic quantization method are considered. Specifically, field theories on the lattice and supersymmetrical models are studied. A non-linear sigma model is studied first, and it is shown that it is possible to obtain evolution equations written directly for invariant quantities. These ideas are generalized to obtain Langevin equations for the Wilson loops of the non-abelian lattice gauge theories U(N) and SU(N). In order to write these equations, some different ways of introducing the constraints which the fields must satisfy are discussed. It is natural to have a strong coupling expansion in these equations. The correspondence with quantum field theory is established, and it is noticed that at all orders in perturbation theory, the Langevin equations reduce to the Schwinger-Dyson equations. From another point of view, stochastic quantization is applied to large N matrix models on the lattice. As a result, a simple and systematic way of building reduced models is found. Regarding stochastic quantization in supersymmetric theories, a simple supersymmetric model is studied. It is shown that it is possible to write an evolution equation for the superfield which leads to quantum field theory results in equilibrium. As the Langevin equation preserves supersymmetry, the property of dimensional reduction known for the quantum model is shown to be valid at all times. (M.E.L.) [es

  10. Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment

    Science.gov (United States)

    Marcus, R. A.

    1964-01-01

    In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
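
    For context (a standard result, not taken from the paper itself), the usual activated-complex (transition-state) rate expression whose kinetic-energy assumption is generalized here is

    $$
    k \;=\; \kappa\,\frac{k_B T}{h}\,\frac{Q^{\ddagger}}{Q_{\mathrm{react}}}\,e^{-E_0/k_B T}
    \;=\; \kappa\,\frac{k_B T}{h}\,e^{-\Delta G^{\ddagger}/k_B T},
    $$

    with transmission coefficient κ and partition functions Q. Its textbook derivation treats motion along the reaction coordinate as a separable rectilinear translation, which is precisely the restriction the general kinetic energy expression removes.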

  11. Biomedical Progress Rates as New Parameters for Models of Economic Growth in Developed Countries

    Directory of Open Access Journals (Sweden)

    Alex Zhavoronkov

    2013-11-01

    Full Text Available While the doubling of life expectancy in developed countries during the 20th century can be attributed mostly to decreases in child mortality, the trillions of dollars spent on biomedical research by governments, foundations and corporations over the past sixty years are also yielding longevity dividends in both working and retired population. Biomedical progress will likely increase the healthy productive lifespan and the number of years of government support in the old age. In this paper we introduce several new parameters that can be applied to established models of economic growth: the biomedical progress rate, the rate of clinical adoption and the rate of change in retirement age. The biomedical progress rate is comprised of the rejuvenation rate (extending the productive lifespan) and the non-rejuvenating rate (extending the lifespan beyond the age at which the net contribution to the economy becomes negative). While staying within the neoclassical economics framework and extending the overlapping generations (OLG) growth model and assumptions from the life cycle theory of saving behavior, we provide an example of the relations between these new parameters in the context of demographics, labor, households and the firm.

  12. Biomedical progress rates as new parameters for models of economic growth in developed countries.

    Science.gov (United States)

    Zhavoronkov, Alex; Litovchenko, Maria

    2013-11-08

    While the doubling of life expectancy in developed countries during the 20th century can be attributed mostly to decreases in child mortality, the trillions of dollars spent on biomedical research by governments, foundations and corporations over the past sixty years are also yielding longevity dividends in both working and retired population. Biomedical progress will likely increase the healthy productive lifespan and the number of years of government support in the old age. In this paper we introduce several new parameters that can be applied to established models of economic growth: the biomedical progress rate, the rate of clinical adoption and the rate of change in retirement age. The biomedical progress rate is comprised of the rejuvenation rate (extending the productive lifespan) and the non-rejuvenating rate (extending the lifespan beyond the age at which the net contribution to the economy becomes negative). While staying within the neoclassical economics framework and extending the overlapping generations (OLG) growth model and assumptions from the life cycle theory of saving behavior, we provide an example of the relations between these new parameters in the context of demographics, labor, households and the firm.

  13. Spin foam models of Yang-Mills theory coupled to gravity

    International Nuclear Information System (INIS)

    Mikovic, A

    2003-01-01

    We construct a spin foam model of Yang-Mills theory coupled to gravity by using a discretized path integral of the BF theory with polynomial interactions and the Barrett-Crane ansatz. In the Euclidean gravity case, we obtain a vertex amplitude which is determined by a vertex operator acting on a simple spin network function. The Euclidean gravity results can be straightforwardly extended to the Lorentzian case, so that we propose a Lorentzian spin foam model of Yang-Mills theory coupled to gravity

  14. Working memory: theories, models, and controversies.

    Science.gov (United States)

    Baddeley, Alan

    2012-01-01

    I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.

  15. On Optimizing H. 264/AVC Rate Control by Improving R-D Model and Incorporating HVS Characteristics

    Directory of Open Access Journals (Sweden)

    Jiang Gangyi

    2010-01-01

    Full Text Available The state-of-the-art JVT-G012 rate control algorithm of H.264 is improved from two aspects. First, the quadratic rate-distortion (R-D) model is modified based on both empirical observations and theoretical analysis. Second, based on the existing physiological and psychological research findings of human vision, the rate control algorithm is optimized by incorporating the main characteristics of the human visual system (HVS), such as contrast sensitivity, multichannel theory, and masking effect. Experiments are conducted, and experimental results show that the improved algorithm can simultaneously enhance the overall subjective visual quality and improve the rate control precision effectively.
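
    A minimal sketch of the quadratic R-D model that JVT-G012-style rate control starts from may be useful. The coefficients, the MAD value, and the QP/Qstep mapping are illustrative placeholders, and the HVS-based weighting described in the record is not implemented.

      import math

      # Sketch of the quadratic rate-distortion model used as the starting point of
      # JVT-G012-style rate control: T = c1*MAD/Qstep + c2*MAD/Qstep^2, solved for the
      # quantization step that meets a target bit budget. c1, c2 and MAD are placeholders.

      def qstep_from_target_bits(target_bits, mad, c1=1.0, c2=5.0, header_bits=0.0):
          """Solve c2*mad/q^2 + c1*mad/q = (target_bits - header_bits) for q > 0."""
          budget = max(target_bits - header_bits, 1e-6)
          # Multiply through by q^2: budget*q^2 - c1*mad*q - c2*mad = 0
          a, b, c = budget, -c1 * mad, -c2 * mad
          return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

      def qp_from_qstep(qstep):
          """H.264 maps QP to Qstep roughly as Qstep ~= 2**((QP - 4) / 6)."""
          qp = 4.0 + 6.0 * math.log2(max(qstep, 1e-6))
          return int(min(max(round(qp), 0), 51))

      if __name__ == "__main__":
          q = qstep_from_target_bits(target_bits=3000, mad=2000.0)
          print("Qstep ~", round(q, 3), "-> QP", qp_from_qstep(q))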

  16. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    Full Text Available In this paper, we deploy a complexity theory as the foundation for integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity theory based approach for modeling transitions requires going beyond deterministic frameworks; by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.

  17. Rate Coefficient for the (4)Heμ + CH4 Reaction at 500 K: Comparison between Theory and Experiment.

    Science.gov (United States)

    Arseneau, Donald J; Fleming, Donald G; Li, Yongle; Li, Jun; Suleimanov, Yury V; Guo, Hua

    2016-03-03

    The rate constant for the H atom abstraction reaction from methane by the muonic helium atom, Heμ + CH4 → HeμH + CH3, is reported at 500 K and compared with theory, providing an important test of both the potential energy surface (PES) and reaction rate theory for the prototypical polyatomic CH5 reaction system. The theory used to characterize this reaction includes both variational transition-state (CVT/μOMT) theory (VTST) and ring polymer molecular dynamics (RPMD) calculations on a recently developed PES, which are compared as well with earlier calculations on different PESs for the H, D, and Mu + CH4 reactions, the latter, in particular, providing for a variation in atomic mass by a factor of 36. Though rigorous quantum calculations have been carried out for the H + CH4 reaction, these have not yet been extended to the isotopologues of this reaction (in contrast to H3), so it is important to provide tests of less rigorous theories in comparison with kinetic isotope effects measured by experiment. In this regard, the agreement between the VTST and RPMD calculations and experiment for the rate constant of the Heμ + CH4 reaction at 500 K is excellent, within 10% in both cases, which overlaps with experimental error.

  18. Linking Statistical and Ecological Theory

    NARCIS (Netherlands)

    Harris, Keith; Parsons, Todd L.; Ijaz, Umer Z.; Lahti, Leo; Holmes, Ian; Quince, Christopher

    2017-01-01

    Neutral models which assume ecological equivalence between species provide null models for community assembly. In Hubbell's unified neutral theory of biodiversity (UNTB), many local communities are connected to a single metacommunity through differing immigration rates. Our ability to fit the

  19. Matrix models and stochastic growth in Donaldson-Thomas theory

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)

    2012-10-15

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  20. Matrix models and stochastic growth in Donaldson-Thomas theory

    International Nuclear Information System (INIS)

    Szabo, Richard J.; Tierz, Miguel

    2012-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  1. Soliton excitations in polyacetylene and relativistic field theory models

    International Nuclear Information System (INIS)

    Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM

    1982-01-01

    A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin 1/2 and charge ±e. In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological φ4 and continuum electron-phonon models, and the structure of the fully quantum versus mean field theories. (orig.)

  2. A survey on the modeling and applications of cellular automata theory

    Science.gov (United States)

    Gong, Yimin

    2017-09-01

    Cellular automata theory is a discrete modeling framework now widely used in scientific research and simulation. A model comprises cells that change according to a specific rule over time. This paper provides a survey of the modeling and applications of cellular automata theory, focusing on the program realization of cellular automata and their application in fields such as road traffic, land use, and cutting machines. Each application is further explained, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.
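
    As a concrete illustration of the road-traffic application mentioned in the survey, the sketch below implements elementary rule 184, a standard traffic cellular automaton; it is an example of the technique, not necessarily one of the models analyzed in the record.

      import numpy as np

      # Rule 184 traffic cellular automaton: each occupied cell (a "car") moves one
      # cell to the right if and only if the cell ahead is empty (periodic road).

      def step_rule184(road):
          """One synchronous update of rule 184 (1 = car, 0 = empty)."""
          ahead = np.roll(road, -1)
          behind = np.roll(road, 1)
          moves_out = (road == 1) & (ahead == 0)      # car leaves this cell
          moves_in = (behind == 1) & (road == 0)      # car arrives from behind
          return np.where(moves_out, 0, np.where(moves_in, 1, road))

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          road = (rng.random(40) < 0.4).astype(int)    # roughly 40% car density
          for _ in range(5):
              print("".join(".#"[c] for c in road))
              road = step_rule184(road)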

  3. Applications of the absolute reaction rate theory to biological responses in electric and magnetic fields

    International Nuclear Information System (INIS)

    Brannen, J.P.; Wayland, J.R.

    1976-01-01

    This paper develops a theoretical foundation for the study of biological responses to electric and magnetic fields. The basis of the development is the absolute reaction rate theory and the effects of fields on reaction rates. A simple application to the response of Bacillus subtilis var. niger in a microwave field is made. Potential areas of application are discussed.

  4. Introduction to zeolite theory and modelling

    NARCIS (Netherlands)

    Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.

    2001-01-01

    A review. Some of the recent advances in zeolite theory and modeling are presented. In particular, the current status of computational chemistry in Bronsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the

  5. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

    The goal of this lecture is to present a unifying view of the exactly soluble models. There are several reasons arguing in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering, and some 1+1 dimensional models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relation (FPR) and/or the fundamental commutation relations (FCR) play a fundamental role. A general classification of FPRs is given, with promising generalizations to FCRs.

  6. Integrable lambda models and Chern-Simons theories

    Energy Technology Data Exchange (ETDEWEB)

    Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)

    2017-05-03

    In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  7. A dynamical theory for the Rishon model

    International Nuclear Information System (INIS)

    Harari, H.; Seiberg, N.

    1980-09-01

    We propose a composite model for quarks and leptons based on an exact SU(3)_C × SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C × SU(2)_L × SU(2)_R × U(1)_{B-L} gauge theory emerges at the composite level. The theory is "natural", anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several "technicolor" mechanisms are automatically present. (Author)

  8. Self Modeling: Expanding the Theories of Learning

    Science.gov (United States)

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  9. [The theory of the demographic transition as a reference for demo-economic models].

    Science.gov (United States)

    Genne, M

    1981-01-01

    The aim of the theory of demographic transition (TTD) is to better understand the behavior and interrelationship of economic and demographic variables. There are 2 types of demo-economic models: 1) the Malthusian models, which consider demographic variables as pure exogenous variables, and 2) the neoclassical models, which consider demographic variables as strictly endogenous. While TTD can explore the behavior of exogenous and endogenous demographic variables, it can demonstrate neither the relation nor the order of causality among the various demographic and economic variables; it is simply the theoretical framework of a complex social and economic phenomenon which started in Europe in the 19th century and which today can be extended to developing countries. There are 4 stages in the TTD: the 1st stage is characterized by high levels of fecundity and mortality; the 2nd stage is characterized by high fecundity levels and declining mortality levels; the 3rd stage is characterized by declining fecundity levels and low mortality levels; the 4th stage is characterized by low fertility and mortality levels. The impact of economic variables on mortality and birth rates is evident for mortality rates, which decline earlier and at a greater speed than birth rates. According to reliable mathematical predictions, around the year 1987 mortality rates in developing countries will have reached the low level of European countries, and the growth rate will be only 1.5%. While the validity of demo-economic models has not yet been established, TTD has clearly shown that social and economic development is the factor which influences demographic expansion.

  10. Membrane models and generalized Z2 gauge theories

    International Nuclear Information System (INIS)

    Lowe, M.J.; Wallace, D.J.

    1980-01-01

    We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in a 1/n expansion. The potential relationships of these models to generalized Z2 gauge theories are indicated. (orig.)

  11. Integrability of a family of quantum field theories related to sigma models

    Energy Technology Data Exchange (ETDEWEB)

    Ridout, David [Australian National Univ., Canberra, ACT (Australia). Dept. of Theoretical Physics; DESY, Hamburg (Germany). Theory Group; Teschner, Joerg [DESY, Hamburg (Germany). Theory Group

    2011-03-15

    A method is introduced for constructing lattice discretizations of large classes of integrable quantum field theories. The method proceeds in two steps: The quantum algebraic structure underlying the integrability of the model is determined from the algebra of the interaction terms in the light-cone representation. The representation theory of the relevant quantum algebra is then used to construct the basic ingredients of the quantum inverse scattering method, the lattice Lax matrices and R-matrices. This method is illustrated with four examples: The Sinh-Gordon model, the affine sl(3) Toda model, a model called the fermionic sl(2|1) Toda theory, and the N=2 supersymmetric Sine-Gordon model. These models are all related to sigma models in various ways. The N=2 supersymmetric Sine-Gordon model, in particular, describes the Pohlmeyer reduction of string theory on AdS₂ × S², and is dual to a supersymmetric non-linear sigma model with a sausage-shaped target space. (orig.)

  12. Models and theories of prescribing decisions: A review and suggested a new model

    Science.gov (United States)

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via the exploratory approach rather than theoretical. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist and physician decision to prescribe the drugs. The paper follows an inclusive review approach and applies the previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the ‘persuasion theory – elaboration likelihood model’, the ‘stimuli–response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and ‘social power theory’ in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  13. Models and theories of prescribing decisions: A review and suggested a new model

    Directory of Open Access Journals (Sweden)

    Ali Murshid M

    2017-06-01

    Full Text Available To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via the exploratory approach rather than theoretical. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist and physician decision to prescribe the drugs. The paper follows an inclusive review approach and applies the previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the ‘persuasion theory – elaboration likelihood model’, the ‘stimuli–response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and ‘social power theory’ in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  14. Targeting the Real Exchange Rate; Theory and Evidence

    OpenAIRE

    Carlos A. Végh Gramont; Guillermo Calvo; Carmen Reinhart

    1994-01-01

    This paper presents a theoretical and empirical analysis of policies aimed at setting a more depreciated level of the real exchange rate. An intertemporal optimizing model suggests that, in the absence of changes in fiscal policy, a more depreciated level of the real exchange can only be attained temporarily. This can be achieved by means of higher inflation and/or higher real interest rates, depending on the degree of capital mobility. Evidence for Brazil, Chile, and Colombia supports the mo...

  15. Tracer kinetics: Modelling by partial differential equations of inhomogeneous compartments with age-dependent elimination rates. Pt. 2

    International Nuclear Information System (INIS)

    Winkler, E.

    1991-01-01

    The general theory of inhomogeneous compartments with age-dependent elimination rates is illustrated by examples. Mathematically, it turns out that models consisting of partial differential equations include ordinary, delayed and integro-differential equations, a general fact which is treated here in the context of linear tracer kinetics. The examples include standard compartments as a degenerate case, systems of standard compartments (compartment blocks), models resulting in special residence time distributions, models with pipes, and systems with heterogeneous particles. (orig./BBR)
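
    The sketch below illustrates the basic idea of an age-dependent elimination rate by tracking the residence age of discrete tracer particles; the hazard function used is an arbitrary placeholder, and a constant hazard recovers the standard exponential compartment as a degenerate case, as noted in the record.

      import numpy as np

      # Sketch of an inhomogeneous compartment with an age-dependent elimination
      # rate, simulated by tracking the residence age of discrete tracer particles.
      # k_of_age below is an arbitrary illustration, not a model from the record.

      def k_of_age(age):
          """Elimination rate that grows with residence age (placeholder choice)."""
          return 0.2 + 0.3 * (1.0 - np.exp(-age))

      def simulate(n_particles=100000, dt=0.01, t_max=10.0, seed=0):
          rng = np.random.default_rng(seed)
          ages = np.zeros(n_particles)
          alive = np.ones(n_particles, dtype=bool)
          retained = []
          for step in range(int(t_max / dt)):
              ages[alive] += dt
              hazard = k_of_age(ages[alive]) * dt                  # per-particle death probability
              dying = np.flatnonzero(alive)[rng.random(alive.sum()) < hazard]
              alive[dying] = False
              retained.append(alive.mean())
          return np.array(retained)

      if __name__ == "__main__":
          r = simulate()
          print("retained fraction at t=1, t=5, t=10:",
                round(r[100], 4), round(r[500], 4), round(r[-1], 4))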

  16. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  17. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  18. Glass Durability Modeling, Activated Complex Theory (ACT)

    International Nuclear Information System (INIS)

    CAROL, JANTZEN

    2005-01-01

    The most important requirement for high-level waste glass acceptance for disposal in a geological repository is the chemical durability, expressed as a glass dissolution rate. During the early stages of glass dissolution in near static conditions that represent a repository disposal environment, a gel layer resembling a membrane forms on the glass surface through which ions exchange between the glass and the leachant. The hydrated gel layer exhibits acid/base properties which are manifested as the pH dependence of the thickness and nature of the gel layer. The gel layer has been found to age into either clay mineral assemblages or zeolite mineral assemblages. The formation of one phase preferentially over the other has been experimentally related to changes in the pH of the leachant and related to the relative amounts of Al+3 and Fe+3 in a glass. The formation of clay mineral assemblages on the leached glass surface layers (lower pH and Fe+3 rich glasses) causes the dissolution rate to slow to a long-term steady state rate. The formation of zeolite mineral assemblages (higher pH and Al+3 rich glasses) on leached glass surface layers causes the dissolution rate to increase and return to the initial high forward rate. The return to the forward dissolution rate is undesirable for long-term performance of glass in a disposal environment. An investigation into the role of glass stoichiometry, in terms of the quasi-crystalline mineral species in a glass, has shown that the chemistry and structure in the parent glass appear to control the activated surface complexes that form in the leached layers, and these mineral complexes (some Fe+3 rich and some Al+3 rich) play a role in whether or not clays or zeolites are the dominant species formed on the leached glass surface. The chemistry and structure, in terms of Q distributions of the parent glass, are well represented by the atomic ratios of the glass forming components. Thus, glass dissolution modeling using simple

  19. Interest Rates and Inflation

    OpenAIRE

    Coopersmith, Michael

    2011-01-01

    A relation between interest rates and inflation is presented using a two component economic model and a simple general principle. Preliminary results indicate a remarkable similarity to classical economic theories, in particular that of Wicksell.

  20. Superfield theory and supermatrix model

    International Nuclear Information System (INIS)

    Park, Jeong-Hyuck

    2003-01-01

    We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways, through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product Lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes, respectively, consistent with T-duality. (author)

  1. σ-models and string theories

    International Nuclear Information System (INIS)

    Randjbar-Daemi, S.

    1987-01-01

    The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix

  2. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  3. Integer, fractional, and anomalous quantum Hall effects explained with Eyring's rate process theory and free volume concept.

    Science.gov (United States)

    Hao, Tian

    2017-02-22

    The Hall effects, especially the integer, fractional and anomalous quantum Hall effects, have been addressed using Eyring's rate process theory and the free volume concept. The basic assumptions are that the conduction process is a common rate-controlled "reaction" process that can be described with Eyring's absolute rate process theory, and that the mobility of electrons depends on the free volume available to conduction electrons. The obtained Hall conductivity is clearly quantized, with prefactors related to both the magnetic flux quantum number and the magnetic quantum number via the azimuthal quantum number, with and without an externally applied magnetic field. This article focuses on two-dimensional (2D) systems, but the approaches developed here can be extended to 3D systems.
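
    For orientation, the expressions below are the standard building blocks of the kind this record combines: an Eyring-type rate for the conduction "reaction" and a Cohen-Turnbull-style free-volume factor for the mobility. The paper's final quantized prefactors are not reproduced here.

      % Standard Eyring-type rate and Cohen-Turnbull-style free-volume factor
      % (generic forms, not the paper's final quantized result).
      \[
        k \;=\; \frac{k_{\mathrm B}T}{h}\,
                \exp\!\left(-\frac{\Delta G^{\ddagger}}{k_{\mathrm B}T}\right),
        \qquad
        \mu \;\propto\; \exp\!\left(-\frac{\gamma\, v^{*}}{v_{f}}\right)
      \]
      % \Delta G^{\ddagger}: activation free energy of the rate process;
      % v_f: average free volume per electron; v^*: critical free volume;
      % \gamma: overlap factor of order unity.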

  4. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
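
    A minimal sketch of the linkage the record describes: an equal-variance Gaussian signal detection model for an old/new recognition item in which the discrimination parameter is built IRT-style from a person ability and an item difficulty. The parameterization and values are illustrative, not the authors' exact specification.

      from math import erf, sqrt

      # Sketch of the SD-IRT idea: equal-variance Gaussian SDT for one recognition
      # item, with d' composed IRT-style from ability (theta) and item difficulty (b).

      def normal_cdf(x):
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      def p_yes(theta, b, criterion=0.0, is_old=True):
          """Probability of an 'old' response: targets ~ N(d', 1), lures ~ N(0, 1)."""
          d_prime = theta - b                 # memory discrimination (illustrative form)
          mean = d_prime if is_old else 0.0
          return 1.0 - normal_cdf(criterion - mean)

      if __name__ == "__main__":
          # Hit and false-alarm rates for an able examinee on an easy item.
          print("P(hit)         =", round(p_yes(theta=1.5, b=0.2, is_old=True), 3))
          print("P(false alarm) =", round(p_yes(theta=1.5, b=0.2, is_old=False), 3))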

  5. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.)

  6. Chaos Theory as a Model for Managing Issues and Crises.

    Science.gov (United States)

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  7. Discrete state moduli of string theory from c=1 matrix model

    CERN Document Server

    Dhar, A; Wadia, S R; Dhar, Avinash; Mandal, Gautam; Wadia, Spenta R

    1995-01-01

    We propose a new formulation of the space-time interpretation of the c=1 matrix model. Our formulation uses the well-known leg-pole factor that relates the matrix model amplitudes to those of the 2-dimensional string theory, but includes fluctuations around the Fermi vacuum on both sides of the inverted harmonic oscillator potential of the double-scaled model, even when the fluctuations are small and confined entirely within the asymptotes in the phase plane. We argue that including fluctuations on both sides of the potential is essential for a consistent interpretation of the leg-pole transformed theory as a theory of space-time gravity. We reproduce the known results for the string theory tree level scattering amplitudes for flat space and linear dilaton background as a special case. We show that the generic case corresponds to more general space-time backgrounds. In particular, we identify the parameter corresponding to background metric perturbation in string theory (black hole mass) in terms of the ...

  8. Nonperturbative type IIB model building in the F-theory framework

    Energy Technology Data Exchange (ETDEWEB)

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  9. Nonperturbative type IIB model building in the F-theory framework

    International Nuclear Information System (INIS)

    Jurke, Benjamin Helmut Friedrich

    2011-01-01

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  10. Electroweak vacuum stability in the Higgs-Dilaton theory

    Energy Technology Data Exchange (ETDEWEB)

    Shkerin, A. [Institute of Physics, Ecole Polytechnique Fédérale de Lausanne (EPFL),CH-1015, Lausanne (Switzerland); Institute for Nuclear Research of the Russian Academy of Sciences,60th October Anniversary prospect 7a, 117312, Moscow (Russian Federation)

    2017-05-30

    We study the stability of the Electroweak (EW) vacuum in a scale-invariant extension of the Standard Model and General Relativity, known as a Higgs-Dilaton theory. The safety of the EW vacuum against possible transition towards another vacuum is a necessary condition for the model to be phenomenologically acceptable. We find that, within a wide range of parameters of the theory, the decay rate is significantly suppressed compared to that of the Standard Model. We also discuss properties of a tunneling solution that are specific to the Higgs-Dilaton theory.

  11. Cohomological gauge theory, quiver matrix models and Donaldson-Thomas theory

    NARCIS (Netherlands)

    Cirafici, M.; Sinkovics, A.; Szabo, R.J.

    2009-01-01

    We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques

  12. Applications of generalizability theory and their relations to classical test theory and structural equation modeling.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-03-01

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  14. Phenomenological rate process theory for the storage of atomic H in solid H2*

    International Nuclear Information System (INIS)

    Rosen, G.

    1976-01-01

    A phenomenological rate process theory is developed for the storage and rapid recombination of the atomic hydrogen fuel radical in a crystalline molecular hydrogen solid at temperatures in the range 0.1 K ≤ T ≤ … K. It is shown that such a theory can account quantitatively for the recently observed dependence of the storage time on the storage temperature, for the maximum concentration of trapped H atoms, and for the time duration of the energy release in the tritium decay experiments of Webeler.

  15. Short-run Exchange-Rate Dynamics: Theory and Evidence

    DEFF Research Database (Denmark)

    Carlson, John A.; Dahl, Christian Møller; Osler, Carol L.

    Recent research has revealed a wealth of information about the microeconomics of currency markets and thus the determination of exchange rates at short horizons. This information is valuable to us as scientists since, like evidence of macroeconomic regularities, it can provide critical guidance … of currency markets, it accurately reflects the constraints and objectives faced by the major participants, and it fits key stylized facts concerning returns and order flow. With respect to macroeconomics, the model is consistent with most of the major puzzles that have emerged under floating rates.

  16. Effective potential in Lorentz-breaking field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)

    2017-12-15

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  17. Effective potential in Lorentz-breaking field theory models

    International Nuclear Information System (INIS)

    Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.

    2017-01-01

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  18. Noncommutative gauge theory and symmetry breaking in matrix models

    International Nuclear Information System (INIS)

    Grosse, Harald; Steinacker, Harold; Lizzi, Fedele

    2010-01-01

    We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c × SU(2)_L × U(1)_Q [resp. SU(3)_c × U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.

  19. Off-critical statistical models: factorized scattering theories and bootstrap program

    International Nuclear Information System (INIS)

    Mussardo, G.

    1992-01-01

    We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M_{2,2n+3}, the non-unitary model M_{3,5} and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be explored by the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called truncated conformal space approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form-factor approach.

  20. Dynamics of a Computer Virus Propagation Model with Delays and Graded Infection Rate

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2017-01-01

    Full Text Available A four-compartment computer virus propagation model with two delays and graded infection rate is investigated in this paper. The critical values where a Hopf bifurcation occurs are obtained by analyzing the distribution of eigenvalues of the corresponding characteristic equation. In succession, direction and stability of the Hopf bifurcation when the two delays are not equal are determined by using normal form theory and center manifold theorem. Finally, some numerical simulations are also carried out to justify the obtained theoretical results.
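
    The abstract does not give the model equations, so the sketch below only illustrates how two discrete delays can be handled numerically in a compartment model (Euler integration with a history buffer), using a generic delayed SIRS-style system as a stand-in rather than the record's four-compartment model.

      import numpy as np

      # Generic illustration of integrating a compartment model with two discrete
      # delays (constant history, explicit Euler). NOT the record's model.

      def simulate(beta=0.5, gamma=0.2, xi=0.05, tau1=2.0, tau2=5.0, dt=0.01, t_max=200.0):
          n = int(t_max / dt)
          d1, d2 = int(tau1 / dt), int(tau2 / dt)
          S = np.full(n, 0.9)   # susceptible (values before the loop act as constant history)
          I = np.full(n, 0.1)   # infected
          R = np.zeros(n)       # recovered
          for k in range(max(d1, d2), n - 1):
              infection = beta * S[k] * I[k - d1]   # infection responds with delay tau1
              relapse = xi * R[k - d2]              # loss of immunity with delay tau2
              S[k + 1] = S[k] + dt * (-infection + relapse)
              I[k + 1] = I[k] + dt * (infection - gamma * I[k])
              R[k + 1] = R[k] + dt * (gamma * I[k] - relapse)
          return I

      if __name__ == "__main__":
          I = simulate()
          print("late-time infected fraction (last 5 samples):", np.round(I[-5:], 4))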

  1. The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.

    Science.gov (United States)

    Loving, Cathleen

    The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…

  2. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  3. The Self-Perception Theory vs. a Dynamic Learning Model

    OpenAIRE

    Swank, Otto H.

    2006-01-01

    Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...

  4. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.

    2011-01-01

    In this Letter, we present a new optimal velocity difference model (OVDM) for car-following theory, based on the full velocity difference model (FVDM). The linear stability condition of the new model is obtained by using linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of negative velocity that occurs at a small sensitivity coefficient λ in the FVDM by adjusting the coefficient of the optimal velocity difference, which shows that collisions can disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The effects of the optimal velocity difference can avoid the disadvantage of negative velocity.
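
    A minimal sketch of the full velocity difference (FVD) dynamics that the OVDM builds on is given below; the OVDM's additional optimal-velocity-difference term is not specified in the abstract and is therefore not reproduced, and all parameter values are illustrative.

      import numpy as np

      # Full velocity difference (FVD) car-following dynamics on a periodic road:
      #   dv_i/dt = a * [V(h_i) - v_i] + lam * (v_{i+1} - v_i),
      # with V(h) the optimal velocity function of the headway h.

      def V_opt(h, v_max=2.0, h_c=4.0):
          """Optimal velocity function (a common tanh form)."""
          return 0.5 * v_max * (np.tanh(h - h_c) + np.tanh(h_c))

      def step(x, v, a=1.0, lam=0.5, dt=0.1, road_length=100.0):
          headway = (np.roll(x, -1) - x) % road_length      # distance to the car ahead
          dv_ahead = np.roll(v, -1) - v                     # velocity difference to the car ahead
          acc = a * (V_opt(headway) - v) + lam * dv_ahead
          v_new = np.maximum(v + dt * acc, 0.0)             # no reversing
          x_new = (x + dt * v_new) % road_length
          return x_new, v_new

      if __name__ == "__main__":
          n = 20
          x = np.linspace(0.0, 100.0, n, endpoint=False)
          v = np.full(n, 0.5)
          v[0] = 0.0                                        # small perturbation
          for _ in range(500):
              x, v = step(x, v)
          print("velocity spread after 500 steps:", round(float(v.max() - v.min()), 4))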

  5. The mass effect model of the survival rate's dose effect of organism irradiated with low energy ion beam

    International Nuclear Information System (INIS)

    Shao Chunlin; Gui Qifu; Yu Zengliang

    1995-01-01

    The main characteristic of low energy ion mutation is its mass deposition effect. Based on the theory of 'double strand breaking' and the 'mass deposition effect', the authors suggest that the mass deposition products can repair or further damage the double strand breaks of DNA. According to this consideration, the dose effect model of the survival rate of an organism irradiated by a low energy N+ ion beam is deduced as S = exp{-p[αφ + βφ² - Rφ²exp(-kφ) - Lφ³exp(-kφ)]}, which can be called the 'mass effect model'. In low energy ion beam mutation, the dose effects of many survival rates that cannot be fitted by previous models are successfully fitted by this model. The suitable application fields of the model are also discussed.
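
    The reconstructed dose-response expression can be evaluated directly; the parameter values below are placeholders for illustration, not fitted values from the record.

      import numpy as np

      # Evaluation of the "mass effect" survival expression
      #   S(phi) = exp(-p[alpha*phi + beta*phi^2 - R*phi^2*exp(-k*phi) - L*phi^3*exp(-k*phi)]),
      # where phi is the ion fluence (dose). All parameters are placeholders.

      def survival(phi, p=1.0, alpha=0.8, beta=0.05, R=0.04, L=0.002, k=0.1):
          damage = alpha * phi + beta * phi**2
          mass_term = (R * phi**2 + L * phi**3) * np.exp(-k * phi)
          return np.exp(-p * (damage - mass_term))

      if __name__ == "__main__":
          for phi in (0.0, 1.0, 5.0, 20.0):
              print(f"phi = {phi:5.1f}  ->  S = {survival(phi):.4g}")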

  6. Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.

    Science.gov (United States)

    Bender, Miriam

    2018-01-01

    Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomena. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that they do the work of defining the boundary conditions, the domain(s), of a theory. Further analysis has shown the ways models can be constructed and function independently of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing (and health/care) knowledge and practice. © 2017 John Wiley & Sons Ltd.

  7. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  8. The nearly neutral and selection theories of molecular evolution under the Fisher geometrical framework: substitution rate, population size, and complexity.

    Science.gov (United States)

    Razeto-Barry, Pablo; Díaz, Javier; Vásquez, Rodrigo A

    2012-06-01

    The general theories of molecular evolution depend on relatively arbitrary assumptions about the relative distribution and rate of advantageous, deleterious, neutral, and nearly neutral mutations. The Fisher geometrical model (FGM) has been used to make distributions of mutations biologically interpretable. We explored an FGM-based molecular model to represent molecular evolutionary processes typically studied by nearly neutral and selection models, but in which distributions and relative rates of mutations with different selection coefficients are a consequence of biologically interpretable parameters, such as the average size of the phenotypic effect of mutations and the number of traits (complexity) of organisms. A variant of the FGM-based model that we called the static regime (SR) represents evolution as a nearly neutral process in which substitution rates are determined by a dynamic substitution process in which the population's phenotype remains around a suboptimum equilibrium fitness produced by a balance between slightly deleterious and slightly advantageous compensatory substitutions. As in previous nearly neutral models, the SR predicts a negative relationship between molecular evolutionary rate and population size; however, SR does not have the unrealistic properties of previous nearly neutral models such as the narrow window of selection strengths in which they work. In addition, the SR suggests that compensatory mutations cannot explain the high rate of fixations driven by positive selection currently found in DNA sequences, contrary to what has been previously suggested. We also developed a generalization of SR in which the optimum phenotype can change stochastically due to environmental or physiological shifts, which we called the variable regime (VR). VR models evolution as an interplay between adaptive processes and nearly neutral steady-state processes. When strong environmental fluctuations are incorporated, the process becomes a selection model
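
    The core idea of the Fisher geometrical model used above, that the distribution of selection coefficients follows from the size of phenotypic mutation effects and the number of traits, can be illustrated with a short simulation. The Gaussian fitness function, effect size, trait number and distance from the optimum below are illustrative assumptions, not the paper's parameter choices.

      import numpy as np

      rng = np.random.default_rng(0)
      n_traits, sigma = 20, 0.05        # "complexity" and average phenotypic effect size (assumed)
      z = np.zeros(n_traits)
      z[0] = 0.5                        # current phenotype sits at distance 0.5 from the optimum

      def log_fitness(phenotype):
          """Gaussian fitness peak at the origin, a common FGM convention."""
          return -0.5 * np.sum(phenotype**2, axis=-1)

      mutations = rng.normal(0.0, sigma, size=(100_000, n_traits))
      s = log_fitness(z + mutations) - log_fitness(z)      # selection coefficients of random mutations
      print("fraction advantageous:", round(float(np.mean(s > 0)), 3))
      print("mean effect of deleterious mutations:", round(float(s[s < 0].mean()), 4))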

  9. Aspect-Aware Latent Factor Model: Rating Prediction with Ratings and Reviews

    OpenAIRE

    Cheng, Zhiyong; Ding, Ying; Zhu, Lei; Kankanhalli, Mohan

    2018-01-01

    Although latent factor models (e.g., matrix factorization) achieve good accuracy in rating prediction, they suffer from several problems including cold-start, non-transparency, and suboptimal recommendation for local users or items. In this paper, we employ textual review information with ratings to tackle these limitations. Firstly, we apply a proposed aspect-aware topic model (ATM) on the review text to model user preferences and item features from different aspects, and estimate the aspect...
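
    The record is truncated, but the latent factor baseline it builds on can be sketched. Below is a minimal matrix factorization trained by stochastic gradient descent on synthetic ratings; it is not the aspect-aware model of the paper, and all sizes and hyperparameters are arbitrary.

      import numpy as np

      rng = np.random.default_rng(1)
      n_users, n_items, k = 50, 40, 5
      ratings = [(int(rng.integers(n_users)), int(rng.integers(n_items)), int(rng.integers(1, 6)))
                 for _ in range(500)]

      P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
      Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
      lr, reg = 0.01, 0.05
      for _ in range(30):                           # SGD epochs
          for u, i, r in ratings:
              err = r - P[u] @ Q[i]
              p_u = P[u].copy()
              P[u] += lr * (err * Q[i] - reg * P[u])
              Q[i] += lr * (err * p_u - reg * Q[i])
      u, i, r = ratings[0]
      print("observed rating:", r, "reconstructed:", round(float(P[u] @ Q[i]), 2))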

  10. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptable dynamic system, with complexity science as the method for researching the operation law of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper anticipates the future direction of complexity science in capital markets.

  11. Collective learning modeling based on the kinetic theory of active particles

    Science.gov (United States)

    Burini, D.; De Lillo, S.; Gibelli, L.

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

  12. Behavioral and social sciences theories and models: are they used in unintentional injury prevention research?

    Science.gov (United States)

    Trifiletti, L B; Gielen, A C; Sleet, D A; Hopkins, K

    2005-06-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury prevention research. The authors conducted a systematic review to evaluate the published literature from 1980 to 2001 on behavioral and social science theory applications to unintentional injury prevention and control. Electronic database searches in PubMed and PsycINFO identified articles that combined behavioral and social sciences theories and models and injury causes. The authors identified some articles that examined behavioral and social science theories and models and unintentional injury topics, but found that several important theories have never been applied to unintentional injury prevention. Among the articles identified, the PRECEDE-PROCEED Model was cited most frequently, followed by the Theory of Reasoned Action/Theory of Planned Behavior and the Health Belief Model. When behavioral and social sciences theories and models were applied to unintentional injury topics, they were most frequently used to guide program design and implementation or to develop evaluation measures; few examples of theory testing were found. Results suggest that the use of behavioral and social sciences theories and models in unintentional injury prevention research is only marginally represented in the mainstream, peer-reviewed literature. Both the fields of injury prevention and behavioral and social sciences could benefit from greater collaborative research to enhance behavioral approaches to injury control.

  13. Non-integrable quantum field theories as perturbations of certain integrable models

    International Nuclear Information System (INIS)

    Delfino, G.; Simonetti, P.

    1996-03-01

    We approach the study of non-integrable models of two-dimensional quantum field theory as perturbations of the integrable ones. By exploiting the knowledge of the exact S-matrix and Form Factors of the integrable field theories we obtain the first order corrections to the mass ratios, the vacuum energy density and the S-matrix of the non-integrable theories. As interesting applications of the formalism, we study the scaling region of the Ising model in an external magnetic field at T ∼ T_c and the scaling region around the minimal model M_{2,7}. For these models, a remarkable agreement is observed between the theoretical predictions and the data extracted by a numerical diagonalization of their Hamiltonian. (author). 41 refs, 9 figs, 1 tab

  14. Exchange Rate Fluctuation and the Nigeria Economic Growth

    Directory of Open Access Journals (Sweden)

    Lawal Adedoyin Isola

    2016-11-01

    The aim of this study is to investigate the impact of exchange rate fluctuation on economic growth in Nigeria within the context of four profound theories: purchasing power parity; the monetary model of exchange rates; the portfolio balance approach; and the optimal currency area theory. Data was collected from the CBN statistical bulletin in Nigeria from 2003–2013 and the Autoregressive Distributed Lag (ARDL) model was employed to estimate the model. In the model, real GDP (RGDP) was used as the proxy for economic growth while inflation rate (IF), exchange rate (EXC), interest rate (INT) and money supply (M2) served as proxies for other macroeconomic variables. The empirical results show that exchange rate fluctuation has no effect on economic growth in the long run, though a short run relationship exists between the two. Based on these findings, this paper recommends that the Central Bank, for policy purposes, should ensure that stern foreign exchange control policies are put in place in order to help in the appropriate determination of the value of the exchange rate. This will in the long run help to strengthen the value of the Naira.
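
    For readers unfamiliar with the ARDL setup used above, the toy sketch below estimates an ARDL(1,1)-style regression of GDP on its own lag and on current and lagged exchange rate, and backs out the implied long-run effect. The data are synthetic and the specification is a simplification of the study's model (which also includes inflation, interest rate and money supply).

      import numpy as np

      rng = np.random.default_rng(2)
      T = 120
      exc = np.cumsum(rng.normal(0.0, 1.0, T))          # synthetic exchange rate series
      gdp = np.zeros(T)
      for t in range(1, T):                             # synthetic GDP with persistence and a lagged EXC effect
          gdp[t] = 0.7 * gdp[t - 1] + 0.1 * exc[t - 1] + rng.normal(0.0, 0.5)

      # ARDL(1,1)-style regression: gdp_t on a constant, gdp_{t-1}, exc_t and exc_{t-1}
      y = gdp[1:]
      X = np.column_stack([np.ones(T - 1), gdp[:-1], exc[1:], exc[:-1]])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      long_run = (beta[2] + beta[3]) / (1.0 - beta[1])  # implied long-run multiplier of EXC on GDP
      print("coefficients:", np.round(beta, 3), "long-run effect:", round(float(long_run), 3))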

  15. Hidden Fermi liquid, scattering rate saturation, and Nernst effect: a dynamical mean-field theory perspective.

    Science.gov (United States)

    Xu, Wenhu; Haule, Kristjan; Kotliar, Gabriel

    2013-07-19

    We investigate the transport properties of a correlated metal within dynamical mean-field theory. Canonical Fermi liquid behavior emerges only below a very low temperature scale T_FL. Surprisingly, the quasiparticle scattering rate follows a quadratic temperature dependence up to much higher temperatures and crosses over to saturated behavior around a temperature scale T_sat. We identify these quasiparticles as constituents of the hidden Fermi liquid. The non-Fermi-liquid transport above T_FL, in particular the linear-in-T resistivity, is shown to be a result of a strongly temperature dependent band dispersion. We derive simple expressions for the resistivity, Hall angle, thermoelectric power and Nernst coefficient in terms of a temperature dependent renormalized band structure and the quasiparticle scattering rate. We discuss possible tests of the dynamical mean-field theory picture of transport using ac measurements.

  16. Crossover behavior of the thermal conductance and Kramers’ transition rate theory

    Science.gov (United States)

    Velizhanin, Kirill A.; Sahu, Subin; Chien, Chih-Chun; Dubi, Yonatan; Zwolak, Michael

    2015-12-01

    Kramers’ theory frames chemical reaction rates in solution as reactants overcoming a barrier in the presence of friction and noise. For weak coupling to the solution, the reaction rate is limited by the rate at which the solution can restore equilibrium after a subset of reactants have surmounted the barrier to become products. For strong coupling, there are always sufficiently energetic reactants. However, the solution returns many of the intermediate states back to the reactants before the product fully forms. Here, we demonstrate that the thermal conductance displays an analogous physical response to the friction and noise that drive the heat current through a material or structure. A crossover behavior emerges where the thermal reservoirs dominate the conductance at the extremes and only in the intermediate region are the intrinsic properties of the lattice manifest. Not only does this shed new light on Kramers’ classic turnover problem, this result is significant for the design of devices for thermal management and other applications, as well as the proper simulation of transport at the nanoscale.
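
    Kramers' turnover described above can be reproduced qualitatively in a few lines. The sketch combines the standard spatial-diffusion (moderate-to-strong friction) Kramers rate with a weak-friction branch that grows linearly in the friction, using a crude resistor-like interpolation; the weak-friction prefactor and the interpolation are simplifying assumptions, not the paper's treatment of thermal conductance.

      import numpy as np

      kT, Eb = 1.0, 5.0                 # thermal energy and barrier height (assumed units)
      w0, wb = 1.0, 1.0                 # well and barrier (angular) frequencies (assumed)
      gammas = np.logspace(-3, 3, 13)   # friction spanning weak to strong coupling

      # Moderate-to-strong friction (spatial-diffusion-limited) Kramers rate
      k_strong = (w0 / (2 * np.pi * wb)) * (np.sqrt(wb**2 + gammas**2 / 4) - gammas / 2) * np.exp(-Eb / kT)
      # Weak friction (energy-diffusion-limited) branch, linear in friction; prefactor assumed
      k_weak = gammas * (Eb / kT) * np.exp(-Eb / kT)
      # Crude interpolation exhibiting the turnover between the two limits
      k = 1.0 / (1.0 / k_weak + 1.0 / k_strong)

      for g, rate in zip(gammas, k):
          print(f"gamma = {g:9.3f}   rate = {rate:.3e}")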

  17. Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective

    Directory of Open Access Journals (Sweden)

    Ogan Yigitbasioglu

    2014-11-01

    This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.

  18. A physical probabilistic model to predict failure rates in buried PVC pipelines

    International Nuclear Information System (INIS)

    Davis, P.; Burn, S.; Moglia, M.; Gould, S.

    2007-01-01

    For older water pipeline materials such as cast iron and asbestos cement, future pipe failure rates can be extrapolated from large volumes of existing historical failure data held by water utilities. However, for newer pipeline materials such as polyvinyl chloride (PVC), only limited failure data exists and confident forecasts of future pipe failures cannot be made from historical data alone. To solve this problem, this paper presents a physical probabilistic model, which has been developed to estimate failure rates in buried PVC pipelines as they age. The model assumes that under in-service operating conditions, crack initiation can occur from inherent defects located in the pipe wall. Linear elastic fracture mechanics theory is used to predict the time to brittle fracture for pipes with internal defects subjected to combined internal pressure and soil deflection loading together with through-wall residual stress. To include uncertainty in the failure process, inherent defect size is treated as a stochastic variable, and modelled with an appropriate probability distribution. Microscopic examination of fracture surfaces from field failures in Australian PVC pipes suggests that the 2-parameter Weibull distribution can be applied. Monte Carlo simulation is then used to estimate lifetime probability distributions for pipes with internal defects, subjected to typical operating conditions. As with inherent defect size, the 2-parameter Weibull distribution is shown to be appropriate to model uncertainty in predicted pipe lifetime. The Weibull hazard function for pipe lifetime is then used to estimate the expected failure rate (per pipe length/per year) as a function of pipe age. To validate the model, predicted failure rates are compared to aggregated failure data from 17 UK water utilities obtained from the United Kingdom Water Industry Research (UKWIR) National Mains Failure Database. In the absence of actual operating pressure data in the UKWIR database, typical
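
    The Monte Carlo logic described above can be sketched as follows: sample inherent defect sizes from a 2-parameter Weibull distribution, map each defect to a brittle-fracture lifetime, and read off an empirical failure rate per year as the cohort ages. The lifetime relation and every numerical constant below are illustrative stand-ins, not the paper's calibrated fracture mechanics model.

      import numpy as np

      rng = np.random.default_rng(3)
      n_pipes = 100_000
      defect = rng.weibull(1.8, n_pipes) * 0.4          # inherent defect size (mm), assumed Weibull parameters

      # Toy LEFM-flavoured lifetime: larger inherent defects fail sooner (constants are illustrative)
      K_ref, n_exp, t_scale = 1.0, 4.0, 80.0
      lifetime = t_scale * (K_ref / np.sqrt(np.pi * defect)) ** n_exp   # years to brittle fracture

      ages = np.arange(0, 60)
      surviving = np.array([(lifetime > t).sum() for t in ages])
      failures = surviving[:-1] - surviving[1:]
      hazard = failures / surviving[:-1]                # failures per surviving pipe per year, vs age
      print("failure rate at ages 10, 30, 50 years:", np.round(hazard[[10, 30, 50]], 5))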

  19. Model Uncertainty and Exchange Rate Forecasting

    NARCIS (Netherlands)

    Kouwenberg, R.; Markiewicz, A.; Verhoeks, R.; Zwinkels, R.C.J.

    2017-01-01

    Exchange rate models with uncertain and incomplete information predict that investors focus on a small set of fundamentals that changes frequently over time. We design a model selection rule that captures the current set of fundamentals that best predicts the exchange rate. Out-of-sample tests show

  20. Computing decay rates for new physics theories with FEYNRULES and MADGRAPH 5_AMC@NLO

    Science.gov (United States)

    Alwall, Johan; Duhr, Claude; Fuks, Benjamin; Mattelaer, Olivier; Öztürk, Deniz Gizem; Shen, Chia-Hsien

    2015-12-01

    We present new features of the FEYNRULES and MADGRAPH 5_AMC@NLO programs for the automatic computation of decay widths that consistently include channels of arbitrary final-state multiplicity. The implementations are generic enough so that they can be used in the framework of any quantum field theory, possibly including higher-dimensional operators. We extend at the same time the conventions of the Universal FEYNRULES Output (or UFO) format to include decay tables and information on the total widths. We finally provide a set of representative examples of the usage of the new functions of the different codes in the framework of the Standard Model, the Higgs Effective Field Theory, the Strongly Interacting Light Higgs model and the Minimal Supersymmetric Standard Model and compare the results to available literature and programs for validation purposes.

  1. Soliton excitations in a class of nonlinear field theory models

    International Nuclear Information System (INIS)

    Makhan'kov, V.G.; Fedyanin, V.K.

    1985-01-01

    Investigation results of nonlinear models of the field theory with a lagrangian are described. The theory includes models both with zero stable vacuum epsilon=1 and with condensate epsilon=-1 (of disturbed symmetry). Conditions of existence of particle-like solutions (PLS), stability of these solutions are investigated. Soliton dynamics is studied. PLS formfactors are calculated. Statistical mechanics of solitons is built and their dynamic structure factors are calculated

  2. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model on dark matter physics and collider physics. I proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework of constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied if we can find evidence of the multiverse.

  3. Magnetic flux tube models in superstring theory

    CERN Document Server

    Russo, Jorge G

    1996-01-01

    Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to `Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0=\sqrt{2\a'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 and q > R/2\a' there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR=2n, n=integer). At the special points qR=2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...

  4. Models of Regge behaviour in an asymptotically free theory

    International Nuclear Information System (INIS)

    Polkinghorne, J.C.

    1976-01-01

    Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that phi^3 theory in six dimensions has an essential singularity at l=-1. The extension to gauge theories is discussed. (Auth.)

  5. A model of theory-practice relations in mathematics teacher education

    DEFF Research Database (Denmark)

    Østergaard, Kaj

    2016-01-01

    The paper presents and discusses an ATD-based (Chevallard, 2012) model of theory-practice relations in mathematics teacher education. The notions of didactic transposition and praxeology are combined and concretized in order to form a comprehensive model for analysing the theory-practice problematique. It is illustrated how the model can be used both as a descriptive tool, to analyse interactions between and interviews with student teachers and teachers, and as a normative tool, to design and redesign learning environments in teacher education, in this case a lesson study context.

  6. Towards a Semantic E-Learning Theory by Using a Modelling Approach

    Science.gov (United States)

    Yli-Luoma, Pertti V. J.; Naeve, Ambjorn

    2006-01-01

    In the present study, a semantic perspective on e-learning theory is advanced and a modelling approach is used. This modelling approach towards the new learning theory is based on the four SECI phases of knowledge conversion: Socialisation, Externalisation, Combination and Internalisation, introduced by Nonaka in 1994, and involving two levels of…

  7. Mean field theory for a balanced hypercolumn model of orientation selectivity in primary visual cortex

    CERN Document Server

    Lerchner, A; Hertz, J; Ahmadi, M

    2004-01-01

    We present a complete mean field theory for a balanced state of a simple model of an orientation hypercolumn. The theory is complemented by a description of a numerical procedure for solving the mean-field equations quantitatively. With our treatment, we can determine self-consistently both the firing rates and the firing correlations, without being restricted to specific neuron models. Here, we solve the analytically derived mean-field equations numerically for integrate-and-fire neurons. Several known key properties of orientation selective cortical neurons emerge naturally from the description: Irregular firing with statistics close to -- but not restricted to -- Poisson statistics; an almost linear gain function (firing frequency as a function of stimulus contrast) of the neurons within the network; and a contrast-invariant tuning width of the neuronal firing. We find that the irregularity in firing depends sensitively on synaptic strengths. If Fano factors are bigger than 1, then they are so for all stim...

  8. A comparison of signal detection theory to the objective threshold/strategic model of unconscious perception.

    Science.gov (United States)

    Haase, Steven J; Fisk, Gary D

    2011-08-01

    A key problem in unconscious perception research is ruling out the possibility that weak conscious awareness of stimuli might explain the results. In the present study, signal detection theory was compared with the objective threshold/strategic model as explanations of results for detection and identification sensitivity in a commonly used unconscious perception task. In the task, 64 undergraduate participants detected and identified one of four briefly displayed, visually masked letters. Identification was significantly above baseline (i.e., proportion correct > .25) at the highest detection confidence rating. This result is most consistent with signal detection theory's continuum of sensory states and serves as a possible index of conscious perception. However, there was limited support for the other model in the form of a predicted "looker's inhibition" effect, which produced identification performance that was significantly below baseline. One additional result, an interaction between the target stimulus and type of mask, raised concerns for the generality of unconscious perception effects.
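
    The signal detection theory quantities at issue above are straightforward to compute. The sketch below derives sensitivity d' and criterion c from hypothetical hit and false-alarm counts; the numbers are invented for illustration and are not the study's data.

      from scipy.stats import norm

      hits, misses = 42, 22                  # target-present trials (hypothetical)
      false_alarms, correct_rej = 12, 52     # target-absent trials (hypothetical)

      hit_rate = hits / (hits + misses)
      fa_rate = false_alarms / (false_alarms + correct_rej)
      d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)              # detection sensitivity
      criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # response bias
      print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")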

  9. On a Corporate Bond Pricing Model with Credit Rating Migration Risksand Stochastic Interest Rate

    Directory of Open Access Journals (Sweden)

    Jin Liang

    2017-10-01

    In this paper we study a corporate bond-pricing model with credit rating migration and a stochastic interest rate. The volatility of the bond price in the model strongly depends on potential credit rating migration and stochastic change of the interest rate. This new model improves the previous existing models in which the interest rate is considered to be a constant. The existence, uniqueness and regularity of the solution for the model are established. Moreover, some properties including the smoothness of the free boundary are obtained. Furthermore, some numerical computations are presented to illustrate the theoretical results.

  10. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... by the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum where the fixed momentum fiber Hamiltonians are given by , where denotes total momentum and is the Segal field operator. The fiber Hamiltonians...

  11. 2PI effective action for the SYK model and tensor field theories

    Science.gov (United States)

    Benedetti, Dario; Gurau, Razvan

    2018-05-01

    We discuss the two-particle irreducible (2PI) effective action for the SYK model and for tensor field theories. For the SYK model the 2PI effective action reproduces the bilocal reformulation of the model without using replicas. In general tensor field theories the 2PI formalism is the only way to obtain a bilocal reformulation of the theory, and as such is a precious instrument for the identification of soft modes and for possible holographic interpretations. We compute the 2PI action for several models, and push it up to fourth order in the 1/N expansion for the model proposed by Witten in [1], uncovering a one-loop structure in terms of an auxiliary bilocal action.

  12. Using circuit theory to model connectivity in ecology, evolution, and conservation.

    Science.gov (United States)

    McRae, Brad H; Dickson, Brett G; Keitt, Timothy H; Shah, Viral B

    2008-10-01

    Connectivity among populations and habitats is important for a wide range of ecological processes. Understanding, preserving, and restoring connectivity in complex landscapes requires connectivity models and metrics that are reliable, efficient, and process based. We introduce a new class of ecological connectivity models based in electrical circuit theory. Although they have been applied in other disciplines, circuit-theoretic connectivity models are new to ecology. They offer distinct advantages over common analytic connectivity models, including a theoretical basis in random walk theory and an ability to evaluate contributions of multiple dispersal pathways. Resistance, current, and voltage calculated across graphs or raster grids can be related to ecological processes (such as individual movement and gene flow) that occur across large population networks or landscapes. Efficient algorithms can quickly solve networks with millions of nodes, or landscapes with millions of raster cells. Here we review basic circuit theory, discuss relationships between circuit and random walk theories, and describe applications in ecology, evolution, and conservation. We provide examples of how circuit models can be used to predict movement patterns and fates of random walkers in complex landscapes and to identify important habitat patches and movement corridors for conservation planning.
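
    The circuit-theoretic connectivity metric described above reduces, for a small landscape graph, to the resistance distance computed from the graph Laplacian. The sketch below does this for a toy 5-node habitat network; the conductance values are invented, and real applications (e.g., the authors' Circuitscape software) operate on raster grids with millions of cells.

      import numpy as np

      # Symmetric conductances between 5 habitat nodes (illustrative values)
      A = np.array([[0, 1, 0, 0, 2],
                    [1, 0, 3, 0, 0],
                    [0, 3, 0, 1, 0],
                    [0, 0, 1, 0, 2],
                    [2, 0, 0, 2, 0]], dtype=float)
      L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
      L_pinv = np.linalg.pinv(L)              # Moore-Penrose pseudoinverse

      def effective_resistance(i, j):
          """Resistance distance between nodes i and j; lower means better connected."""
          return L_pinv[i, i] + L_pinv[j, j] - 2.0 * L_pinv[i, j]

      print("R(0, 3) =", round(effective_resistance(0, 3), 3))
      print("R(0, 1) =", round(effective_resistance(0, 1), 3))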

  13. On rate-state and Coulomb failure models

    Science.gov (United States)

    Gomberg, J.; Beeler, N.; Blanpied, M.

    2000-01-01

    We examine the predictions of Coulomb failure stress and rate-state frictional models. We study the change in failure time (clock advance) Δt due to stress step perturbations (i.e., coseismic static stress increases) added to "background" stressing at a constant rate (i.e., tectonic loading) at time t0. The predictability of Δt implies a predictable change in seismicity rate r(t)/r0, testable using earthquake catalogs, where r0 is the constant rate resulting from tectonic stressing. Models of r(t)/r0, consistent with general properties of aftershock sequences, must predict an Omori law seismicity decay rate, a sequence duration that is less than a few percent of the mainshock cycle time and a return directly to the background rate. A Coulomb model requires that a fault remains locked during loading, that failure occur instantaneously, and that Δt is independent of t0. These characteristics imply an instantaneous infinite seismicity rate increase of zero duration. Numerical calculations of r(t)/r0 for different state evolution laws show that aftershocks occur on faults extremely close to failure at the mainshock origin time, that these faults must be "Coulomb-like," and that the slip evolution law can be precluded. Real aftershock population characteristics also may constrain rate-state constitutive parameters; a may be lower than laboratory values, the stiffness may be high, and/or normal stress may be lower than lithostatic. We also compare Coulomb and rate-state models theoretically. Rate-state model fault behavior becomes more Coulomb-like as constitutive parameter a decreases relative to parameter b. This is because the slip initially decelerates, representing an initial healing of fault contacts. The deceleration is more pronounced for smaller a, more closely simulating a locked fault. Even when the rate-state Δt has Coulomb characteristics, its magnitude may differ by some constant dependent on b. In this case, a rate-state model behaves like a modified
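
    The two quantities contrasted above, the Coulomb clock advance and the rate-state seismicity rate change, can be written down compactly. The sketch below uses the simple Coulomb relation Δt = Δτ divided by the stressing rate, and a Dieterich (1994)-style expression for r(t)/r0 after a stress step; the numerical values of the stress step, stressing rate and aσ are assumed for illustration.

      import numpy as np

      tau_dot = 0.001      # background (tectonic) stressing rate, MPa/yr (assumed)
      d_tau = 0.5          # coseismic Coulomb stress step, MPa (assumed)
      a_sigma = 0.05       # rate-state parameter a times normal stress, MPa (assumed)

      # Coulomb failure model: the step simply advances every failure time by the same amount
      print("Coulomb clock advance:", d_tau / tau_dot, "years")

      # Dieterich-style seismicity rate following the step under continued constant stressing
      t_a = a_sigma / tau_dot                       # aftershock duration scale, years
      t = np.array([0.1, 1.0, 10.0, 50.0, 200.0])   # years after the mainshock
      ratio = 1.0 / (1.0 + (np.exp(-d_tau / a_sigma) - 1.0) * np.exp(-t / t_a))
      print("r(t)/r0:", np.round(ratio, 1))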

  14. How robotics programs influence young women's career choices : a grounded theory model

    Science.gov (United States)

    Craig, Cecilia Dosh-Bluhm

    The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.

  15. Determinants of choice of delivery place: Testing rational choice theory and habitus theory.

    Science.gov (United States)

    Broda, Anja; Krüger, Juliane; Schinke, Stephanie; Weber, Andreas

    2018-05-07

    The current study uses two antipodal social science theories, the rational choice theory and the habitus theory, and applies these to describe how women choose between intraclinical (i.e., hospital-run birth clinics) and extraclinical (i.e., midwife-led birth centres or home births) delivery places. Data were collected in a cross-sectional questionnaire-based survey among 189 women. A list of 22 determinants, conceptualized to capture the two theoretical concepts, were rated on a 7-point Likert scale with 1 = unimportant to 7 = very important. The analytic method was structural equation modelling. A model was built, in which the rational choice theory and the habitus theory as latent variables predicted the choice of delivery place. With regards to the choice of delivery place, 89.3% of the women wanted an intraclinical and 10.7% an extraclinical delivery place at the time of their last child's birth. Significant differences between women with a choice of an intraclinical or extraclinical delivery place were found for 14 of the 22 determinants. In the structural equation model, rational choice theory determinants predicted a choice of intraclinical delivery and habitus theory determinants predicted a choice of extraclinical delivery. The two theories had diametrically opposed effects on the choice of delivery place. Women are more likely to decide on intraclinical delivery when arguments such as high medical standards, positive evaluations, or good advanced information are rated important. In contrast, women are more likely to decide on extraclinical delivery when factors such as family atmosphere during birth, friendliness of health care professionals, or consideration of the woman's interests are deemed important. A practical implication of our study is that intraclinical deliveries may be promoted by providing comprehensive information, data and facts on various delivery-related issues, while extraclinical deliveries may be fostered by healthcare

  16. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory (such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio) can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.

  17. A numerical evaluation of prediction accuracy of CO2 absorber model for various reaction rate coefficients

    Directory of Open Access Journals (Sweden)

    Shim S.M.

    2012-01-01

    The performance of a CO2 absorber column using mono-ethanolamine (MEA) solution as chemical solvent is predicted by a one-dimensional (1-D) rate based model in the present study. 1-D mass and heat balance equations of the vapor and liquid phases are coupled with an interfacial mass transfer model and a vapor-liquid equilibrium model. The two-film theory is used to estimate the mass transfer between the vapor and liquid films. Chemical reactions in the MEA-CO2-H2O system are considered to predict the equilibrium pressure of CO2 in the MEA solution. The mathematical and reaction kinetics models used in this work are calculated using an in-house code. The numerical results are validated by comparing the simulation results with experimental and simulation data given in the literature. The performance of the CO2 absorber column is evaluated by the 1-D rate based model using various reaction rate coefficients suggested by various researchers. When the ratio of liquid to gas mass flow rate is about 8.3, 6.6, 4.5 and 3.1, the error of the CO2 loading and the CO2 removal efficiency using the reaction rate coefficients of Aboudheir et al. is within about 4.9% and 5.2%, respectively. Therefore, the reaction rate coefficient suggested by Aboudheir et al., among the various reaction rate coefficients used in this study, is appropriate for predicting the performance of a CO2 absorber column using MEA solution. [Acknowledgement. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education, Science and Technology (2011-0017220).]
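
    The two-film picture used above for interfacial mass transfer can be illustrated with a single flux calculation: the gas-film and reaction-enhanced liquid-film resistances add in series, with the fast CO2-MEA reaction entering through a pseudo-first-order enhancement factor. All property values below are rough, illustrative numbers, not the coefficients compared in the paper.

      import numpy as np

      p_co2, p_eq = 12_000.0, 500.0     # bulk and equilibrium CO2 partial pressures, Pa (assumed)
      k_g = 3.0e-6                      # gas-film coefficient, mol m^-2 s^-1 Pa^-1 (assumed)
      k_l = 1.0e-4                      # liquid-film coefficient, m s^-1 (assumed)
      H = 3.0e3                         # Henry's constant of CO2 in the solution, Pa m^3 mol^-1 (assumed)
      k2, c_mea, D_co2 = 5.9, 5.0e3, 1.4e-9   # rate constant, MEA concentration, CO2 diffusivity (assumed)

      E = np.sqrt(k2 * c_mea * D_co2) / k_l           # pseudo-first-order enhancement (Hatta number)
      K_G = 1.0 / (1.0 / k_g + H / (E * k_l))         # overall gas-phase mass transfer coefficient
      flux = K_G * (p_co2 - p_eq)                     # mol CO2 per m^2 interfacial area per second
      print(f"enhancement factor E = {E:.0f}, CO2 flux = {flux:.2e} mol m^-2 s^-1")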

  18. Collective learning modeling based on the kinetic theory of active particles.

    Science.gov (United States)

    Burini, D; De Lillo, S; Gibelli, L

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. On multiple discount rates

    OpenAIRE

    Chambers, Christopher P.; Echenique, Federico

    2016-01-01

    We propose a theory of intertemporal choice that is robust to specific assumptions on the discount rate. One class of models requires that one utility stream be chosen over another if and only if its discounted value is higher for all discount factors in a set. Another model focuses on an average discount factor. Yet another model is pessimistic, and evaluates a flow by the lowest available discounted value.
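
    The three criteria sketched above (unanimity over a set of discount factors, an average discount factor, and a pessimistic worst-case evaluation) can be compared on a toy example. The streams and the admissible set of discount factors below are made up for illustration.

      import numpy as np

      x = np.array([5.0, 1.0, 1.0, 1.0])      # two hypothetical utility streams
      y = np.array([2.0, 2.0, 2.0, 2.0])
      deltas = np.linspace(0.80, 0.99, 20)    # assumed set of admissible discount factors

      def pv(stream, delta):
          return float(sum(u * delta**t for t, u in enumerate(stream)))

      unanimity = all(pv(x, d) >= pv(y, d) for d in deltas)                 # x preferred for every delta
      average = pv(x, float(deltas.mean())) >= pv(y, float(deltas.mean()))  # average-discount-factor rule
      pessimistic = min(pv(x, d) for d in deltas) >= min(pv(y, d) for d in deltas)  # worst-case values
      print("unanimity:", unanimity, " average:", average, " pessimistic:", pessimistic)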

  20. A Practise-based Theory of SEIDET Smart Community Centre Model

    CSIR Research Space (South Africa)

    Phahlamohlaka, J

    2015-11-01

    Full Text Available , as it is designed using the international studies and theories. This paper presents the design of the smart community centre model. The design is described using Practice Theory concepts towards an empirical study that will be conducted using the General...

  1. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2014-01-01

    This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...

  2. A system-theory-based model for monthly river runoff forecasting: model calibration and optimization

    Directory of Open Access Journals (Sweden)

    Wu Jianhua

    2014-03-01

    River runoff is not only a crucial part of the global water cycle, but it is also an important source for hydropower and an essential element of water balance. This study presents a system-theory-based model for river runoff forecasting taking the Hailiutu River as a case study. The forecasting model, designed for the Hailiutu watershed, was calibrated and verified by long-term precipitation observation data and groundwater exploitation data from the study area. Additionally, frequency analysis, taken as an optimization technique, was applied to improve prediction accuracy. Following model optimization, the overall relative prediction errors are below 10%. The system-theory-based prediction model is applicable to river runoff forecasting, and following optimization by frequency analysis, the prediction error is acceptable.

  3. Perturbation theory instead of large scale shell model calculations

    International Nuclear Information System (INIS)

    Feldmeier, H.; Mankos, P.

    1977-01-01

    Results of large scale shell model calculations for (sd)-shell nuclei are compared with a perturbation theory treatment, which provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation theory treatment in an SU(3) basis including 2ℏω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.) [de

  4. Effective-field-theory model for the fractional quantum Hall effect

    International Nuclear Information System (INIS)

    Zhang, S.C.; Hansson, T.H.; Kivelson, S.

    1989-01-01

    Starting directly from the microscopic Hamiltonian, we derive a field-theory model for the fractional quantum Hall effect. By considering an approximate coarse-grained version of the same model, we construct a Landau-Ginzburg theory similar to that of Girvin. The partition function of the model exhibits cusps as a function of density and the Hall conductance is quantized at filling factors ν = 1/(2k-1) with k an arbitrary integer. At these fractions the ground state is incompressible, and the quasiparticles and quasiholes have fractional charge and obey fractional statistics. Finally, we show that the collective density fluctuations are massive.

  5. Models for Theory-Based M.A. and Ph.D. Programs.

    Science.gov (United States)

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  6. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  7. Basic scattering theory

    International Nuclear Information System (INIS)

    Queen, N.M.

    1978-01-01

    This series of lectures on basic scattering theory was given as part of a course for postgraduate high energy physicists and was designed to acquaint the student with some of the basic language and formalism used for the phenomenological description of nuclear reactions and decay processes in the study of elementary particle interactions. Well established and model independent aspects of scattering theory, which are the basis of S-matrix theory, are considered. The subject is considered under the following headings: the S-matrix, cross sections and decay rates, phase space, relativistic kinematics, the Mandelstam variables, the flux factor, two-body phase space, Dalitz plots, other kinematic plots, two-particle reactions, unitarity, the partial-wave expansion, resonances (single-channel case), multi-channel resonances, analyticity and crossing, dispersion relations, the one-particle exchange model, the density matrix, mathematical properties of the density matrix, the density matrix in scattering processes, the density matrix in decay processes, and the helicity formalism. Some exercises for the students are included. (U.K.)
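
    Several of the kinematic ingredients listed above (the Mandelstam variables in particular) lend themselves to a quick numerical check. The sketch below computes s, t and u for an invented elastic 2 -> 2 configuration and verifies the identity s + t + u = sum of the squared masses.

      import numpy as np

      METRIC = np.diag([1.0, -1.0, -1.0, -1.0])

      def m2(p):
          """Minkowski square p.p with signature (+,-,-,-)."""
          return float(p @ METRIC @ p)

      # Invented 2 -> 2 kinematics in the centre-of-mass frame, all masses near 1 (units of GeV)
      p1 = np.array([5.0, 0.0, 0.0,  4.899])
      p2 = np.array([5.0, 0.0, 0.0, -4.899])
      p3 = np.array([5.0, 3.0, 0.0,  3.873])
      p4 = np.array([5.0, -3.0, 0.0, -3.873])

      s, t, u = m2(p1 + p2), m2(p1 - p3), m2(p1 - p4)
      mass_sum = sum(m2(p) for p in (p1, p2, p3, p4))
      print(f"s = {s:.2f}, t = {t:.2f}, u = {u:.2f}, s+t+u = {s + t + u:.2f}, sum of m^2 = {mass_sum:.2f}")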

  8. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source of business knowledge management. The last part of the paper presents the author's proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy this paper provides new ideas for clustering their concepts.

  9. Theory to practice: the humanbecoming leading-following model.

    Science.gov (United States)

    Ursel, Karen L

    2015-01-01

    Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory to practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift to shift reporting. Nurse students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.

  10. Agent-based models for higher-order theory of mind

    NARCIS (Netherlands)

    de Weerd, Harmen; Verbrugge, Rineke; Verheij, Bart; Kamiński, Bogumił; Koloch, Grzegorz

    2014-01-01

    Agent-based models are a powerful tool for explaining the emergence of social phenomena in a society. In such models, individual agents typically have little cognitive ability. In this paper, we model agents with the cognitive ability to make use of theory of mind. People use this ability to reason

  11. A critical assessment of theories/models used in health communication for HIV/AIDS.

    Science.gov (United States)

    Airhihenbuwa, C O; Obregon, R

    2000-01-01

    Most theories and models used to develop human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) communication are based on social psychology that emphasizes individualism. Researchers including communication and health scholars are now questioning the presumed global relevance of these models and thus the need to develop innovative theories and models that take into account regional contexts. In this paper, we discuss the commonly used theories and models in HIV/AIDS communication. Furthermore, we argue that the flaws in the application of the commonly used "classical" models in health communication are because of contextual differences in locations where these models are applied. That is to say that these theories and models are being applied in contexts for which they were not designed. For example, the differences in health behaviors are often the function of culture. Therefore, culture should be viewed for its strength and not always as a barrier. The metaphorical coupling of "culture" and "barrier" needs to be exposed, deconstructed, and reconstructed so that new, positive, cultural linkages can be forged. The HIV/AIDS pandemic has served as a flashpoint to either highlight the importance or deny the relevance of theories and models while at the same time addressing the importance of culture in the development and implementation of communication programs.

  12. Scaling theory of depinning in the Sneppen model

    International Nuclear Information System (INIS)

    Maslov, S.; Paczuski, M.

    1994-01-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a 'gap' equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥(d) and ν_⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor.

  13. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  14. A practitioner's guide to persuasion: an overview of 15 selected persuasion theories, models and frameworks.

    Science.gov (United States)

    Cameron, Kenzie A

    2009-03-01

    To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.

  15. Prospects for advanced RF theory and modeling

    International Nuclear Information System (INIS)

    Batchelor, D. B.

    1999-01-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics

  16. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    International Nuclear Information System (INIS)

    Saraswati, Teguh Endah; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory. (paper)

  17. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    Science.gov (United States)

    Endah Saraswati, Teguh; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory.

  18. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping-using random field theory-reported in (Eklund et al. []: arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions-and random field theory-in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al. []: arXiv 1511.01863) for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  19. Phase Structure Of Fuzzy Field Theories And Multitrace Matrix Models

    International Nuclear Information System (INIS)

    Tekel, J.

    2015-01-01

    We review the interplay of fuzzy field theories and matrix models, with an emphasis on the phase structure of fuzzy scalar field theories. We give a self-contained introduction to these topics and give the details concerning the saddle point approach for the usual single trace and multitrace matrix models. We then review the attempts to explain the phase structure of the fuzzy field theory using a corresponding random matrix ensemble, showing the strengths and weaknesses of this approach. We conclude with a list of challenges one needs to overcome and the most interesting open problems one can try to solve. (author)

  20. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  1. A note on minimum-variance theory and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [Department of Informatics, Sussex University, Brighton, BN1 9QH (United Kingdom)]; Tartaglia, Giangaetano [Physics Department, Rome University 'La Sapienza', Rome 00185 (Italy)]; Tirozzi, Brunello [Physics Department, Rome University 'La Sapienza', Rome 00185 (Italy)]

    2004-04-30

    We revisit the minimum-variance theory proposed by Harris and Wolpert (1998 Nature 394 780-4), discuss the implications of the theory on modelling the firing patterns of single neurons and analytically find the optimal control signals, trajectories and velocities. Under the rate coding assumption, input control signals employed in the minimum-variance theory should be Fitts processes rather than Poisson processes. Only if information is coded by interspike intervals are Poisson processes in agreement with the inputs employed in the minimum-variance theory. For the integrate-and-fire model with Fitts process inputs, interspike intervals of efferent spike trains are very irregular. We introduce diffusion approximations to approximate neural models with renewal process inputs and present theoretical results on calculating moments of interspike intervals of the integrate-and-fire model. Results in Feng et al (2002 J. Phys. A: Math. Gen. 35 7287-304) are generalized. In conclusion, we present a complete picture of the minimum-variance theory ranging from input control signals, to model outputs, and to its implications on modelling firing patterns of single neurons.

  2. A note on minimum-variance theory and beyond

    International Nuclear Information System (INIS)

    Feng Jianfeng; Tartaglia, Giangaetano; Tirozzi, Brunello

    2004-01-01

    We revisit the minimum-variance theory proposed by Harris and Wolpert (1998 Nature 394 780-4), discuss the implications of the theory on modelling the firing patterns of single neurons and analytically find the optimal control signals, trajectories and velocities. Under the rate coding assumption, input control signals employed in the minimum-variance theory should be Fitts processes rather than Poisson processes. Only if information is coded by interspike intervals are Poisson processes in agreement with the inputs employed in the minimum-variance theory. For the integrate-and-fire model with Fitts process inputs, interspike intervals of efferent spike trains are very irregular. We introduce diffusion approximations to approximate neural models with renewal process inputs and present theoretical results on calculating moments of interspike intervals of the integrate-and-fire model. Results in Feng et al (2002 J. Phys. A: Math. Gen. 35 7287-304) are generalized. In conclusion, we present a complete picture of the minimum-variance theory ranging from input control signals, to model outputs, and to its implications on modelling firing patterns of single neurons.

  3. Linear radial pulsation theory. Lecture 5

    International Nuclear Information System (INIS)

    Cox, A.N.

    1983-01-01

    We describe a method for getting an equilibrium stellar envelope model using as input the total mass, the envelope mass, the surface effective temperature, the total surface luminosity, and the composition of the envelope. Then with the structure of the envelope model known, we present a method for obtaining the radial pulsation periods and growth rates for low order modes. The large amplitude pulsations observed for the yellow and red giants and supergiants are always these radial modes, but for the stars nearer the main sequence, as for all of our stars and for the white dwarfs, there frequently are nonradial modes occurring also. Application of linear radial pulsation theory is made to the giant star sigma Scuti variables, while the linear nonradial theory will be used for the B stars in later lectures.

  4. Johnson-Laird's mental models theory and its principles: an application with cell mental models of high school students

    OpenAIRE

    Mª Luz Rodríguez Palmero; Javier Marrero Acosta; Marco Antonio Moreira

    2001-01-01

    Following a discussion of Johnson-Laird's mental models theory, we report a study regarding high school students' mental representations of the cell, understood as mental models. Research findings suggest the appropriateness of such a theory as a framework to interpret students' representations.

  5. Causal Agency Theory: Reconceptualizing a Functional Model of Self-Determination

    Science.gov (United States)

    Shogren, Karrie A.; Wehmeyer, Michael L.; Palmer, Susan B.; Forber-Pratt, Anjali J.; Little, Todd J.; Lopez, Shane

    2015-01-01

    This paper introduces Causal Agency Theory, an extension of the functional model of self-determination. Causal Agency Theory addresses the need for interventions and assessments pertaining to self-determination for all students and incorporates the significant advances in understanding of disability and in the field of positive psychology since the…

  6. A general-model-space diagrammatic perturbation theory

    International Nuclear Information System (INIS)

    Hose, G.; Kaldor, U.

    1980-01-01

    A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He_2 excited Σ_g^+ states) is demonstrated. (Auth.)

  7. Plane symmetric cosmological micro model in modified theory of Einstein’s general relativity

    Directory of Open Access Journals (Sweden)

    Panigrahi U.K.

    2003-01-01

    Full Text Available In this paper, we have investigated an anisotropic homogeneous plane symmetric cosmological micro-model in the presence of a massless scalar field in a modified theory of Einstein's general relativity. Some interesting physical and geometrical aspects of the model together with singularity in the model are discussed. Further, it is shown that this theory is valid and leads to Einstein's theory as the coupling parameter λ → 0 at the micro (i.e. quantum) level in general.

  8. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers. Due to the lack of timeliness and comprehensiveness, the provided information cannot satisfy the travellers’ needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, a route choice model based on Game Theory is proposed in this paper to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated considering Nash Equilibrium by solving the route choice game. Simulations and experimental analysis show that the proposed model can describe the commuters’ routine route choice decision exactly and that the provided route is reliable.
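
    A minimal illustration of the game-theoretic step described above, assuming a toy two-route normal-form game with made-up payoffs; the paper's actual game, built from the precision of predicted information and familiarity with traffic conditions, is not reproduced here. The sketch simply enumerates strategy profiles and keeps those from which no player can profitably deviate, i.e. the pure-strategy Nash equilibria.

    import itertools

    # Hypothetical payoff table for a two-route game: each "player" stands for an
    # alternative route, and the payoff numbers are made up for illustration only.
    payoffs = {
        # (strategy of route A, strategy of route B): (payoff A, payoff B)
        ("fast", "fast"): (2, 2),
        ("fast", "safe"): (4, 1),
        ("safe", "fast"): (1, 4),
        ("safe", "safe"): (3, 3),
    }
    strategies = ["fast", "safe"]

    def pure_nash_equilibria(payoffs, strategies):
        """Return all pure-strategy profiles where no player gains by deviating."""
        equilibria = []
        for a, b in itertools.product(strategies, repeat=2):
            pa, pb = payoffs[(a, b)]
            best_a = all(payoffs[(alt, b)][0] <= pa for alt in strategies)
            best_b = all(payoffs[(a, alt)][1] <= pb for alt in strategies)
            if best_a and best_b:
                equilibria.append((a, b))
        return equilibria

    print(pure_nash_equilibria(payoffs, strategies))  # -> [('fast', 'fast')]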

  9. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  10. Characteristics of highly rated leadership in nursing homes using item response theory.

    Science.gov (United States)

    Backman, Annica; Sjögren, Karin; Lindkvist, Marie; Lövheim, Hugo; Edvardsson, David

    2017-12-01

    To identify characteristics of highly rated leadership in nursing homes. An ageing population entails fundamental social, economic and organizational challenges for future aged care. Knowledge is limited of both specific leadership behaviours and organizational and managerial characteristics which have an impact on the leadership of contemporary nursing home care. Cross-sectional. From 290 municipalities, 60 were randomly selected and 35 agreed to participate, providing a sample of 3605 direct-care staff employed in 169 Swedish nursing homes. The staff assessed their managers' (n = 191) leadership behaviours using the Leadership Behaviour Questionnaire. Data were collected from November 2013 - September 2014, and the study was completed in November 2016. A two-parameter item response theory approach and regression analyses were used to identify specific characteristics of highly rated leadership. Five specific behaviours of highly rated nursing home leadership were identified; that the manager: experiments with new ideas; controls work closely; relies on subordinates; coaches and gives direct feedback; and handles conflicts constructively. The regression analyses revealed that managers with social work backgrounds and privately run homes were significantly associated with higher leadership ratings. This study highlights the five most important leadership behaviours that characterize those nursing home managers rated highest in terms of leadership. Managers in privately run nursing homes and managers with social work backgrounds were associated with higher leadership ratings. Further work is needed to explore these behaviours and factors predictive of higher leadership ratings. © 2017 John Wiley & Sons Ltd.
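
    For orientation, the two-parameter logistic item response function that such an analysis typically rests on is sketched below in its standard textbook form; the symbols (a_i for item discrimination, b_i for item location) are the conventional ones, and the study's exact parameterization may differ.

    \[
    P_i(\theta) \;=\; \frac{1}{1 + \exp\{-a_i(\theta - b_i)\}}
    \]

    In general, items with large discrimination a_i over the trait range of interest carry the most information, which is what allows specific behaviours to be singled out as most characteristic of high levels of the rated trait.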

  11. Towards the hot sphaleron rate and sizable CP violation in the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Canseco, Andres

    2009-10-14

    In this work we study two aspects of the Standard Model related to baryogenesis at the electroweak scale. The first deals with CP violation. For some time now, it has been thought that CP violation within the Standard Model was too weak to be able to produce the baryon asymmetry of the universe. The argument is based on the small value of the Jarlskog determinant, ~10^-19, but the latter is a perturbative calculation and CP violation in experiments can be much larger, e.g. in the Kaon system of order 10^-3. With the use of the worldline method, we derive a one-loop effective action by integrating out the fermions in the next-to-leading order of a gradient expansion. The CP violation, previously present in the fermion sector, manifests as CP violating operators in the effective action. By treating the fermion masses non-perturbatively, albeit with their derivatives treated perturbatively as befits a gradient expansion, we find the operators not to be suppressed by the Jarlskog determinant, but by the Jarlskog invariant, which is of order 10^-5. The second part of this work deals with the infrared analysis of Boedeker's effective theory, which encodes the dynamics of weakly coupled, non-abelian gauge fields at high temperature with characteristic momentum scale of order |k| ~ g^2 T. The motivation for this is the eventual analytic calculation of the hot sphaleron rate, which is directly proportional to the rate of baryon number violation in the symmetric phase. After transcribing Boedeker's effective theory from a Langevin equation into an Euclidean path integral, we derive Dyson-Schwinger equations. We introduce an ansatz intended to solve the infrared dominated equations, and find the expected enhanced gauge propagator. An analogous role to the ghost propagator in Yang-Mills theory is played by the mixed propagator, which is suppressed. (orig.)

  12. Quantum integrable models of field theory

    International Nuclear Information System (INIS)

    Faddeev, L.D.

    1979-01-01

    Fundamental features of the classical method of the inverse problem have been formulated in the form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are examined in short. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be placed, to an extent, within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives of applying the inverse problem method are shown.

  13. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.

  14. A conceptual framework for organismal biology: linking theories, models, and data.

    Science.gov (United States)

    Zamer, William E; Scheiner, Samuel M

    2014-11-01

    Implicit or subconscious theory is especially common in the biological sciences. Yet, theory plays a variety of roles in scientific inquiry. First and foremost, it determines what does and does not count as a valid or interesting question or line of inquiry. Second, theory determines the background assumptions within which inquiries are pursued. Third, theory provides linkages among disciplines. For these reasons, it is important and useful to develop explicit theories for biology. A general theory of organisms is developed, which includes 10 fundamental principles that apply to all organisms, and 6 that apply to multicellular organisms only. The value of a general theory comes from its utility to help guide the development of more specific theories and models. That process is demonstrated by examining two domains: ecoimmunology and development. For the former, a constitutive theory of ecoimmunology is presented, and used to develop a specific model that explains energetic trade-offs that may result from an immunological response of a host to a pathogen. For the latter, some of the issues involved in trying to devise a constitutive theory that covers all of development are explored, and a more narrow theory of phenotypic novelty is presented. By its very nature, little of a theory of organisms will be new. Rather, the theory presented here is a formal expression of nearly two centuries of conceptual advances and practice in research. Any theory is dynamic and subject to debate and change. Such debate will occur as part of the present, initial formulation, as the ideas presented here are refined. The very process of debating the form of the theory acts to clarify thinking. The overarching goal is to stimulate debate about the role of theory in the study of organisms, and thereby advance our understanding of them. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology 2014. This work is written by US Government employees

  15. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  16. Generalization of the Activated Complex Theory of Reaction Rates. I. Quantum Mechanical Treatment

    Science.gov (United States)

    Marcus, R. A.

    1964-01-01

    In its usual form activated complex theory assumes a quasi-equilibrium between reactants and activated complex, a separable reaction coordinate, a Cartesian reaction coordinate, and an absence of interaction of rotation with internal motion in the complex. In the present paper a rate expression is derived without introducing the Cartesian assumption. The expression bears a formal resemblance to the usual one and reduces to it when the added assumptions of the latter are introduced.
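
    For context, the conventional activated complex (transition-state) theory rate expression for a bimolecular reaction A + B is sketched below in its standard textbook form, where κ is a transmission coefficient, the Q are partition functions, and E_0 is the barrier height; the generalized expression derived in the paper relaxes the Cartesian reaction-coordinate assumption and is not reproduced here.

    \[
    k \;=\; \kappa\,\frac{k_B T}{h}\,\frac{Q^{\ddagger}}{Q_A\,Q_B}\,
            \exp\!\left(-\frac{E_0}{k_B T}\right)
    \]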

  17. Group theory for unified model building

    International Nuclear Information System (INIS)

    Slansky, R.

    1981-01-01

    The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)^w x U(1)^w x SU(3)^c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E6, SO10, SU6, F4, SO9, SO5, SO8, SO7, SU4, E7, E8, SU8, SO14, SO18, SO22, and, for completeness, SU3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)

  18. Adult Attachment Ratings (AAR): an item response theory analysis.

    Science.gov (United States)

    Pilkonis, Paul A; Kim, Yookyung; Yu, Lan; Morse, Jennifer Q

    2014-01-01

    The Adult Attachment Ratings (AAR) include 3 scales for anxious, ambivalent attachment (excessive dependency, interpersonal ambivalence, and compulsive care-giving), 3 for avoidant attachment (rigid self-control, defensive separation, and emotional detachment), and 1 for secure attachment. The scales include items (ranging from 6-16 in their original form) scored by raters using a 3-point format (0 = absent, 1 = present, and 2 = strongly present) and summed to produce a total score. Item response theory (IRT) analyses were conducted with data from 414 participants recruited from psychiatric outpatient, medical, and community settings to identify the most informative items from each scale. The IRT results allowed us to shorten the scales to 5-item versions that are more precise and easier to rate because of their brevity. In general, the effective range of measurement for the scales was 0 to +2 SDs for each of the attachment constructs; that is, from average to high levels of attachment problems. Evidence for convergent and discriminant validity of the scales was investigated by comparing them with the Experiences of Close Relationships-Revised (ECR-R) scale and the Kobak Attachment Q-sort. The best consensus among self-reports on the ECR-R, informant ratings on the ECR-R, and expert judgments on the Q-sort and the AAR emerged for anxious, ambivalent attachment. Given the good psychometric characteristics of the scale for secure attachment, however, this measure alone might provide a simple alternative to more elaborate procedures for some measurement purposes. Conversion tables are provided for the 7 scales to facilitate transformation from raw scores to IRT-calibrated (theta) scores.

  19. A note on the theory of fast money flow dynamics

    Science.gov (United States)

    Sokolov, A.; Kieu, T.; Melatos, A.

    2010-08-01

    The gauge theory of arbitrage was introduced by Ilinski in [K. Ilinski, preprint arXiv:hep-th/9710148 (1997)] and applied to fast money flows in [A. Ilinskaia, K. Ilinski, preprint arXiv:cond-mat/9902044 (1999); K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)]. The theory of fast money flow dynamics attempts to model the evolution of currency exchange rates and stock prices on short, e.g. intra-day, time scales. It has been used to explain some of the heuristic trading rules, known as technical analysis, that are used by professional traders in the equity and foreign exchange markets. A critique of some of the underlying assumptions of the gauge theory of arbitrage was presented by Sornette in [D. Sornette, Int. J. Mod. Phys. C 9, 505 (1998)]. In this paper, we present a critique of the theory of fast money flow dynamics, which was not examined by Sornette. We demonstrate that the choice of the input parameters used in [K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)] results in sinusoidal oscillations of the exchange rate, in conflict with the results presented in [K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)]. We also find that the dynamics predicted by the theory are generally unstable in most realistic situations, with the exchange rate tending to zero or infinity exponentially.

  20. δ expansion for local gauge theories. I. A one-dimensional model

    International Nuclear Information System (INIS)

    Bender, C.M.; Cooper, F.; Milton, K.A.; Moshe, M.; Pinsky, S.S.; Simmons, L.M. Jr.

    1992-01-01

    The principles of the δ perturbation theory were first proposed in the context of self-interacting scalar quantum field theory. There it was shown how to expand a (φ^2)^(1+δ) theory as a series in powers of δ and how to recover nonperturbative information about a φ^4 field theory from the δ expansion at δ=1. The purpose of this series of papers is to extend the notions of δ perturbation theory from boson theories to theories having a local gauge symmetry. In the case of quantum electrodynamics one introduces the parameter δ by generalizing the minimal coupling terms to ψ̄(∂-ieA)^δ ψ and expanding in powers of δ. This interaction preserves local gauge invariance for all δ. While there are enormous benefits in using the δ expansion (obtaining nonperturbative results), gauge theories present new technical difficulties not encountered in self-interacting boson theories because the expression (∂-ieA)^δ contains a derivative operator. In the first paper of this series a one-dimensional model whose interaction term has the form ψ̄[d/dt-igφ(t)]^δ ψ is considered. The virtue of this model is that it provides a laboratory in which to study fractional powers of derivative operators without the added complexity of γ matrices. In the next paper of this series we consider two-dimensional electrodynamics and show how to calculate the anomaly in the δ expansion.

  1. Further Results on Dynamic Additive Hazard Rate Model

    Directory of Open Access Journals (Sweden)

    Zhengcheng Zhang

    2014-01-01

    Full Text Available In the past, the proportional and additive hazard rate models have been investigated in the literature. Nanda and Das (2011) introduced and studied the dynamic proportional (reversed) hazard rate model. In this paper we study the dynamic additive hazard rate model, and investigate its aging properties for different aging classes. The closure of the model under some stochastic orders has also been investigated. Some examples are also given to illustrate different aging properties and stochastic comparisons of the model.
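
    As a point of reference, a minimal sketch of the static additive hazard rate model that the dynamic version builds on, assuming the usual convention of adding a nonnegative constant to the baseline hazard; the paper's dynamic formulation is not reproduced here.

    \[
    \lambda_Y(t) \;=\; \lambda_X(t) + \delta, \qquad \delta \ge 0,
    \qquad\text{so that}\qquad
    \bar F_Y(t) \;=\; \bar F_X(t)\, e^{-\delta t}
    \]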

  2. Matrix models from localization of five-dimensional supersymmetric noncommutative U(1) gauge theory

    International Nuclear Information System (INIS)

    Lee, Bum-Hoon; Ro, Daeho; Yang, Hyun Seok

    2017-01-01

    We study localization of five-dimensional supersymmetric U(1) gauge theory on S^3 × ℝ^2_θ, where ℝ^2_θ is a noncommutative (NC) plane. The theory can be isomorphically mapped to three-dimensional supersymmetric U(N→∞) gauge theory on S^3 using the matrix representation on a separable Hilbert space on which NC fields linearly act. Therefore the NC space ℝ^2_θ allows for a flexible path to derive matrix models via localization from a higher-dimensional supersymmetric NC U(1) gauge theory. The result shows a rich duality between NC U(1) gauge theories and large N matrix models in various dimensions.

  3. Toward a General Theory for Multiphase Turbulence Part I: Development and Gauging of the Model Equations

    Energy Technology Data Exchange (ETDEWEB)

    B. A. Kashiwa; W. B. VanderHeyden

    2000-12-01

    A formalism for developing multiphase turbulence models is introduced by analogy to the phenomenological method used for single-phase turbulence. A sample model developed using the formalism is given in detail. The procedure begins with ensemble averaging of the exact conservation equations, with closure accomplished by using a combination of analytical and experimental results from the literature. The resulting model is applicable to a wide range of common multiphase flows including gas-solid, liquid-solid and gas-liquid (bubbly) flows. The model is positioned for ready extension to three-phase turbulence, or for use in two-phase turbulence in which one phase is accounted for in multiple size classes, representing polydispersivity. The formalism is expected to suggest directions toward a more fundamentally based theory, similar to the way that early work in single-phase turbulence has led to the spectral theory. The approach is unique in that a portion of the total energy decay rate is ascribed to each phase, as is dictated by the exact averaged equations, and results in a transport equation for the energy decay rate associated with each phase. What follows is a straightforward definition of a turbulent viscosity for each phase, which accounts for the effect of exchange of fluctuational energy among phases on the turbulent shear viscosity. The model also accounts for the effect of slip momentum transfer among the phases on the production of turbulence kinetic energy and on the tensor character of the Reynolds stress. Collisional effects, when appropriate, are included by superposition. The model reduces to a standard form in the limit of a single, pure material, and is expected to do a credible job of describing multiphase turbulent flows in a wide variety of regimes using a single set of coefficients.

  4. Realization of a scenario with two relaxation rates in the Hubbard Falicov-Kimball model

    Science.gov (United States)

    Barman, H.; Laad, M. S.; Hassan, S. R.

    2018-02-01

    A single transport relaxation rate governs the decay of both longitudinal and Hall currents in Landau Fermi liquids (FL). Breakdown of this fundamental feature, first observed in two-dimensional cuprates and subsequently in other three-dimensional correlated systems close to the Mott metal-insulator transition, played a pivotal role in the emergence of a non-FL (NFL) paradigm in higher dimensions D (>1). Motivated hereby, we explore the emergence of this "two relaxation rates" scenario in the Hubbard Falicov-Kimball model (HFKM) using the dynamical mean-field theory (DMFT). Specializing to D = 3, we find, beyond a critical Falicov-Kimball (FK) interaction, that two distinct relaxation rates governing distinct temperature (T) dependence of the longitudinal and Hall currents naturally emerge in the NFL metal. Our results show good accord with the experiment in V2-yO3 near the metal-to-insulator transition (MIT). We rationalize this surprising finding by an analytical analysis of the structure of charge and spin Hamiltonians in the underlying impurity problem, specifically through a bosonization method applied to the Wolff model and connecting it to the x-ray edge problem.

  5. Extended inflation from higher-dimensional theories

    International Nuclear Information System (INIS)

    Holman, R.; Kolb, E.W.; Vadas, S.L.; Wang, Y.

    1991-01-01

    We consider the possibility that higher-dimensional theories may, upon reduction to four dimensions, allow extended inflation to occur. We analyze two separate models. One is a very simple toy model consisting of higher-dimensional gravity coupled to a scalar field whose potential allows for a first-order phase transition. The other is a more sophisticated model incorporating the effects of nontrivial field configurations (monopole, Casimir, and fermion bilinear condensate effects) that yield a nontrivial potential for the radius of the internal space. We find that extended inflation does not occur in these models. We also find that the bubble nucleation rate in these theories is time dependent unlike the case in the original version of extended inflation

  6. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen out index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that sample features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed according to local conditions so as to reduce heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk from 2000 to 2009 against the actual and simulated data, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory with a flexible method, reasonable in result with a simple structure, and has strong logical superiority and regional adaptability, providing a new way for warning of water pollution risk.
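
    A minimal sketch of the Bayesian step described above, assuming a hypothetical three-level warning scale and made-up prior and likelihood values; the study's actual indexes, priors and hydrological simulations are not reproduced. The posterior is the normalized product of prior and likelihood, and the warning level is chosen by the maximum-probability rule.

    # All numbers below are hypothetical illustration values, not the study's data.
    levels = ["light", "moderate", "severe"]
    prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}       # assumed prior
    likelihood = {"light": 0.1, "moderate": 0.3, "severe": 0.8}  # P(observed index | level), assumed

    evidence = sum(prior[l] * likelihood[l] for l in levels)
    posterior = {l: prior[l] * likelihood[l] / evidence for l in levels}

    warning = max(posterior, key=posterior.get)  # maximum-probability rule
    print(posterior, warning)                    # 'severe' dominates in this example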

  7. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  8. Ideal flow theory for the double-shearing model as a basis for metal forming design

    Science.gov (United States)

    Alexandrov, S.; Trung, N. T.

    2018-02-01

    In the case of Tresca solids (i.e. solids obeying the Tresca yield criterion and its associated flow rule) ideal flows have been defined elsewhere as solenoidal smooth deformations in which an eigenvector field associated everywhere with the greatest principal stress (and strain rate) is fixed in the material. Under such conditions all material elements undergo paths of minimum plastic work, a condition which is often advantageous for metal forming processes. Therefore, the ideal flow theory is used as the basis of a procedure for the preliminary design of such processes. The present paper extends the theory of stationary planar ideal flow to pressure dependent materials obeying the double-shearing model and the double slip and rotation model. It is shown that the original problem of plasticity reduces to a purely geometric problem. The corresponding system of equations is hyperbolic. The characteristic relations are integrated in elementary functions. In regions where one family of characteristics is straight, mapping between the principal lines and Cartesian coordinates is determined by linear ordinary differential equations. An illustrative example is provided.

  9. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  10. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    Science.gov (United States)

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. © The Author(s) 2015.

  11. Conformal field theories, Coulomb gas picture and integrable models

    International Nuclear Information System (INIS)

    Zuber, J.B.

    1988-01-01

    The aim of the study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, and modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear, and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: all the various restrictions have not been identified, nor the modular invariants completely classified

  12. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent. Modeling trust needs to consider interaction history, recommendation, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects, service reliability, feedback effectiveness and recommendation credibility, to get a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective. The free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
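
    An illustrative aggregation of the three aspects named above into a single trust degree, with assumed weights and a toy stand-in for the game-theoretic punishment of free-riders; the paper's actual formulas and punishment mechanism are not reproduced here.

    # Weights and the punishment rule below are assumptions for illustration only.
    def trust_degree(service_reliability, feedback_effectiveness,
                     recommendation_credibility,
                     weights=(0.5, 0.3, 0.2), free_riding=False):
        w1, w2, w3 = weights
        score = (w1 * service_reliability
                 + w2 * feedback_effectiveness
                 + w3 * recommendation_credibility)
        if free_riding:  # toy stand-in for the game-theoretic punishment mechanism
            score *= 0.5
        return score

    print(trust_degree(0.9, 0.7, 0.6))                    # cooperative participant
    print(trust_degree(0.9, 0.7, 0.6, free_riding=True))  # punished free-rider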

  13. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    Science.gov (United States)

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  14. Modeling and Theories of Pathophysiology and Physiology of the Basal Ganglia–Thalamic–Cortical System: Critical Analysis

    Science.gov (United States)

    Montgomery Jr., Erwin B.

    2016-01-01

    Theories impact the movement disorders clinic, not only affecting the development of new therapies but determining how current therapies are used. Models are theories that are procedural rather than declarative. Theories and models are important because, as argued by Kant, one cannot know the thing-in-itself (das Ding an sich) and only a model is knowable. Further, biological variability forces higher level abstraction relevant for all variants. It is that abstraction that is the raison d’être of theories and models. Theories “connect the dots” to move from correlation to causation. The necessity of theory makes theories helpful or counterproductive. Theories and models of the pathophysiology and physiology of the basal ganglia–thalamic–cortical system do not spontaneously arise but have a history and consequently are legacies. Over the last 40 years, numerous theories and models of the basal ganglia have been proposed only to be forgotten or dismissed, rarely critiqued. It is not harsh to say that current popular theories positing increased neuronal activities in the Globus Pallidus Interna (GPi), excessive beta oscillations and increased synchronization not only fail to provide an adequate explication but are inconsistent with many observations. It is likely that their shared intellectual and epistemic inheritance is a factor in their shared failures. These issues are critically examined. How one is to derive theories and models, and how one can hope that these will be better, is explored as well. PMID:27708569

  15. Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons.

    Science.gov (United States)

    Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang

    2017-07-06

    Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk 2012 arXiv:1204.4470) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
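
    For a feel of the magnitudes involved, the sketch below uses the Highland/PDG parameterization of the characteristic multiple-scattering angle; this is a simpler stand-in for the Molière/Fano/Hanson calculation discussed in the record, and the proton energy, water thickness and radiation length are assumed example values.

    import math

    # Highland/PDG parameterization of the multiple-scattering angle (radians).
    def highland_theta0(kinetic_energy_mev, thickness_cm, rad_length_cm, charge=1.0):
        m_p = 938.272                         # proton mass, MeV/c^2
        e_total = kinetic_energy_mev + m_p
        pc = math.sqrt(e_total**2 - m_p**2)   # momentum times c, MeV
        beta = pc / e_total
        t = thickness_cm / rad_length_cm      # thickness in radiation lengths
        return (13.6 / (beta * pc)) * charge * math.sqrt(t) * (1 + 0.038 * math.log(t))

    # A 160 MeV proton through 10 cm of water (X0 ~ 36.1 cm): roughly 23 mrad.
    print(highland_theta0(160.0, 10.0, 36.1))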

  16. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  17. Developing an Asteroid Rotational Theory

    Science.gov (United States)

    Geis, Gena; Williams, Miguel; Linder, Tyler; Pakey, Donald

    2018-01-01

    The goal of this project is to develop an asteroid rotational theory from first principles. Starting at first principles provides a firm foundation for computer simulations which can be used to analyze multiple variables at once such as size, rotation period, tensile strength, and density. The initial theory will be presented along with early models of applying the theory to the asteroid population. Early results confirm previous work by Pravec et al. (2002) showing that the majority of the asteroids larger than 200 m have negligible tensile strength and have spin rates close to their critical breakup point. Additionally, results show that an object with zero tensile strength has a maximum rotational rate determined by the object’s density, not size. Therefore, an iron asteroid with a density of 8000 kg/m^3 would have a minimum spin period of 1.16 h if the only forces were gravitational and centrifugal. The short-term goal is to include material forces in the simulations to determine what tensile strength will allow the high spin rates of asteroids smaller than 150 m.
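
    The zero-tensile-strength limit quoted above can be reproduced with a short calculation: for a strengthless, self-gravitating sphere, equatorial gravity just balances the centrifugal acceleration at the critical spin, giving a minimum period that depends only on density, P_min = sqrt(3π / (G ρ)). The sketch below assumes this standard balance and recovers the 1.16 h figure for an 8000 kg/m^3 body.

    import math

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def min_spin_period_hours(density_kg_m3):
        # P_min = sqrt(3*pi / (G*rho)), independent of the object's size
        return math.sqrt(3.0 * math.pi / (G * density_kg_m3)) / 3600.0

    print(min_spin_period_hours(8000.0))  # ~1.17 h for an iron-density body
    print(min_spin_period_hours(2500.0))  # ~2.1 h for a typical rocky rubble pile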

  18. Two problems from the theory of semiotic control models. I. Representations of semiotic models

    Energy Technology Data Exchange (ETDEWEB)

    Osipov, G S

    1981-11-01

    Two problems from the theory of semiotic control models are stated: the representation of models and their semantic analysis. Algebraic representation of semiotic models, covering of representations, their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.

  19. Functional techniques in quantum field theory and two-dimensional models

    International Nuclear Information System (INIS)

    Souza, C. Farina de.

    1985-03-01

    Functional methods applied to Quantum Field Theory are studied. It is shown how to construct the Generating Functional using three of the most important methods existing in the literature, due to Feynman, Symanzik and Schwinger. The Axial Anomaly is discussed in the usual way, and a non-perturbative method due to Fujikawa to obtain this anomaly in the path integral formalism is presented. The ''Roskies-Shaposnik-Fujikawa method'', which makes use of Fujikawa's original idea to solve bidimensional models, is introduced in the Schwinger model, which, in turn, is applied to obtain the exact solution of the axial model. It is discussed briefly how different regularization procedures can affect the theory in question. (author)

  20. Mean field theory of nuclei and shell model. Present status and future outlook

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    2003-01-01

    Many of the recent topics in nuclear structure concern the problems of unstable nuclei. It has been revealed experimentally that nuclear halos and neutron skins, as well as cluster structures or molecule-like structures, can be present in unstable nuclei, and that magic numbers well established in stable nuclei occasionally disappear while new ones appear. The shell model based on the mean field approximation has been successfully applied to stable nuclei to explain nuclear structure quantitatively as a finite many-body system, and it is considered the standard model at present. Whether unstable nuclei can be understood on the same model basis is a matter related to the fundamental principles of nuclear structure theories. In this lecture, the fundamental concepts and the framework of the theory of nuclear structure based on the mean field theory and the shell model are presented to make clear the problems and to suggest directions for future research. First, fundamental properties of nuclei are described under the subtitles: saturation and magic numbers, nuclear force and effective interactions, nuclear matter, and LS splitting. Then the mean field theory is presented under the subtitles: the potential model, the mean field theory, Hartree-Fock approximation for nuclear matter, density dependent force, semiclassical mean field theory, mean field theory and symmetry, Skyrme interaction and density functional, density matrix expansion, finite range interactions, effective masses, and motion of center of mass. The subsequent section is devoted to the shell model with the subtitles: beyond the mean field approximation, core polarization, effective interaction of the shell model, one-particle wave function, nuclear deformation and the shell model, and the shell model of cross shell. Finally, the structure of unstable nuclei is discussed with the subtitles: general remarks on the study of unstable nuclear structure, asymptotic behavior of wave

  1. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords: Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  2. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  3. Multi-Agent Market Modeling of Foreign Exchange Rates

    Science.gov (United States)

    Zimmermann, Georg; Neuneier, Ralph; Grothmann, Ralph

    A market mechanism is basically driven by a superposition of decisions of many agents optimizing their profit. The economic price dynamic is a consequence of the cumulated excess demand/supply created on this micro level. The behavioral analysis of a small number of agents is well understood through game theory. In the case of a large number of agents one may use the limiting case that an individual agent has no influence on the market, which allows the aggregation of agents by statistical methods. In contrast to this restriction, we can omit the assumption of an atomic market structure if we model the market through a multi-agent approach. The contribution of the mathematical theory of neural networks to market price formation is mostly seen on the econometric side: neural networks allow the fitting of high-dimensional nonlinear dynamic models. Furthermore, in our opinion, there is a close relationship between economics and the modeling ability of neural networks because a neuron can be interpreted as a simple model of decision making. With this in mind, a neural network models the interaction of many decisions and, hence, can be interpreted as the price formation mechanism of a market.
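
    A toy sketch of the micro-to-macro mechanism described above, with the neural decision rule replaced by a simple price-sensitive random rule and an assumed adjustment speed; it only illustrates how cumulated excess demand can drive the price dynamic, not the neural-network market model itself.

    import random

    # Decision rule and adjustment speed are assumptions for illustration only.
    random.seed(1)

    def simulate(n_agents=100, n_steps=50, price=1.0, adjustment=0.001):
        history = [price]
        for _ in range(n_steps):
            # each agent buys (+1) or sells (-1), with buying less likely at high prices
            decisions = [1 if random.random() > price / 2.0 else -1 for _ in range(n_agents)]
            excess_demand = sum(decisions)
            price += adjustment * excess_demand  # price formation on the macro level
            history.append(price)
        return history

    print(simulate()[-5:])  # last few prices of one simulated intra-day path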

  4. Growing up and Role Modeling: A Theory in Iranian Nursing Students' Education

    OpenAIRE

    Nouri, Jamileh Mokhtari; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid

    2014-01-01

    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help to make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed by Glaserian's Grounded Theory methodology through semi-structured interviews with 7 faculty members, 2 nursing students; the three focus group discussions ...

  5. Consistent constraints on the Standard Model Effective Field Theory

    International Nuclear Information System (INIS)

    Berthier, Laure; Trott, Michael

    2016-01-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cutoff scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include as an illustrative example.

  6. Modeling Real Exchange Rate Persistence in Chile

    Directory of Open Access Journals (Sweden)

    Leonardo Salazar

    2017-07-01

    The long and persistent swings in the real exchange rate have for a long time puzzled economists. Recent models built on imperfect knowledge economics seem to provide a theoretical explanation for this persistence. Empirical results, based on a cointegrated vector autoregressive (CVAR) model, provide evidence of error-increasing behavior in prices and interest rates, which is consistent with the persistence observed in the data. The movements in the real exchange rate are compensated by movements in the interest rate spread, which restores the equilibrium in the product market when the real exchange rate moves away from its long-run benchmark value. Fluctuations in the copper price also explain the deviations of the real exchange rate from its long-run equilibrium value.

  7. A model of high-rate indentation of a cylindrical striking pin into a deformable body

    Science.gov (United States)

    Zalazinskaya, E. A.; Zalazinsky, A. G.

    2017-12-01

    Mathematical modeling is carried out of the impact and high-rate indentation, to a significant depth, of a flat-faced hard cylindrical striking pin into a massive deformable target body. Applying the kinematic extremum theorem of plasticity theory and the kinetic-energy variation theorem, the phase trajectories of the striking pin are calculated, and the initial velocity of the striking pin in the body, the limiting values of the inlet duct length, and the depth of pin penetration into the target are determined.

  8. 2 + 1 quantum gravity as a toy model for the 3 + 1 theory

    International Nuclear Information System (INIS)

    Ashtekar, A.; Husain, V.; Smolin, L.; Samuel, J.; Utah Univ., Salt Lake City, UT

    1989-01-01

    2 + 1 Einstein gravity is used as a toy model for testing a program for non-perturbative canonical quantisation of the 3 + 1 theory. The program can be successfully implemented in the model and leads to a surprisingly rich quantum theory. (author)

  9. Deformed type 0A matrix model and super-Liouville theory for fermionic black holes

    International Nuclear Information System (INIS)

    Ahn, Changrim; Kim, Chanju; Park, Jaemo; Suyama, Takao; Yamamoto, Masayoshi

    2006-01-01

    We consider a ĉ = 1 model in the fermionic black hole background. For this purpose we consider a model which contains both the N = 1 and the N = 2 super-Liouville interactions. We propose that this model is dual to a recently proposed type 0A matrix quantum mechanics model with vortex deformations. We support our conjecture by showing that non-perturbative corrections to the free energy computed by both the matrix model and the super-Liouville theories agree exactly by treating the N = 2 interaction as a small perturbation. We also show that a two-point function on the sphere calculated from the deformed type 0A matrix model is consistent with that of the N = 2 super-Liouville theory when the N = 1 interaction becomes small. This duality between the matrix model and super-Liouville theories leads to a conjecture for arbitrary n-point correlation functions of the N = 1 super-Liouville theory on the sphere.

  10. The Relationship between Tax Rate, Penalty Rate, Tax Fairness and Excise Duty Non-compliance.

    Directory of Open Access Journals (Sweden)

    Sinnasamy Perabavathi

    2017-01-01

    The rise of indirect tax non-compliance by taxpayers has become the main concern of most tax authorities around the globe. In Malaysia, non-compliance such as smuggling and illegal trade activities by importers involving cigarettes, liquor and imported vehicles bound under the Excise Act 1976 has caused revenue losses in monetary and non-monetary aspects. Therefore, the objective of this study is to examine the relationship of tax rate, penalty rate and tax fairness with excise duty non-compliance. This study uses Deterrence Theory as its basis theory to investigate the phenomenon of excise duty non-compliance. A total of 500 excise duty offenders throughout Malaysia responded to the survey. The model was empirically tested using Partial Least Squares (PLS) with a disproportionate stratified random sampling technique. The results indicated that the perceptions of tax rate and penalty rate are positively related, while tax fairness is negatively related, to excise duty non-compliance among importers.

  11. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
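
    The report's topic, generating independent samples of a stochastic model, can be illustrated with a minimal Python sketch that draws samples of a zero-mean Gaussian process by factorizing its covariance matrix; the exponential covariance function and all numbers here are assumptions for illustration only, not the algorithms documented in the report.

        import numpy as np

        def gaussian_process_samples(cov_fn, t, n_samples, seed=0):
            """Draw samples of a zero-mean Gaussian process on grid t via Cholesky factorization."""
            rng = np.random.default_rng(seed)
            cov = cov_fn(t[:, None], t[None, :])
            L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(t)))   # small jitter for numerical stability
            return L @ rng.standard_normal((len(t), n_samples))

        # exponential (Ornstein-Uhlenbeck-type) covariance with an assumed correlation length of 0.2
        t = np.linspace(0.0, 1.0, 200)
        samples = gaussian_process_samples(lambda s, u: np.exp(-np.abs(s - u) / 0.2), t, n_samples=5)
        print(samples.shape)   # (200, 5): five independent sample paths on the grid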

  12. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.
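
    A minimal Python sketch of the kind of stochastic simulation described above (replication, per-site escape mutation in a small epitope sequence space, and immune clearance of recognized epitopes) is given below; the genotype encoding, rates, and clearance rule are illustrative assumptions, not the authors' parameterization.

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(1)

        n_sites = 3       # epitope sites in the combined sequence space (illustrative)
        mu = 5e-3         # per-site escape-mutation probability per replication (assumed)
        r = 1.5           # mean offspring per surviving virus per generation (assumed)
        clearance = 0.6   # immune clearance pressure per wild-type (recognized) epitope (assumed)

        pop = Counter({(0,) * n_sites: 20})     # founder population: a single wild-type clone
        for generation in range(20):
            new_pop = Counter()
            for genotype, count in pop.items():
                recognized = genotype.count(0)                       # epitopes still seen by CTLs
                p_survive = max(0.0, 1.0 - clearance * recognized / n_sites)
                survivors = rng.binomial(count, p_survive)
                for k in rng.poisson(r, survivors):                  # offspring of each survivor
                    for _ in range(int(k)):
                        child = tuple(s ^ 1 if rng.random() < mu else s for s in genotype)
                        new_pop[child] += 1
            pop = new_pop
            if not pop:                                              # immune response cleared the infection
                break

        print("generation", generation, "->", dict(pop) if pop else "cleared")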

  13. Prospect Theory in the Heterogeneous Agent Model

    Czech Academy of Sciences Publication Activity Database

    Polach, J.; Kukačka, Jiří

    (2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords: Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf

  14. Model building with a dynamical volume element in gravity, particle theory and theories of extended object

    International Nuclear Information System (INIS)

    Guendelman, E.

    2004-01-01

    The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: 1. 4-D theories describing gravity and matter fields, 2. parametrization invariant theories of extended objects, and 3. higher dimensional theories including gravity and matter fields. In case 1, a large number of new effects appear: (i) spontaneous breaking of scale invariance associated to integration of degrees of freedom related to the measure, (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter, (iii) cosmic coincidence between dark energy and dark matter is natural, (iv) quintessence scenarios with automatic decoupling of the quintessence scalar to ordinary matter, but not dark matter, are obtained. (2) For theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension, (ii) string models of non-abelian confinement, (iii) the possibility of new Weyl-invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. (3) In brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to construct naturally models where only the extra dimensions get curved and the 4-D observable space-time remains flat.

  15. Anomaly-free gauges in superstring theory and double supersymmetric sigma-model

    International Nuclear Information System (INIS)

    Demichev, A.P.; Iofa, M.Z.

    1991-01-01

    The superharmonic gauge, which is a nontrivial analog of the harmonic gauge in bosonic string theory, is constructed for fermionic superstrings. In contrast to the conformal gauge, the harmonic gauge in bosonic string theory and the superharmonic gauge in superstring theory are shown to be free from the previously discovered BRST anomaly (in critical dimension) in higher orders of string perturbation theory and thus provide the setup for consistent quantization of (super)string theory. The superharmonic gauge appears to be closely connected with a supersymmetric σ-model whose target space is also a supermanifold. 28 refs

  16. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I will point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, one which could better deal with the different aspects of theory replacement. I will show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I will also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  17. Exact string theory model of closed timelike curves and cosmological singularities

    International Nuclear Information System (INIS)

    Johnson, Clifford V.; Svendsen, Harald G.

    2004-01-01

    We study an exact model of string theory propagating in a space-time containing regions with closed timelike curves (CTCs) separated from a finite cosmological region bounded by a big bang and a big crunch. The model is a nontrivial embedding of the Taub-NUT geometry into heterotic string theory with a full conformal field theory (CFT) definition, discovered over a decade ago as a heterotic coset model. Having a CFT definition makes this an excellent laboratory for the study of the stringy fate of CTCs, the Taub cosmology, and the Milne/Misner-type chronology horizon which separates them. In an effort to uncover the role of stringy corrections to such geometries, we calculate the complete set of α′ corrections to the geometry. We observe that the key features of Taub-NUT persist in the exact theory, together with the emergence of a region of space with Euclidean signature bounded by timelike curvature singularities. Although such remarks are premature, their persistence in the exact geometry is suggestive that string theory is able to make physical sense of the Milne/Misner singularities and the CTCs, despite their pathological character in general relativity. This may also support the possibility that CTCs may be viable in some physical situations, and may be a natural ingredient in pre-big bang cosmological scenarios.

  18. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and in computer and communication systems. • A chapter on ...

  19. Equivalence of interest rate models and lattice gases.

    Science.gov (United States)

    Pirjol, Dan

    2012-04-01

    We consider the class of short rate interest rate models for which the short rate is proportional to the exponential of a Gaussian Markov process x(t) in the terminal measure, r(t) = a(t) exp[x(t)]. These models include the Black-Derman-Toy and Black-Karasinski models in the terminal measure. We show that such interest rate models are equivalent to lattice gases with attractive two-body interaction, V(t1, t2) = -Cov[x(t1), x(t2)]. We consider in some detail the Black-Karasinski model with x(t) as an Ornstein-Uhlenbeck process, and show that it is similar to a lattice gas model considered by Kac and Helfand, with attractive long-range two-body interactions, V(x, y) = -α(e^(-γ|x-y|) - e^(-γ(x+y))). An explicit solution for the model is given as a sum over the states of the lattice gas, which is used to show that the model has a phase transition similar to that found previously in the Black-Derman-Toy model in the terminal measure.
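
    The mapping described above can be sketched numerically: the Python snippet below simulates a Black-Karasinski-type short rate r(t) = a(t) exp[x(t)] with x(t) an Ornstein-Uhlenbeck process and evaluates the "two-body interaction" as minus a sample covariance of x. The parameter values and the constant level function a are illustrative assumptions, not calibrated quantities from the paper.

        import numpy as np

        rng = np.random.default_rng(42)

        # Ornstein-Uhlenbeck driver x(t):  dx = -gamma * x dt + sigma dW   (illustrative parameters)
        gamma, sigma, dt, n_steps, n_paths = 1.0, 0.2, 1.0 / 252, 252, 10_000
        x = np.zeros((n_paths, n_steps + 1))
        for i in range(n_steps):
            x[:, i + 1] = x[:, i] - gamma * x[:, i] * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

        a = 0.03                       # level function a(t) taken constant here for simplicity
        r = a * np.exp(x)              # short rate r(t) = a(t) exp[x(t)]

        # the attractive "lattice gas" interaction is minus the covariance of x at two dates
        t1, t2 = 60, 200
        V = -np.cov(x[:, t1], x[:, t2])[0, 1]
        print(f"mean short rate at maturity: {r[:, -1].mean():.4%},  V(t1, t2) = {V:.5f}")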

  20. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Introduction: This study focuses on common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on the updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline by the PubMed interface, ERIC (Education Resources Information Center) and the main journal of medical education regarding current evaluation models and theories. We included all study designs in our study. We found 810 articles related to our topic; 63 articles with full text were finally included. We compared documents and used expert consensus for selecting the best model. Results: We found that complexity theory, using the logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. Common components of a logic model are: situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  1. Background field method in gauge theories and on non-linear sigma models

    International Nuclear Information System (INIS)

    van de Ven, A.E.M.

    1986-01-01

    This dissertation constitutes a study of the ultraviolet behavior of gauge theories and two-dimensional nonlinear sigma-models by means of the background field method. After a general introduction in chapter 1, chapter 2 presents algorithms which generate the divergent terms in the effective action at one-loop for arbitrary quantum field theories in flat spacetime of dimension d ≤ 11. It is demonstrated that global N = 1 supersymmetric Yang-Mills theory in six dimensions is one-loop UV-finite. Chapter 3 presents an algorithm which produces the divergent terms in the effective action at two-loops for renormalizable quantum field theories in a curved four-dimensional background spacetime. Chapter 4 presents a study of the two-loop UV-behavior of two-dimensional bosonic and supersymmetric non-linear sigma-models which include a Wess-Zumino-Witten term. It is found that, to this order, supersymmetric models on quasi-Ricci flat spaces are UV-finite and the β-functions for the bosonic model depend only on torsionful curvatures. Chapter 5 summarizes a superspace calculation of the four-loop β-function for two-dimensional N = 1 and N = 2 supersymmetric non-linear sigma-models. It is found that besides the one-loop contribution which vanishes on Ricci-flat spaces, the β-function receives four-loop contributions which do not vanish in the Ricci-flat case. Implications for superstrings are discussed. Chapters 6 and 7 treat the details of these calculations.

  2. Modeling Ignition of a Heptane Isomer: Improved Thermodynamics, Reaction Pathways, Kinetic, and Rate Rule Optimizations for 2-Methylhexane

    KAUST Repository

    Mohamed, Samah; Cai, Liming; Khaled, Fathi; Banyon, Colin; Wang, Zhandong; Rachidi, Mariam El; Pitsch, Heinz; Curran, Henry J.; Farooq, Aamir; Sarathy, Mani

    2016-01-01

    Accurate chemical kinetic combustion models of lightly branched alkanes (e.g., 2-methylalkanes) are important to investigate the combustion behavior of real fuels. Improving the fidelity of existing kinetic models is a necessity, as new experiments and advanced theories show inaccuracies in certain portions of the models. This study focuses on updating thermodynamic data and the kinetic reaction mechanism for a gasoline surrogate component, 2-methylhexane, based on recently published thermodynamic group values and rate rules derived from quantum calculations and experiments. Alternative pathways for the isomerization of peroxy-alkylhydroperoxide (OOQOOH) radicals are also investigated. The effects of these updates are compared against new high-pressure shock tube and rapid compression machine ignition delay measurements. It is shown that rate constant modifications are required to improve agreement between kinetic modeling simulations and experimental data. We further demonstrate the ability to optimize the kinetic model using both manual and automated techniques for rate parameter tunings to improve agreement with the measured ignition delay time data. Finally, additional low temperature chain branching reaction pathways are shown to improve the model’s performance. The present approach to model development provides better performance across extended operating conditions while also strengthening the fundamental basis of the model.

  3. Modeling Ignition of a Heptane Isomer: Improved Thermodynamics, Reaction Pathways, Kinetic, and Rate Rule Optimizations for 2-Methylhexane

    KAUST Repository

    Mohamed, Samah

    2016-03-21

    Accurate chemical kinetic combustion models of lightly branched alkanes (e.g., 2-methylalkanes) are important to investigate the combustion behavior of real fuels. Improving the fidelity of existing kinetic models is a necessity, as new experiments and advanced theories show inaccuracies in certain portions of the models. This study focuses on updating thermodynamic data and the kinetic reaction mechanism for a gasoline surrogate component, 2-methylhexane, based on recently published thermodynamic group values and rate rules derived from quantum calculations and experiments. Alternative pathways for the isomerization of peroxy-alkylhydroperoxide (OOQOOH) radicals are also investigated. The effects of these updates are compared against new high-pressure shock tube and rapid compression machine ignition delay measurements. It is shown that rate constant modifications are required to improve agreement between kinetic modeling simulations and experimental data. We further demonstrate the ability to optimize the kinetic model using both manual and automated techniques for rate parameter tunings to improve agreement with the measured ignition delay time data. Finally, additional low temperature chain branching reaction pathways are shown to improve the model’s performance. The present approach to model development provides better performance across extended operating conditions while also strengthening the fundamental basis of the model.

  4. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...

  5. Perturbation theory around the Wess-Zumino-Witten model

    International Nuclear Information System (INIS)

    Hasseln, H. v.

    1991-05-01

    We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW model is shown to have the same β-function (at least to order g^2) as the fermionic theory with a four-fermion interaction. (orig.)

  6. Forecasting the mortality rates using Lee-Carter model and Heligman-Pollard model

    Science.gov (United States)

    Ibrahim, R. I.; Ngataman, N.; Abrisam, W. N. A. Wan Mohd

    2017-09-01

    Improvement in life expectancies has driven further declines in mortality. The sustained reduction in mortality rates and its systematic underestimation have been attracting significant interest from researchers in recent years because of their potential impact on population size and structure, social security systems, and (from an actuarial perspective) the life insurance and pensions industry worldwide. Among all forecasting methods, the Lee-Carter model has been widely accepted by the actuarial community and the Heligman-Pollard model has been widely used by researchers in modelling and forecasting future mortality. Therefore, this paper focuses only on the Lee-Carter and Heligman-Pollard models. The main objective of this paper is to investigate how accurately these two models perform using Malaysian data. Since these models involve nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 8.0 (MATLAB 8.0) software is used to estimate the parameters of the models. An Autoregressive Integrated Moving Average (ARIMA) procedure is applied to acquire the forecasted parameters for both models, and the forecasted mortality rates are obtained by using all the values of the forecasted parameters. To investigate the accuracy of the estimation, the forecasted results are compared against actual mortality data. The results indicate that both models provide better results for the male population. However, for the elderly female population, the Heligman-Pollard model seems to underestimate the mortality rates while the Lee-Carter model seems to overestimate them.
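
    For reference, a common way to fit the Lee-Carter structure log m(x,t) = a_x + b_x k_t is a rank-1 singular value decomposition of the centered log mortality surface; the Python sketch below shows this on synthetic data (the Malaysian data, normalisation details, and ARIMA forecasting step of the paper are not reproduced here).

        import numpy as np

        def fit_lee_carter(log_m):
            """Rank-1 SVD fit of the Lee-Carter model: log m(x,t) ~= a_x + b_x * k_t."""
            a = log_m.mean(axis=1)                        # age effect a_x
            U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
            b, k = U[:, 0], s[0] * Vt[0]                  # leading age/period components
            scale = b.sum()
            return a, b / scale, k * scale                # conventional normalisation: sum(b_x) = 1

        # synthetic demo data: 10 age groups x 30 years of declining mortality (illustrative only)
        ages, years = 10, 30
        rng = np.random.default_rng(3)
        true_a = np.linspace(-6.0, -2.0, ages)
        trend = np.linspace(0.0, -1.0, years)
        log_m = true_a[:, None] + np.linspace(0.05, 0.15, ages)[:, None] * trend \
                + 0.01 * rng.standard_normal((ages, years))

        a, b, k = fit_lee_carter(log_m)
        print(b.sum(), k[:3])   # b sums to 1; k_t carries the downward mortality trend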

  7. Models construction for acetone-butanol-ethanol fermentations with acetate/butyrate consecutively feeding by graph theory.

    Science.gov (United States)

    Li, Zhigang; Shi, Zhongping; Li, Xin

    2014-05-01

    Several fermentations with consecutive feeding of acetate/butyrate were conducted in a 7 L fermentor, and the results indicated that exogenous acetate/butyrate enhanced solvent productivities by 47.1% and 39.2%, respectively, and changed butyrate/acetate ratios greatly. Extracellular butyrate/acetate ratios were then utilized for calculation of acid formation rates, and the results revealed that the acetate and butyrate formation pathways were almost blocked by feeding of the corresponding acids. In addition, models for acetate/butyrate feeding fermentations were constructed by graph theory based on the calculation results and relevant reports. Solvent concentrations and butanol/acetone ratios of these fermentations were also calculated, and the model calculations matched the fermentation data accurately, which demonstrated that the models were constructed in a reasonable way. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Theories and models of motor control and motor learning: clinical applications in neuro-rehabilitation.

    Science.gov (United States)

    Cano-de-la-Cuerda, R; Molero-Sánchez, A; Carratalá-Tejada, M; Alguacil-Diego, I M; Molina-Rueda, F; Miangolarra-Page, J C; Torricelli, D

    2015-01-01

    In recent decades there has been special interest in theories that could explain the regulation of motor control, and in their applications. These theories are often based on models of brain function, philosophically reflecting different criteria on how movement is controlled by the brain, each emphasising different neural components of movement. The concept of motor learning, regarded as the set of internal processes associated with practice and experience that produce relatively permanent changes in the ability to perform motor activities through a specific skill, is also relevant in the context of neuroscience. Thus, both motor control and motor learning are seen as key fields of study for health professionals in the field of neuro-rehabilitation. The major theories of motor control are described, which include motor programming theory, systems theory, the theory of dynamic action, and the theory of parallel distributed processing, as well as the factors that influence motor learning and its applications in neuro-rehabilitation. At present there is no consensus on which theory or model best explains how motor control is regulated. Theories of motor learning should be the basis for motor rehabilitation. New research should apply the knowledge generated in the fields of motor control and motor learning to neuro-rehabilitation. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier España. All rights reserved.

  9. Extended inflation from higher dimensional theories

    International Nuclear Information System (INIS)

    Holman, R.; Kolb, E.W.; Vadas, S.L.; Wang, Yun.

    1990-04-01

    The possibility is considered that higher dimensional theories may, upon reduction to four dimensions, allow extended inflation to occur. Two separate models are analyzed. One is a very simple toy model consisting of higher dimensional gravity coupled to a scalar field whose potential allows for a first-order phase transition. The other is a more sophisticated model incorporating the effects of non-trivial field configurations (monopole, Casimir, and fermion bilinear condensate effects) that yield a non-trivial potential for the radius of the internal space. It was found that extended inflation does not occur in these models. It was also found that the bubble nucleation rate in these theories is time dependent, unlike the case in the original version of extended inflation.

  10. I can do that: the impact of implicit theories on leadership role model effectiveness.

    Science.gov (United States)

    Hoyt, Crystal L; Burnette, Jeni L; Innella, Audrey N

    2012-02-01

    This research investigates the role of implicit theories in influencing the effectiveness of successful role models in the leadership domain. Across two studies, the authors test the prediction that incremental theorists ("leaders are made") compared to entity theorists ("leaders are born") will respond more positively to being presented with a role model before undertaking a leadership task. In Study 1, measuring people's naturally occurring implicit theories of leadership, the authors showed that after being primed with a role model, incremental theorists reported greater leadership confidence and less anxious-depressed affect than entity theorists following the leadership task. In Study 2, the authors demonstrated the causal role of implicit theories by manipulating participants' theory of leadership ability. They replicated the findings from Study 1 and demonstrated that identification with the role model mediated the relationship between implicit theories and both confidence and affect. In addition, incremental theorists outperformed entity theorists on the leadership task.

  11. Theory for the three-dimensional Mercedes-Benz model of water

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-11-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
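
    A minimal Python sketch of a pair interaction in the spirit of the 3D MB model described above (a Lennard-Jones term plus a tetrahedral Gaussian hydrogen-bond term) is shown below; the functional form of the angular part and every parameter value are illustrative assumptions rather than the published potential.

        import numpy as np

        # Illustrative parameters only -- not the values used in the paper.
        EPS_LJ, SIG_LJ = 0.1, 1.0                 # Lennard-Jones well depth and diameter
        EPS_HB, R_HB, SIG_HB = 1.0, 1.0, 0.085    # H-bond strength, optimal length, width

        def lj(r):
            x = (SIG_LJ / r) ** 6
            return 4.0 * EPS_LJ * (x * x - x)

        def hb(r_vec, arms_i, arms_j):
            """Tetrahedral Gaussian hydrogen-bond term; arms_* are (4, 3) unit arm vectors."""
            r = np.linalg.norm(r_vec)
            u = r_vec / r
            # bond is strongest when an arm of i points along u and an arm of j points against u
            align = np.max(np.outer(arms_i @ u, -(arms_j @ u)))
            radial = np.exp(-((r - R_HB) ** 2) / (2.0 * SIG_HB ** 2))
            return -EPS_HB * radial * max(align, 0.0)

        def pair_energy(r_vec, arms_i, arms_j):
            return lj(np.linalg.norm(r_vec)) + hb(r_vec, arms_i, arms_j)

        # two waters at the optimal H-bond distance with fixed (not optimised) orientations
        tetra = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3.0)
        print(pair_energy(np.array([1.0, 0.0, 0.0]), tetra, -tetra))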

  12. Theory for the three-dimensional Mercedes-Benz model of water.

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A

    2009-11-21

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.

  13. Boltzmann, Darwin and Directionality theory

    Energy Technology Data Exchange (ETDEWEB)

    Demetrius, Lloyd A., E-mail: ldemetr@oeb.harvard.edu

    2013-09-01

    Boltzmann’s statistical thermodynamics is a mathematical theory which relates the macroscopic properties of aggregates of interacting molecules with the laws of their interaction. The theory is based on the concept of thermodynamic entropy, a statistical measure of the extent to which energy is spread throughout macroscopic matter. Macroscopic evolution of material aggregates is quantitatively explained in terms of the principle: Thermodynamic entropy increases as the composition of the aggregate changes under molecular collision. Darwin’s theory of evolution is a qualitative theory of the origin of species and the adaptation of populations to their environment. A central concept in the theory is fitness, a qualitative measure of the capacity of an organism to contribute to the ancestry of future generations. Macroscopic evolution of populations of living organisms can be qualitatively explained in terms of a neo-Darwinian principle: Fitness increases as the composition of the population changes under variation and natural selection. Directionality theory is a quantitative model of the Darwinian argument of evolution by variation and selection. This mathematical theory is based on the concept of evolutionary entropy, a statistical measure which describes the rate at which an organism appropriates energy from the environment and reinvests this energy into survivorship and reproduction. According to directionality theory, microevolutionary dynamics, that is evolution by mutation and natural selection, can be quantitatively explained in terms of a directionality principle: Evolutionary entropy increases when the resources are diverse and of constant abundance; but decreases when the resource is singular and of variable abundance. This report reviews the analytical and empirical support for directionality theory, and invokes the microevolutionary dynamics of variation and selection to delineate the principles which govern the macroevolutionary dynamics of speciation and extinction.

  14. Boltzmann, Darwin and Directionality theory

    International Nuclear Information System (INIS)

    Demetrius, Lloyd A.

    2013-01-01

    Boltzmann’s statistical thermodynamics is a mathematical theory which relates the macroscopic properties of aggregates of interacting molecules with the laws of their interaction. The theory is based on the concept of thermodynamic entropy, a statistical measure of the extent to which energy is spread throughout macroscopic matter. Macroscopic evolution of material aggregates is quantitatively explained in terms of the principle: Thermodynamic entropy increases as the composition of the aggregate changes under molecular collision. Darwin’s theory of evolution is a qualitative theory of the origin of species and the adaptation of populations to their environment. A central concept in the theory is fitness, a qualitative measure of the capacity of an organism to contribute to the ancestry of future generations. Macroscopic evolution of populations of living organisms can be qualitatively explained in terms of a neo-Darwinian principle: Fitness increases as the composition of the population changes under variation and natural selection. Directionality theory is a quantitative model of the Darwinian argument of evolution by variation and selection. This mathematical theory is based on the concept of evolutionary entropy, a statistical measure which describes the rate at which an organism appropriates energy from the environment and reinvests this energy into survivorship and reproduction. According to directionality theory, microevolutionary dynamics, that is evolution by mutation and natural selection, can be quantitatively explained in terms of a directionality principle: Evolutionary entropy increases when the resources are diverse and of constant abundance; but decreases when the resource is singular and of variable abundance. This report reviews the analytical and empirical support for directionality theory, and invokes the microevolutionary dynamics of variation and selection to delineate the principles which govern the macroevolutionary dynamics of speciation and extinction.

  15. An Econometric Diffusion Model of Exchange Rate Movements within a Band - Implications for Interest Rate Differential and Credibility of Exchange Rate Policy

    OpenAIRE

    Rantala, Olavi

    1992-01-01

    The paper presents a model of exchange rate movements within a specified exchange rate band enforced by central bank interventions. The model is based on the empirical observation that the exchange rate has usually been strictly inside the band, at least in Finland. In this model the distribution of the exchange rate is truncated lognormal from the edges towards the center of the band and hence quite different from the bimodal distribution of the standard target zone model. The model is estima...
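
    A stylised simulation conveys the idea of a rate kept strictly inside a band by interventions; in the Python sketch below the log exchange rate follows a random walk that is reflected at the band edges. All numbers are chosen for illustration only, and this is not the paper's estimated diffusion model.

        import numpy as np

        rng = np.random.default_rng(7)

        center, half_width = np.log(4.50), 0.03        # band around a central parity (illustrative)
        lower, upper = center - half_width, center + half_width

        log_s = np.full(20_000, center)
        for _ in range(250):
            log_s = log_s + 0.004 * rng.standard_normal(log_s.size)   # daily log-rate innovation
            log_s = np.where(log_s > upper, 2 * upper - log_s, log_s)  # intervention at the upper edge
            log_s = np.where(log_s < lower, 2 * lower - log_s, log_s)  # intervention at the lower edge

        print(f"5th-95th percentile of the simulated rate: "
              f"{np.exp(np.percentile(log_s, 5)):.3f} - {np.exp(np.percentile(log_s, 95)):.3f}")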

  16. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
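
    As a toy illustration of the game-theoretic driver interaction described above, the following Python snippet sets up a 2x2 lane-changing game and enumerates its pure-strategy Nash equilibria; the payoff numbers are assumptions, not taken from the model in the paper.

        import numpy as np

        # Strategies: 0 = stay in lane, 1 = attempt a lane change.
        # payoff[i][a, b] = utility of driver i when driver 1 plays a and driver 2 plays b.
        payoff = [
            np.array([[0.0, 0.0], [2.0, -5.0]]),   # driver 1: gains by changing, conflict if both change
            np.array([[0.0, 2.0], [0.0, -5.0]]),   # driver 2
        ]

        def pure_nash(payoff):
            equilibria = []
            for a in range(2):
                for b in range(2):
                    best1 = payoff[0][a, b] >= payoff[0][1 - a, b]   # driver 1 cannot gain by deviating
                    best2 = payoff[1][a, b] >= payoff[1][a, 1 - b]   # driver 2 cannot gain by deviating
                    if best1 and best2:
                        equilibria.append((a, b))
            return equilibria

        print(pure_nash(payoff))   # [(0, 1), (1, 0)]: exactly one of the two drivers changes lane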

  17. New model for assessing dose and dose rate sensitivity of Gamma ray radiation loss in polarization maintaining optical fibers

    International Nuclear Information System (INIS)

    Zhang Hongchen; Liu Hai; Qiao Wenqiang; Xue Huijie; He Shiyu

    2012-01-01

    Highlights: ► Building a new phenomenological theory model to investigate the relation about the irradiation induced loss with irradiation dose and dose rate. ► The Gamma ray irradiation induced loss of the “Capsule” type and “Panda” type polarization maintaining optical fibers at 1310 nm wavelength are investigated. ► The anti irradiation performance of the “Panda” type polarization maintaining optical fiber is better than that of the “Capsule” type polarization maintaining optical fiber, the reason is that the stress region doped by GeO 2 . - Abstract: The Gamma ray irradiation induced loss of the “Capsule” type and “Panda” type polarization maintaining optical fibers at 1310 nm wavelength are investigated. A phenomenological theory model is introduced and the influence of irradiation dose and dose rate on the irradiation induced loss is discussed. The phenomenological theoretical results are consistent with the experimental results of the irradiation induced loss for the two types of polarization maintaining optical fibers. The anti irradiation performance of the “Panda” type polarization maintaining optical fiber is better than that of the “Capsule” type polarization maintaining optical fiber, the reason is that the stress region dope with GeO 2 . Meanwhile, both of the polarization maintaining optical fiber irradiation induced loss increase with increasing the irradiation dose. In the case of same dose, the high dose rate Gamma ray irradiation induced optical fiber losses are higher than that of the low dose rate.

  18. Modelling machine ensembles with discrete event dynamical system theory

    Science.gov (United States)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
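
    The ingredients of a DEDS local model listed above (states, event alphabet, initial state, partial transition function, and event durations) can be written down directly; the Python sketch below uses hypothetical states and timings for a single submachine, purely for illustration.

        # A minimal DEDS "local model" for one submachine; names and numbers are illustrative.
        local_model = {
            "states": {"idle", "holding", "moving"},
            "events": {"grip", "move", "release"},
            "initial": "idle",
            # partial function: (state, event) -> next state
            "delta": {
                ("idle", "grip"): "holding",
                ("holding", "move"): "moving",
                ("moving", "release"): "idle",
            },
            # time required for each event to occur (seconds, assumed)
            "duration": {"grip": 2.0, "move": 5.0, "release": 1.0},
        }

        def run(model, event_sequence):
            """Execute an event sequence, tracking the state and the accumulated time."""
            state, clock = model["initial"], 0.0
            for e in event_sequence:
                nxt = model["delta"].get((state, e))
                if nxt is None:
                    raise ValueError(f"event {e!r} not enabled in state {state!r}")
                state, clock = nxt, clock + model["duration"][e]
            return state, clock

        print(run(local_model, ["grip", "move", "release"]))   # ('idle', 8.0)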

  19. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  20. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  1. Halo modelling in chameleon theories

    International Nuclear Information System (INIS)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations

  2. Risk Route Choice Analysis and the Equilibrium Model under Anticipated Regret Theory

    Directory of Open Access Journals (Sweden)

    pengcheng yuan

    2014-02-01

    The assumption about travellers' route choice behaviour has a major influence on traffic flow equilibrium analysis. Previous studies of travellers' route choice were mainly based on expected utility maximization theory. However, with gradually increasing knowledge about the uncertainty of the transportation system, researchers have realized that expected utility maximization theory is significantly constrained, because it requires travellers to be ‘absolutely rational’; in fact, travellers are not truly ‘absolutely rational’. Anticipated regret theory proposes an alternative framework to the traditional treatment of risk-taking in route choice behaviour which might be more scientific and reasonable. We have applied anticipated regret theory to the analysis of the risky route choosing process and constructed an anticipated regret utility function. Using a simple case with two parallel routes, the route choosing results as influenced by the risk aversion degree, the regret degree and the environment risk degree have been analyzed. Moreover, the user equilibrium model based on anticipated regret theory has been established. The equivalence and the uniqueness of the model are proved; an efficacious algorithm is also proposed to solve the model. Both the model and the algorithm are demonstrated in a real network. In an experiment, the model results and the real data have been compared. It was found that the model results can be similar to the real data if a proper regret degree parameter is selected. This illustrates that the model can better explain risky route choosing behaviour. Moreover, it was also found that the traveller's regret degree increases when the environment becomes more and more risky.
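
    A hedged sketch of an anticipated-regret route comparison is given below: the disutility of a route is its travel time plus a convex penalty on how much faster the foregone route turns out to be. The regret function, weights, and travel-time distributions are illustrative assumptions, not the utility function constructed in the paper.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000

        # Two parallel routes (travel times in minutes); numbers are illustrative only.
        t_a = rng.normal(30.0, 2.0, n)                               # route A: slower but reliable
        risky = rng.random(n) < 0.2                                  # route B: usually fast, sometimes congested
        t_b = np.where(risky, rng.normal(45.0, 5.0, n), rng.normal(26.0, 2.0, n))

        def regret_utility(chosen, other, regret_weight=2.0, scale=10.0):
            """Disutility = travel time + convex penalty on anticipated regret."""
            regret = np.maximum(chosen - other, 0.0)                 # regret when the foregone route was faster
            return -(chosen + regret_weight * np.expm1(regret / scale))

        print(f"mean times:       A {t_a.mean():.1f} min, B {t_b.mean():.1f} min")
        print(f"regret utilities: A {regret_utility(t_a, t_b).mean():.2f}, "
              f"B {regret_utility(t_b, t_a).mean():.2f}")
        # Route B is faster on average, yet a regret-averse traveller prefers the reliable route A.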

  3. Theory of positive disintegration as a model of adolescent development.

    Science.gov (United States)

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of the adolescent development based on the theory of positive disintegration combined with theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost a half century ago, still attracts psychologists' and educators' attention, and is extensively applied into studies of gifted and talented people. The positive disintegration is the mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail and it is proposed that they represent behaviour of early, middle and late periods of adolescence. In the discussion, recent research on the adolescent brain development is included.

  4. Baryon and lepton number violation in the Weinberg-Salam theory

    International Nuclear Information System (INIS)

    Mottola, E.

    1989-01-01

    This report discusses the concept of baryon and lepton number violation in the Weinberg-Salam theory. The topics discussed are: periodic vacua in quantum mechanics; tunnelling at finite temperature and classical thermal activation; calculation of the rate; an O(3) nonlinear sigma model; and the transition rate in the O(3) model

  5. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  6. Mixmaster cosmological model in theories of gravity with a quadratic Lagrangian

    International Nuclear Information System (INIS)

    Barrow, J.D.; Sirousse-Zia, H.

    1989-01-01

    We use the method of matched asymptotic expansions to examine the behavior of the vacuum Bianchi type-IX mixmaster universe in a gravity theory derived from a purely quadratic gravitational Lagrangian. The chaotic behavior characteristic of the general-relativistic mixmaster model disappears and the asymptotic behavior is of the monotonic, nonchaotic form found in the exactly soluble Bianchi type-I models of the quadratic theory. The asymptotic behavior far from the singularity is also found to be of monotonic nonchaotic type

  7. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate at infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is greatly enlarged and contains new results and additional sections in the different chapters as well as one new chapter.

  8. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    Science.gov (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…

  9. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  10. A spatial Mankiw-Romer-Weil model: Theory and evidence

    OpenAIRE

    Fischer, Manfred M.

    2009-01-01

    This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...

  11. Comparison of potential models through heavy quark effective theory

    International Nuclear Information System (INIS)

    Amundson, J.F.

    1995-01-01

    I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions

  12. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
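
    A rough numerical sketch may help to see how replacing the constant sapwood-to-leaf-area ratio with a conductivity-to-leaf-area ratio changes the allocation logic. The snippet below is not the published model; the allocation rule, function names and parameter values are illustrative assumptions only (Python).

```python
# Minimal sketch, assuming a Hagen-Poiseuille-like conductivity (~ n * r^4) and a
# simple proportional allocation rule; not the parameterization of the paper.
def pipe_conductivity(n_pipes, radius):
    return n_pipes * radius ** 4

def allocate_carbon(c_available, leaf_area, n_pipes, radius, target_ratio,
                    c_per_leaf_area=1.0, c_per_pipe=0.5):
    """Split available carbon between leaves and pipes so that
    conductivity / leaf_area is steered toward target_ratio."""
    ratio = pipe_conductivity(n_pipes, radius) / leaf_area
    deficit = (target_ratio - ratio) / target_ratio          # > 0 means too little conductivity
    frac_to_pipes = min(1.0, max(0.0, 0.5 + 0.5 * deficit))  # favour pipes when conductivity lags
    c_pipes = frac_to_pipes * c_available
    c_leaves = c_available - c_pipes
    return leaf_area + c_leaves / c_per_leaf_area, n_pipes + c_pipes / c_per_pipe

leaf_area, n_pipes = 10.0, 200.0
for year in range(5):
    leaf_area, n_pipes = allocate_carbon(5.0, leaf_area, n_pipes,
                                         radius=0.01, target_ratio=2e-7)
    print(year, round(leaf_area, 2), round(n_pipes, 1))
```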

  13. The spin-s quantum Heisenberg ferromagnetic models in the physical magnon theory

    International Nuclear Information System (INIS)

    Liu, B.-G.; Pu, F.-C.

    2001-01-01

    The spin-s quantum Heisenberg ferromagnetic model is investigated in the physical magnon theory. The effect of the extra unphysical magnon states on every site is completely removed in the magnon Hamiltonian and during the approximation procedure, so that the condition ⟨(a_i^†)^n (a_i)^n⟩ = 0 (n ≥ 2s+1) is rigorously satisfied. The physical multi-magnon occupancy ⟨(a_i^†)^n (a_i)^n⟩ (1 ≤ n ≤ 2s) is proportional to T^(3n/2) at low temperature and equals 1/(2s+1) at the Curie temperature. A magnetization that is not only unified but also well-behaved from zero temperature to the Curie temperature is obtained in the framework of the magnon theory for the spin-s quantum Heisenberg ferromagnetic model. The ill-behaved magnetizations at high temperature in earlier magnon theories are completely corrected. The relation of magnon (spin wave) theory to spin-operator decoupling theory is clearly understood

  14. Quantum analysis of Jackiw and Teitelboim's model for (1+1)D gravity and topological gauge theory

    International Nuclear Information System (INIS)

    Terao, Haruhiko

    1993-01-01

    We study the BRST quantization of the (1+1)-dimensional gravity model proposed by Jackiw and Teitelboim and also the topological gauge model which is equivalent to the gravity model at least classically. The gravity model quantized in the light-cone gauge is found to be a free theory with a nilpotent BRST charge. We show also that there exist twisted N=2 superconformal algebras in the Jackiw-Teitelboim model as well as in the topological gauge model. We discuss the quantum equivalence between the gravity theory and the topological gauge theory. It is shown that these theories are indeed equivalent to each other in the light-cone gauge. (orig.)

  15. Molecular evolutionary rates are not correlated with temperature and latitude in Squamata: an exception to the metabolic theory of ecology?

    Science.gov (United States)

    Rolland, Jonathan; Loiseau, Oriane; Romiguier, Jonathan; Salamin, Nicolas

    2016-05-20

    The metabolic theory of ecology stipulates that molecular evolutionary rates should correlate with temperature and latitude in ectothermic organisms. Previous studies have shown that most groups of vertebrates, such as amphibians, turtles and even endothermic mammals, have higher molecular evolutionary rates in regions where temperature is high. However, the association between molecular evolutionary rates and temperature or latitude has never been tested in Squamata. We used a large dataset including the spatial distributions and environmental variables for 1,651 species of Squamata and compared the contrasts of the rates of molecular evolution with the contrasts of temperature and latitude between sister species. Using major axis regressions and a new algorithm to choose independent sister species pairs, we found that temperature and absolute latitude were not associated with molecular evolutionary rates. This absence of association in such a diverse ectothermic group questions the mechanisms explaining the current pattern of species diversity in Squamata and challenges the presupposed universality of the metabolic theory of ecology.
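
    The analysis style described here (contrasts between sister-species pairs compared with a major axis regression) can be sketched in a few lines. The data below are invented and the snippet is only meant to show the mechanics, not to reproduce the study.

```python
# Illustrative sketch: sister-species contrasts of substitution rate regressed
# against contrasts of temperature using a major axis (Model II) regression.
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 50
temp_contrast = rng.normal(0.0, 3.0, n_pairs)   # delta temperature between sister species (invented)
rate_contrast = rng.normal(0.0, 1.0, n_pairs)   # delta molecular rate (no true association here)

def major_axis_slope(x, y):
    """Slope of the first principal axis of the (x, y) scatter."""
    cov = np.cov(x, y)
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = eigvecs[:, np.argmax(eigvals)]           # eigenvector of the largest eigenvalue
    return v[1] / v[0]

slope = major_axis_slope(temp_contrast, rate_contrast)
r = np.corrcoef(temp_contrast, rate_contrast)[0, 1]
print(f"major-axis slope = {slope:.3f}, correlation = {r:.3f}")
# An absence of association (as reported for Squamata) shows up as a correlation
# near zero rather than as any particular slope value.
```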

  16. Cross sectional efficient estimation of stochastic volatility short rate models

    NARCIS (Netherlands)

    Danilov, Dmitri; Mandal, Pranab K.

    2002-01-01

    We consider the problem of estimation of term structure of interest rates. Filtering theory approach is very natural here with the underlying setup being non-linear and non-Gaussian. Earlier works make use of Extended Kalman Filter (EKF). However, the EKF in this situation leads to inconsistent

  17. Complexity in quantum field theory and physics beyond the standard model

    International Nuclear Information System (INIS)

    Goldfain, Ervin

    2006-01-01

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε^(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model

  18. Complexity in quantum field theory and physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Goldfain, Ervin [OptiSolve Consulting, 4422 Cleveland Road, Syracuse, NY 13215 (United States)

    2006-05-15

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε^(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model.

  19. A measurement theory of illusory conjunctions.

    Science.gov (United States)

    Prinzmetal, William; Ivry, Richard B; Beck, Diane; Shimizu, Naomi

    2002-04-01

    Illusory conjunctions refer to the incorrect perceptual combination of correctly perceived features, such as color and shape. Research on the phenomenon has been hampered by the lack of a measurement theory that accounts for guessing features, as well as the incorrect combination of correctly perceived features. Recently, several investigators have suggested using multinomial models as a tool for measuring feature integration. The authors examined the adequacy of these models in 2 experiments by testing whether model parameters reflect changes in stimulus factors. In a third experiment, confidence ratings were used as a tool for testing the model. Multinomial models accurately reflected both variations in stimulus factors and observers' trial-by-trial confidence ratings.
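
    The multinomial approach referred to here amounts to writing each trial outcome as a branch of a probability tree whose parameters separate feature perception, feature binding and guessing. The toy partition below is in that spirit only; it is not the authors' parameterization, and all names and values are assumptions.

```python
# Toy multinomial-processing-tree sketch: with probability alpha the target's
# features are bound correctly; a correctly perceived (or luckily guessed) feature
# that is misbound shows up as an illusory-conjunction-like response.
def trial_probabilities(alpha, p_feature, n_alternatives=3):
    """Return P(correct report), P(illusory conjunction), P(feature error) for one trial."""
    guess = 1.0 / n_alternatives
    perceived = p_feature + (1 - p_feature) * guess     # reported feature value is correct
    correct_report = alpha * perceived                  # correct feature, correctly bound
    illusory_conjunction = (1 - alpha) * perceived      # correct feature, misbound to wrong object
    feature_error = 1.0 - perceived                     # reported feature value is wrong
    return correct_report, illusory_conjunction, feature_error

for alpha in (1.0, 0.8):
    print(alpha, [round(p, 3) for p in trial_probabilities(alpha, p_feature=0.9)])
```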

  20. A model of clearance rate regulation in mussels

    Science.gov (United States)

    Fréchette, Marcel

    2012-10-01

    Clearance rate regulation has been modelled as an instantaneous response to food availability, independent of the internal state of the animals. This view is incompatible with latent effects during ontogeny and phenotypic flexibility in clearance rate. Internal-state regulation of clearance rate is required to account for these patterns. Here I develop a model of internal-state based regulation of clearance rate. External factors such as suspended sediments are included in the model. To assess the relative merits of instantaneous regulation and internal-state regulation, I modelled blue mussel clearance rate and growth using a DEB model. In the usual standard feeding module, feeding is governed by a Holling's Type II response to food concentration. In the internal-state feeding module, gill ciliary activity and thus clearance rate are driven by internal reserve level. Factors such as suspended sediments were not included in the simulations. The two feeding modules were compared on the basis of their ability to capture the impact of latent effects, of environmental heterogeneity in food abundance and of physiological flexibility on clearance rate and individual growth. The Holling feeding module was unable to capture the effect of any of these sources of variability. In contrast, the internal-state feeding module did so without any modification or ad hoc calibration. Latent effects, however, appeared transient. With simple annual variability in temperature and food concentration, the relationship between clearance rate and food availability predicted by the internal-state feeding module was quite similar to that observed in Norwegian fjords. I conclude that in contrast with the usual Holling feeding module, internal-state regulation of clearance rate is consistent with well-documented growth and clearance rate patterns.
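
    The contrast between the two feeding modules can be made concrete with a toy time-stepping sketch: the Holling Type II rule responds only to current food, whereas the internal-state rule throttles clearance as reserves fill up. Functional forms, names and numbers below are illustrative assumptions, not the calibrated DEB model.

```python
def holling_type_ii(food, half_sat=1.0):
    """Instantaneous scaled functional response; depends only on current food."""
    return food / (food + half_sat)

def internal_state_clearance(reserve, reserve_max=1.0, cr_max=5.0):
    """Clearance rate driven by reserve level: high when reserves are depleted,
    throttled as reserves approach their maximum."""
    return cr_max * (1.0 - reserve / reserve_max)

food, reserve, dt = 0.5, 0.6, 0.1
for step in range(5):
    intake_holling = holling_type_ii(food)                    # constant for fixed food
    intake_state = internal_state_clearance(reserve) * food   # changes as reserves change
    reserve += dt * (intake_state - 0.3 * reserve)            # crude assimilation minus use
    print(step, round(intake_holling, 3), round(intake_state, 3), round(reserve, 3))
```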

  1. Interacting bosons model and relation with BCS theory

    International Nuclear Information System (INIS)

    Diniz, R.

    1990-01-01

    The Nambu mechanism for BCS theory is extended with inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of this model are discussed. (author)

  2. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
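
    For readers unfamiliar with the classical risk process, a naive Monte Carlo estimate of the finite-horizon ruin probability (compound Poisson claims, constant premium, constant interest on the surplus) is sketched below. This is only an illustration of the quantity being bounded, not the bounding method of the thesis; all parameter values are assumptions.

```python
import math
import random

def ruin_probability(u0=10.0, premium=1.2, claim_rate=1.0, mean_claim=1.0,
                     interest=0.02, horizon=100.0, n_paths=5000, seed=1):
    """Fraction of simulated surplus paths that drop below zero before the horizon."""
    random.seed(seed)
    ruins = 0
    for _ in range(n_paths):
        u, t = u0, 0.0
        while t < horizon:
            wait = random.expovariate(claim_rate)                 # time to the next claim
            growth = math.exp(interest * wait)
            u = u * growth + premium * (growth - 1.0) / interest  # surplus and premiums accrue interest
            u -= random.expovariate(1.0 / mean_claim)             # exponential claim size
            t += wait
            if u < 0.0:
                ruins += 1
                break
    return ruins / n_paths

print(ruin_probability())
```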

  3. Learning more by being taught less: A "time-for-self-study" theory explaining curricular effects on graduation rate and study duration

    NARCIS (Netherlands)

    H.G. Schmidt (Henk); J. Cohen-Schotanus (Janke); H.T. van der Molen (Henk); T.A.W. Splinter (Ted); C. van den Bulte (Christophe); R. Holdrinet (Rob); H.J.M. van Rossum (Herman)

    2010-01-01

    In this article, an alternative for Tinto's integration theory of student persistence is proposed and tested. In the proposed theory, time available for individual study is considered a major determinant of both study duration and graduation rate of students in a particular curriculum.

  4. A thermostatted kinetic theory model for event-driven pedestrian dynamics

    Science.gov (United States)

    Bianca, Carlo; Mogno, Caterina

    2018-06-01

    This paper is devoted to modeling pedestrian dynamics by means of the thermostatted kinetic theory. Specifically, the microscopic interactions among pedestrians and an external force field are modeled to simulate the evacuation of pedestrians from a metro station. The fundamentals of stochastic game theory and the thermostatted kinetic theory are coupled to derive a specific mathematical model which depicts the time evolution of the distribution of pedestrians at the different exits of a metro station. Perturbation theory is employed to establish the stability analysis of the nonequilibrium stationary states in the case of a metro station consisting of two exits. A general sensitivity analysis on the initial conditions, the magnitude of the external force field and the number of exits is presented by means of numerical simulations, which show in particular how the asymptotic distribution and the convergence time are affected by the presence of an external force field. The results show how, in evacuation conditions, the interaction dynamics among pedestrians can be negligible with respect to the external force. The important role of the thermostat term in allowing the nonequilibrium stationary state to be reached is stressed. Research perspectives are outlined at the end of the paper, in particular concerning the derivation of frameworks that take into account local external actions and introduce space and velocity dynamics.

  5. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...

  6. Theory and experiments in model-based space system anomaly management

    Science.gov (United States)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  7. Modeling of transient ionizing radiation effects in bipolar devices at high dose-rates

    International Nuclear Information System (INIS)

    FJELDLY, T.A.; DENG, Y.; SHUR, M.S.; HJALMARSON, HAROLD P.; MUYSHONDT, ARNOLDO

    2000-01-01

    To optimally design circuits for operation at high intensities of ionizing radiation, and to accurately predict their behavior under radiation, precise device models are needed that include both stationary and dynamic effects of such radiation. Depending on the type and intensity of the ionizing radiation, different degradation mechanisms, such as the photoelectric effect, total dose effect, or single event upset, might be dominant. In this paper, the authors consider the photoelectric effect associated with the generation of electron-hole pairs in the semiconductor. The effects of low radiation intensity on p-n diodes and bipolar junction transistors (BJTs) were described by low-injection theory in the classical paper by Wirth and Rogers. However, in BJTs compatible with modern integrated circuit technology, high-resistivity regions are often used to enhance device performance, either as a substrate or as an epitaxial layer such as the low-doped n-type collector region of the device. Using low-injection theory, the transient response of epitaxial BJTs was discussed by Florian et al., who mainly concentrated on the effects of the Hi-Lo (high doping - low doping) epilayer/substrate junction of the collector, and on geometrical effects of realistic devices. For devices with highly resistive regions, the assumption of low-level injection is often inappropriate, even at moderate radiation intensities, and a more complete theory for high-injection levels was needed. In the dynamic photocurrent model by Enlow and Alexander, p-n junctions exposed to high-intensity radiation were considered. In their work, the variation of the minority carrier lifetime with excess carrier density, and the effects of the ohmic electric field in the quasi-neutral (q-n) regions were included in a simplified manner. Later, Wunsch and Axness presented a more comprehensive model for the transient radiation response of p-n and p-i-n diode geometries. A stationary model for high-level injection in p

  8. A multi-species exchange model for fully fluctuating polymer field theory simulations.

    Science.gov (United States)

    Düchs, Dominik; Delaney, Kris T; Fredrickson, Glenn H

    2014-11-07

    Field-theoretic models have been used extensively to study the phase behavior of inhomogeneous polymer melts and solutions, both in self-consistent mean-field calculations and in numerical simulations of the full theory capturing composition fluctuations. The models commonly used can be grouped into two categories, namely, species models and exchange models. Species models involve integrations of functionals that explicitly depend on fields originating both from species density operators and their conjugate chemical potential fields. In contrast, exchange models retain only linear combinations of the chemical potential fields. In the two-component case, development of exchange models has been instrumental in enabling stable complex Langevin (CL) simulations of the full complex-valued theory. No comparable stable CL approach has yet been established for field theories of the species type. Here, we introduce an extension of the exchange model to an arbitrary number of components, namely, the multi-species exchange (MSE) model, which greatly expands the classes of soft material systems that can be accessed by the complex Langevin simulation technique. We demonstrate the stability and accuracy of the MSE-CL sampling approach using numerical simulations of triblock and tetrablock terpolymer melts, and tetrablock quaterpolymer melts. This method should enable studies of a wide range of fluctuation phenomena in multiblock/multi-species polymer blends and composites.

  9. Dual-process models of health-related behaviour and cognition: a review of theory.

    Science.gov (United States)

    Houlihan, S

    2018-03-01

    The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. Review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory), and in this way theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories. Namely, the reviewer considered and reconsidered individual theories and theoretical components in generating the narrative synthesis' main findings. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped on the basis of theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part on their treatment of an individual as an irrational actor, as social actor, as actor in a physical environment or as a self-regulated actor. Synthesising identified theories into a general dual-process model of health-related behaviour indicated that such behaviour is the result of both propositional and unconscious reasoning driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli) as well as

  10. Two-dimensional sigma models: modelling non-perturbative effects of gauge theories

    International Nuclear Information System (INIS)

    Novikov, V.A.; Shifman, M.A.; Vainshtein, A.I.; Zakharov, V.I.

    1984-01-01

    The review is devoted to a discussion of non-perturbative effects in gauge theories and two-dimensional sigma models. The main emphasis is put on the supersymmetric O(3) sigma model. The instanton-based method for calculating the exact Gell-Mann-Low function and bifermionic condensate is considered in detail. All aspects of the method are discussed under simplifying conditions. The basic points are: the instanton measure from purely classical analysis; a non-renormalization theorem in self-dual external fields; existence of vacuum condensates and their compatibility with supersymmetry

  11. The biopsychosocial model and its potential for a new theory of homeopathy.

    Science.gov (United States)

    Schmidt, Josef M

    2012-04-01

    Since the nineteenth century the theory of conventional medicine has been developed in close alignment with the mechanistic paradigm of the natural sciences. Only in the twentieth century were occasional attempts made to (re)introduce the 'subject' into medical theory, as by Thure von Uexküll (1908-2004), who elaborated the so-called biopsychosocial model of the human being, trying to understand the patient as a unit of organic, mental, and social dimensions of life. Although widely neglected by conventional medicine, it is one of the most coherent, significant, and up-to-date models of medicine at present. Being torn between strict adherence to Hahnemann's original conceptualization and alienation caused by contemporary scientific criticism, homeopathy today still lacks a generally accepted, consistent, and definitive theory which would explain in scientific terms its strength, peculiarity, and principles without relapsing into biomedical reductionism. The biopsychosocial model of the human being implies great potential for a new theory of homeopathy, as may be demonstrated with some typical examples. Copyright © 2012. Published by Elsevier Ltd.

  12. Fluid analog model for boundary effects in field theory

    International Nuclear Information System (INIS)

    Ford, L. H.; Svaiter, N. F.

    2009-01-01

    Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.

  13. Cross sectional efficient estimation of stochastic volatility short rate models

    NARCIS (Netherlands)

    Danilov, Dmitri; Mandal, Pranab K.

    2001-01-01

    We consider the problem of estimation of term structure of interest rates. Filtering theory approach is very natural here with the underlying setup being non-linear and non-Gaussian. Earlier works make use of Extended Kalman Filter (EKF). However, as indicated by de Jong (2000), the EKF in this

  14. Non-Higgsable clusters for 4D F-theory models

    International Nuclear Information System (INIS)

    Morrison, David R.; Taylor, Washington

    2015-01-01

    We analyze non-Higgsable clusters of gauge groups and matter that can arise at the level of geometry in 4D F-theory models. Non-Higgsable clusters seem to be generic features of F-theory compactifications, and give rise naturally to structures that include the nonabelian part of the standard model gauge group and certain specific types of potential dark matter candidates. In particular, there are nine distinct single nonabelian gauge group factors, and only five distinct products of two nonabelian gauge group factors with matter, including SU(3)×SU(2), that can be realized through 4D non-Higgsable clusters. There are also more complicated configurations involving more than two gauge factors; in particular, the collection of gauge group factors with jointly charged matter can exhibit branchings, loops, and long linear chains.

  15. Multistate cohort models with proportional transfer rates

    DEFF Research Database (Denmark)

    Schoen, Robert; Canudas-Romo, Vladimir

    2006-01-01

    We present a new, broadly applicable approach to summarizing the behavior of a cohort as it moves through a variety of statuses (or states). The approach is based on the assumption that all rates of transfer maintain a constant ratio to one another over age. We present closed-form expressions of transfer rates. The two living state case and hierarchical multistate models with any number of living states are analyzed in detail. Applying our approach to 1997 U.S. fertility data, we find that observed rates of parity progression are roughly proportional over age. Our proportional transfer rate approach provides trajectories by parity state and facilitates analyses of the implications of changes in parity rate levels and patterns. More women complete childbearing at parity 2 than at any other parity, and parity 2 would be the modal parity in models with total fertility rates (TFRs) of 1.40 to 2...
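
    The proportionality assumption is easy to illustrate: if the transfer-rate schedule at every age is a fixed multiple of a baseline schedule, the whole trajectory is generated from one shape and one level. The two-state sketch below uses invented numbers purely to show the mechanics, not the paper's data.

```python
# Illustrative two-living-state projection under proportional transfer rates.
import numpy as np

ages = np.arange(15, 50)
baseline = 0.08 * np.exp(-0.5 * ((ages - 27) / 6.0) ** 2)   # baseline transfer-rate schedule (invented)
k = 1.3                                                      # constant proportionality factor

def share_remaining(rate_schedule):
    """Share still in state 1 (e.g. parity 0) at each age, continuous-time survival."""
    return np.exp(-np.cumsum(rate_schedule))

p_base = share_remaining(baseline)
p_scaled = share_remaining(k * baseline)                     # scaled rates keep the same age shape
print(f"still in state 1 at age 49: baseline {p_base[-1]:.3f}, scaled {p_scaled[-1]:.3f}")
```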

  16. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...

  17. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the ...

  18. S matrix theory of the massive Thirring model

    International Nuclear Information System (INIS)

    Berg, B.

    1980-01-01

    The S matrix theory of the massive Thirring model, describing the exact quantum scattering of solitons and their boundstates, is reviewed. Treated are: Factorization equations and their solution, boundstates, generalized Jost functions and Levinson's theorem, scattering of boundstates, 'virtual' and anomalous thresholds. (orig.)

  19. Introduction of the transtheoretical model and organisational development theory in weight management: A narrative review.

    Science.gov (United States)

    Wu, Ya-Ke; Chu, Nain-Feng

    2015-01-01

    Overweight and obesity are serious public health and medical problems among children and adults worldwide. Behavioural change has demonstrably contributed to weight management programs, and behavioural change-based weight loss programs require a theoretical framework. We review the transtheoretical model and the organisational development theory in weight management. The transtheoretical model is an individual-level behaviour theory frequently used in weight management programs. The organisational development theory is a more complicated behaviour theory that applies to behavioural change at the system level. Both theories have their respective strengths and weaknesses. In this manuscript, we introduce the transtheoretical model and the organisational development theory in the context of weight loss programs among populations that are overweight or obese. Ultimately, we wish to present a new framework/strategy for weight management by integrating these two theories. Copyright © 2015 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  20. Learning more by being taught less : a "time-for-self-study" theory explaining curricular effects on graduation rate and study duration

    NARCIS (Netherlands)

    Schmidt, H.G.; Cohen-Schotanus, J.; van der Molen, H.T.; Splinter, T.A.W.; Bulte, J.; Holdrinet, R.; van Rossum, H.J.M.

    In this article, an alternative for Tinto's integration theory of student persistence is proposed and tested. In the proposed theory, time available for individual study is considered a major determinant of both study duration and graduation rate of students in a particular curriculum. In this view,

  1. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
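
    As one concrete example of the kind of IRT model used for ordered item scores, a minimal implementation of Samejima's graded response model is sketched below; the tutorial itself covers model choice more broadly, and the parameter values here are illustrative.

```python
# Graded response model: category probabilities for one polytomous item.
import math

def grm_category_probs(theta, discrimination, thresholds):
    """P(score = k | theta) for k = 0..K, with K = len(thresholds)."""
    def p_star(b):
        # probability of scoring at or above the threshold b
        return 1.0 / (1.0 + math.exp(-discrimination * (theta - b)))
    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

probs = grm_category_probs(theta=0.5, discrimination=1.4, thresholds=[-1.0, 0.0, 1.2])
print([round(p, 3) for p in probs], "sum =", round(sum(probs), 3))
```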

  2. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  3. The elastic transfer model of angular rate modulation in F1-ATPase stalling and controlled rotation experiments

    Science.gov (United States)

    Volkán-Kacsó, S.

    2017-06-01

    The recent experimental, theoretical and computational advances in the field of F1-ATPase single-molecule microscopy are briefly surveyed. The role of theory is revealed in the statistical analysis, interpretation and prediction of single-molecule experimental trajectories, and in linking them with atomistic simulations. In particular, a theoretical model of elastically coupled molecular group transfer is reviewed and a detailed method for its application in stalling and controlled rotation experiments is provided. It is shown how the model can predict, using previous experiments, the rates of ligand binding/release processes (steps) and their exponential dependence on rotor angle in these experiments. The concept of Brønsted slopes is reviewed in the context of the single-molecule experiments, and the rate versus rotor angle relations are explained using the elastic model. These experimental data are treated in terms of the effect of thermodynamic driving forces on the rates, assuming that the rotor shaft is elastically coupled to stator ring subunits in which the steps occur. In applying the group transfer model over an extended angular range, processes leading up to the transfer are discussed. Implications for large-scale atomistic simulation are suggested for the treatment of torque-generating steps.

  4. Halbwachs and Durkheim: a test of two theories of suicide.

    Science.gov (United States)

    Travis, R

    1990-06-01

    The social integration hypothesis forms the basis of this study. It was first asserted by Durkheim in late nineteenth-century France and many of his assumptions are based on a social disorganizational model. This model tended to equate social change with the breakdown of social control, and many of Durkheim's notions about anomie are derived from this view of industrial society. Halbwachs, on the other hand, proposed a social psychological theory of suicide. His model specifies more clearly the conditions under which lack of social integration may induce suicide. This study shows that among a population in transition, the Alaska Natives, the suicide rate was explained by the Halbwachsian model at least as well as the Durkheimian one and sometimes better. The Durkheimian model is shown to reflect a Cartesian dualism, which accounts only for that which is observable, thus making for biased studies of suicide. Moreover, psychopathological research confirms the Halbwachsian model. These findings restore the social isolation theory, once long neglected, to its rightful place among theories of suicide and open up an important field for researchers seeking to understand high rates of suicide.

  5. A General Framework for Portfolio Theory. Part I: theory and various models

    OpenAIRE

    Maier-Paape, Stanislaus; Zhu, Qiji Jim

    2017-01-01

    Utility and risk are two often competing measurements on the investment success. We show that efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model, [W. F. Sharpe, Mutual fund performance , 1966] are spe...

  6. Computer Support of Groups: Theory-Based Models for GDSS Research

    OpenAIRE

    V. Srinivasan Rao; Sirkka L. Jarvenpaa

    1991-01-01

    Empirical research in the area of computer support of groups is characterized by inconsistent results across studies. This paper attempts to reconcile the inconsistencies by linking the ad hoc reasoning in the studies to existing theories of communication, minority influence and human information processing. Contingency models are then presented based on the theories discussed. The paper concludes by discussing the linkages between the current work and other recently published integrations of...

  7. Direct calculation of ice homogeneous nucleation rate for a molecular model of water

    Science.gov (United States)

    Haji-Akbari, Amir; Debenedetti, Pablo G.

    2015-01-01

    Ice formation is ubiquitous in nature, with important consequences in a variety of environments, including biological cells, soil, aircraft, transportation infrastructure, and atmospheric clouds. However, its intrinsic kinetics and microscopic mechanism are difficult to discern with current experiments. Molecular simulations of ice nucleation are also challenging, and direct rate calculations have only been performed for coarse-grained models of water. For molecular models, only indirect estimates have been obtained, e.g., by assuming the validity of classical nucleation theory. We use a path sampling approach to perform, to our knowledge, the first direct rate calculation of homogeneous nucleation of ice in a molecular model of water. We use TIP4P/Ice, the most accurate among existing molecular models for studying ice polymorphs. By using a novel topological approach to distinguish different polymorphs, we are able to identify a freezing mechanism that involves a competition between cubic and hexagonal ice in the early stages of nucleation. In this competition, the cubic polymorph takes over because the addition of new topological structural motifs consistent with cubic ice leads to the formation of more compact crystallites. This is not true for topological hexagonal motifs, which give rise to elongated crystallites that are not able to grow. This leads to transition states that are rich in cubic ice, and not the thermodynamically stable hexagonal polymorph. This mechanism provides a molecular explanation for the earlier experimental and computational observations of the preference for cubic ice in the literature. PMID:26240318

  8. Category Theory as a Formal Mathematical Foundation for Model-Based Systems Engineering

    KAUST Repository

    Mabrok, Mohamed; Ryan, Michael J.

    2017-01-01

    In this paper, we introduce Category Theory as a formal foundation for model-based systems engineering. A generalised view of the system based on category theory is presented, where any system can be considered as a category. The objects

  9. Political pressures and exchange rate stability in emerging market economies

    OpenAIRE

    Ester Faia; Massimo Giuliodori; Michele Ruta

    2008-01-01

    This paper presents a political economy model of exchange rate policy. The theory is based on a common agency approach with rational expectations. Financial and exporter lobbies exert political pressures to influence the government’s choice of exchange rate policy, before shocks to the economy are realized. The model shows that political pressures affect exchange rate policy and create an over-commitment to exchange rate stability. This helps to rationalize the empirical evidence on fear of l...

  10. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    International Nuclear Information System (INIS)

    Wells, James

    2015-01-01

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations of Higgs boson observables will be presented to the community, detailing just how well the various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, to discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective Lagrangian, new physics theories will be developed that explain the anomaly and put it into a more

  11. A new k-epsilon model consistent with Monin-Obukhov similarity theory

    DEFF Research Database (Denmark)

    van der Laan, Paul; Kelly, Mark C.; Sørensen, Niels N.

    2017-01-01

    A new k-ε model is introduced that is consistent with Monin–Obukhov similarity theory (MOST). The proposed k-ε model is compared with another k-ε model that was developed in an attempt to maintain inlet profiles compatible with MOST. It is shown that the previous k-ε model is not consistent with ...
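
    For reference, the Monin–Obukhov surface-layer profile that a MOST-consistent k-ε model should sustain is u(z) = (u*/κ)[ln(z/z0) − ψm(z/L)]. The sketch below evaluates that profile with common Dyer/Businger-type stability functions; the parameter values are illustrative, and this is not the paper's turbulence model itself.

```python
# Monin-Obukhov mean-wind profile with standard stability corrections.
import math

KAPPA = 0.40  # von Karman constant

def psi_m(zeta):
    """Integrated stability correction for momentum (Dyer/Businger-type forms)."""
    if zeta >= 0:                          # stable stratification
        return -5.0 * zeta
    x = (1.0 - 16.0 * zeta) ** 0.25        # unstable stratification
    return (2.0 * math.log((1 + x) / 2) + math.log((1 + x * x) / 2)
            - 2.0 * math.atan(x) + math.pi / 2)

def most_wind(z, u_star=0.4, z0=0.05, L=200.0):
    """Mean wind speed at height z for friction velocity u_star, roughness z0, Obukhov length L."""
    return (u_star / KAPPA) * (math.log(z / z0) - psi_m(z / L))

for z in (10, 40, 80):
    print(z, round(most_wind(z), 2), "m/s")
```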

  12. Compositional models and conditional independence in evidence theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Vejnarová, Jiřina

    2011-01-01

    Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords : Evidence theory * Conditional independence * multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf

  13. Confirmation of linear system theory prediction: Rate of change of Herrnstein's κ as a function of response-force requirement

    Science.gov (United States)

    McDowell, J. J; Wood, Helena M.

    1985-01-01

    Four human subjects worked on all combinations of five variable-interval schedules and five reinforcer magnitudes (¢/reinforcer) in each of two phases of the experiment. In one phase the force requirement on the operandum was low (1 or 11 N) and in the other it was high (25 or 146 N). Estimates of Herrnstein's κ were obtained at each reinforcer magnitude. The results were: (1) response rate was more sensitive to changes in reinforcement rate at the high than at the low force requirement, (2) κ increased from the beginning to the end of the magnitude range for all subjects at both force requirements, (3) the reciprocal of κ was a linear function of the reciprocal of reinforcer magnitude for seven of the eight data sets, and (4) the rate of change of κ was greater at the high than at the low force requirement by an order of magnitude or more. The second and third findings confirm predictions made by linear system theory, and replicate the results of an earlier experiment (McDowell & Wood, 1984). The fourth finding confirms a further prediction of the theory and supports the theory's interpretation of conflicting data on the constancy of Herrnstein's κ. PMID:16812408
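
    The reciprocal-linearity prediction in findings (2) and (3) is straightforward to check numerically: under Herrnstein's hyperbola R = k·r/(r + re), linear system theory implies that 1/k is a linear function of 1/(reinforcer magnitude). The snippet below runs that check on invented k estimates; the real estimates are in the paper.

```python
# Check whether 1/k is linear in 1/magnitude (illustrative numbers only).
import numpy as np

magnitude = np.array([0.5, 1.0, 2.0, 5.0, 10.0])      # reinforcer magnitude (invented)
k_est = np.array([55.0, 80.0, 105.0, 128.0, 140.0])   # fitted kappa at each magnitude (invented)

x = 1.0 / magnitude
y = 1.0 / k_est
slope, intercept = np.polyfit(x, y, 1)                # least-squares straight line
pred = slope * x + intercept
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(f"1/k = {slope:.4f} * (1/magnitude) + {intercept:.4f},  R^2 = {r2:.3f}")
# A near-perfect straight line is the pattern reported for seven of the eight data sets.
```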

  14. Confirmation of linear system theory prediction: Rate of change of Herrnstein's kappa as a function of response-force requirement.

    Science.gov (United States)

    McDowell, J J; Wood, H M

    1985-01-01

    Four human subjects worked on all combinations of five variable-interval schedules and five reinforcer magnitudes (¢/reinforcer) in each of two phases of the experiment. In one phase the force requirement on the operandum was low (1 or 11 N) and in the other it was high (25 or 146 N). Estimates of Herrnstein's kappa were obtained at each reinforcer magnitude. The results were: (1) response rate was more sensitive to changes in reinforcement rate at the high than at the low force requirement, (2) kappa increased from the beginning to the end of the magnitude range for all subjects at both force requirements, (3) the reciprocal of kappa was a linear function of the reciprocal of reinforcer magnitude for seven of the eight data sets, and (4) the rate of change of kappa was greater at the high than at the low force requirement by an order of magnitude or more. The second and third findings confirm predictions made by linear system theory, and replicate the results of an earlier experiment (McDowell & Wood, 1984). The fourth finding confirms a further prediction of the theory and supports the theory's interpretation of conflicting data on the constancy of Herrnstein's kappa.

  15. Theory-based Bayesian models of inductive learning and reasoning.

    Science.gov (United States)

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.

  16. The Simplest Unified Growth Theory

    DEFF Research Database (Denmark)

    Strulik, Holger; Weisdorf, Jacob Louis

    This paper provides a unified growth theory, i.e. a model that explains the very long-run economic and demographic development path of industrialized economies, stretching from the pre-industrial era to present-day and beyond. Making strict use of Malthus' (1798) so-called preventive check hypothesis - that fertility rates vary inversely with the price of food - the current study offers a new and straightforward explanation for the demographic transition and the break with the Malthusian era. The current framework lends support to existing unified growth theories and is well in tune ...
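
    The preventive-check mechanism invoked here can be caricatured in a few lines: with land fixed, the food price rises with population, fertility falls with the price, and population settles where births balance deaths. This is a toy illustration only, not the paper's model; all functional forms and numbers are assumptions.

```python
# Toy Malthusian preventive-check dynamics (illustrative parameters only).
def simulate(periods=30, pop=1.0, land=1.0):
    history = []
    for _ in range(periods):
        food_price = pop / land                             # price rises as fixed land is shared
        birth_rate = max(0.0, 0.06 - 0.03 * food_price)     # preventive check: fertility falls with price
        death_rate = 0.02
        pop *= (1 + birth_rate - death_rate)
        history.append((round(pop, 3), round(food_price, 3), round(birth_rate, 4)))
    return history

for pop, price, births in simulate()[-3:]:
    print(pop, price, births)   # converges toward the level where births equal deaths
```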

  17. Nurses' intention to leave: critically analyse the theory of reasoned action and organizational commitment model.

    Science.gov (United States)

    Liou, Shwu-Ru

    2009-01-01

    To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.

  18. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
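
    The sampling-based strategy described here can be illustrated on a toy problem: each epistemic input is a set of focal intervals with basic probability assignments, the model is sampled within every joint focal element to approximate its output range, and the belief and plausibility of an event are accumulated from the products of the masses. The model function, inputs and threshold below are invented for illustration.

```python
# Sampling-based propagation of an evidence-theory (Dempster-Shafer) input specification.
import random

def model(a, b):
    return a * b + 0.5 * a          # stand-in for an expensive simulation code

focal_a = [(0.0, 0.5, 0.4), (0.3, 1.0, 0.6)]   # (lower, upper, mass) focal elements for input a
focal_b = [(1.0, 2.0, 0.7), (1.5, 3.0, 0.3)]   # focal elements for input b
threshold = 1.5
n_samples = 200
random.seed(0)

belief = plausibility = 0.0
for la, ua, ma in focal_a:
    for lb, ub, mb in focal_b:
        outputs = [model(random.uniform(la, ua), random.uniform(lb, ub))
                   for _ in range(n_samples)]
        lo, hi = min(outputs), max(outputs)     # approximate output range on this joint focal element
        mass = ma * mb
        if lo > threshold:                      # event certain on this focal element -> counts toward belief
            belief += mass
        if hi > threshold:                      # event possible on this focal element -> counts toward plausibility
            plausibility += mass

print(f"Bel(output > {threshold}) ~ {belief:.2f}, Pl(output > {threshold}) ~ {plausibility:.2f}")
```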

  19. Categories of relations as models of quantum theory

    Directory of Open Access Journals (Sweden)

    Chris Heunen

    2015-11-01

    Categories of relations over a regular category form a family of models of quantum theory. Using regular logic, many properties of relations over sets lift to these models, including the correspondence between Frobenius structures and internal groupoids. Over compact Hausdorff spaces, this lifting gives continuous symmetric encryption. Over a regular Mal'cev category, this correspondence gives a characterization of categories of completely positive maps, enabling the formulation of quantum features. These models are closer to Hilbert spaces than relations over sets in several respects: Heisenberg uncertainty, impossibility of broadcasting, and behavedness of rank one morphisms.

  20. Queues and Lévy fluctuation theory

    CERN Document Server

    Dębicki, Krzysztof

    2015-01-01

    The book provides an extensive introduction to queueing models driven by Lévy-processes as well as a systematic account of the literature on Lévy-driven queues. The objective is to make the reader familiar with the wide set of probabilistic techniques that have been developed over the past decades, including transform-based techniques, martingales, rate-conservation arguments, change-of-measure, importance sampling, and large deviations. On the application side, it demonstrates how Lévy traffic models arise when modelling current queueing-type systems (as communication networks) and includes applications to finance. Queues and Lévy Fluctuation Theory will appeal to graduate/postgraduate students and researchers in mathematics, computer science, and electrical engineering. Basic prerequisites are probability theory and stochastic processes.