WorldWideScience

Sample records for theory instrumentation model

  1. Theory, modeling and instrumentation for materials by design: Proceedings of workshop

    Energy Technology Data Exchange (ETDEWEB)

    Allen, R.E.; Cocke, D.L.; Eberhardt, J.J.; Wilson, A. (eds.)

    1984-01-01

    The following topics are contained in this volume: how can materials theory benefit from supercomputers and vice-versa; the materials of xerography; relationship between ab initio and semiempirical theories of electronic structure and renormalization group and the statistical mechanics of polymer systems; ab initio calculations of materials properties; metals in intimate contact; lateral interaction in adsorption: revelations from phase transitions; quantum model of thermal desorption and laser stimulated desorption; extended fine structure in appearance potential spectroscopy as a probe of solid surfaces; structural aspects of band offsets at heterojunction interfaces; multiconfigurational Green's function approach to quantum chemistry; wavefunctions and charge densities for defects in solids: a success for semiempirical theory; empirical methods for predicting the phase diagrams of intermetallic alloys; theoretical considerations regarding impurities in silicon and the chemisorption of simple molecules on Ni; improved Kohn-Sham exchange potential; structural stability calculations for films and crystals; semiempirical molecular orbital modeling of catalytic reactions including promoter effects; theoretical studies of chemical reactions: hydrolysis of formaldehyde; electronic structure calculations for low coverage adlayers; present status of the many-body problem; atomic scattering as a probe of physical adsorption; and, discussion of theoretical techniques in quantum chemistry and solid state physics.

  2. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  3. Theory of Compliance: Indicator Checklist Statistical Model and Instrument Based Program Monitoring Information System.

    Science.gov (United States)

    Fiene, Richard J.; Woods, Lawrence

    Two unanswered questions about child care are: (1) Does compliance with state child care regulations have a positive impact on children? and (2) Have predictors of program quality been identified? This paper explores a research study and related model that have had some success in answering these questions. Section I, a general introduction,…

  4. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  5. Netherlands Hydrological Modeling Instrument

    Science.gov (United States)

    Hoogewoud, J. C.; de Lange, W. J.; Veldhuizen, A.; Prinsen, G.

    2012-04-01

    The Netherlands Hydrological Modeling Instrument (NHI) is a decision support system for water basin management and the center point of a framework of models used to coherently model the hydrological system and the multitude of functions it supports. The Dutch hydrological institutes Deltares, Alterra, the Netherlands Environmental Assessment Agency, RWS Waterdienst, STOWA and Vewin are cooperating in enhancing the NHI for adequate decision support. The instrument is used by three different ministries involved in national water policy matters, for instance the WFD, drought management, manure policy and climate change issues. The basis of the modeling instrument is a state-of-the-art online coupling of the groundwater system (MODFLOW), the unsaturated zone (metaSWAP) and the surface water system (MOZART-DM). It brings together hydro(geo)logical processes from the column to the basin scale, ranging from 250×250 m plots to the river Rhine, and includes salt water flow. The NHI is validated with an eight-year run (1998-2006) covering dry and wet periods. For this run, different parts of the hydrology have been compared with measurements, for instance water demands in dry periods (e.g. for irrigation), discharges at outlets, groundwater levels and evaporation. A validation alone is not enough to get support from stakeholders; involvement of stakeholders in the modeling process is needed. Therefore, to gain sufficient support and trust in the instrument on different (policy) levels, a couple of actions have been taken: 1. a transparent evaluation of modeling results has been set up; 2. an extensive program is running to cooperate with regional water boards and suppliers of drinking water in improving the NHI; 3. (hydrological) data are shared via a newly set up Modeling Database for local and national models; 4. the NHI is enhanced with "local" information. The NHI is and has been used for many
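
    A minimal runnable sketch of the per-timestep (online) coupling idea behind such an instrument, using three toy linear reservoirs in place of MODFLOW, metaSWAP and MOZART-DM; all dynamics and rate constants are illustrative placeholders, not NHI code.

```python
# Toy operator-splitting coupling of three storages (unsaturated zone,
# groundwater, surface water), in the spirit of the per-timestep
# MODFLOW/metaSWAP/MOZART-DM coupling described above. The linear-reservoir
# dynamics and all rate constants are invented for illustration.
def run_coupled(n_steps, dt=1.0, rain=2.0):
    s_unsat, s_gw, s_sw = 50.0, 100.0, 10.0   # storages [mm]
    for _ in range(n_steps):
        recharge = 0.05 * s_unsat             # unsaturated zone -> groundwater
        runoff   = 0.02 * s_unsat             # unsaturated zone -> surface water
        drainage = 0.01 * s_gw                # groundwater -> surface water
        outflow  = 0.10 * s_sw                # surface water -> basin outlet
        s_unsat += dt * (rain - recharge - runoff)
        s_gw    += dt * (recharge - drainage)
        s_sw    += dt * (runoff + drainage - outflow)
    return s_unsat, s_gw, s_sw

print(run_coupled(365))  # storages after one year of constant forcing
```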

  6. Instrument Modeling and Synthesis

    Science.gov (United States)

    Horner, Andrew B.; Beauchamp, James W.

    During the 1970s and 1980s, before synthesizers based on direct sampling of musical sounds became popular, replicating musical instruments using frequency modulation (FM) or wavetable synthesis was one of the “holy grails” of music synthesis. Synthesizers such as the Yamaha DX7 allowed users great flexibility in mixing and matching sounds, but were notoriously difficult to coerce into producing sounds like those of a given instrument. Instrument design wizards practiced the mysteries of FM instrument design.

  7. Data, instruments, and theory a dialectical approach to understanding science

    CERN Document Server

    Ackermann, Robert John

    1985-01-01

    Robert John Ackermann deals decisively with the problem of relativism that has plagued post-empiricist philosophy of science. Recognizing that theory and data are mediated by data domains (bordered data sets produced by scientific instruments), he argues that the use of instruments breaks the dependency of observation on theory and thus creates a reasoned basis for scientific objectivity.

  8. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  9. Discrete-time modelling of musical instruments

    International Nuclear Information System (INIS)

    Vaelimaeki, Vesa; Pakarinen, Jyri; Erkut, Cumhur; Karjalainen, Matti

    2006-01-01

    This article describes physical modelling techniques that can be used for simulating musical instruments. The methods are closely related to digital signal processing. They discretize the system with respect to time, because the aim is to run the simulation using a computer. The physics-based modelling methods can be classified as mass-spring, modal, wave digital, finite difference, digital waveguide and source-filter models. We present the basic theory and a discussion on possible extensions for each modelling technique. For some methods, a simple model example is chosen from the existing literature demonstrating a typical use of the method. For instance, in the case of the digital waveguide modelling technique a vibrating string model is discussed, and in the case of the wave digital filter technique we present a classical piano hammer model. We tackle some nonlinear and time-varying models and include new results on the digital waveguide modelling of a nonlinear string. Current trends and future directions in physical modelling of musical instruments are discussed
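
    As a concrete taste of the digital waveguide family mentioned above, here is a minimal Karplus-Strong-style plucked string: a delay line whose length sets the pitch, closed by a damped averaging loop filter. Parameter values are illustrative.

```python
from collections import deque
import numpy as np

# Minimal plucked-string model in the digital waveguide family
# (Karplus-Strong): a delay line of length fs/f0 closed by a two-point
# averaging loop filter that acts as a frequency-dependent loss.
def pluck(f0=440.0, fs=44100, seconds=1.0):
    n = int(fs / f0)                            # delay-line length sets the pitch
    line = deque(np.random.uniform(-1, 1, n))   # initial state: white-noise burst
    out = np.empty(int(fs * seconds))
    for i in range(out.size):
        first = line.popleft()
        out[i] = first
        # averaging loop filter = mild lowpass, models frequency-dependent decay
        line.append(0.996 * 0.5 * (first + line[0]))
    return out

tone = pluck()   # ~440 Hz decaying string tone
```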

  10. Modeling students' instrumental (mis-) use of substances to enhance cognitive performance: Neuroenhancement in the light of job demands-resources theory.

    Science.gov (United States)

    Wolff, Wanja; Brand, Ralf; Baumgarten, Franz; Lösel, Johanna; Ziegler, Matthias

    2014-01-01

    Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called Neuroenhancement (NE), has not yet been integrated into a behavioral theory that is able to predict performance. Job Demands-Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students' self-reported NE can be integrated into JD-R Theory's comprehensive approach to psychological health and performance. 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance) were measured. Path models were used in order to test our data's fit to hypothesized main effects and interactions. JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory's health impairment process: Lifestyle drug NE (p performance. From a public health perspective, intervention strategies should address these costs of non-supervised NE. With regard to future research we propose to model NE as a means to reach an end (i.e. performance enhancement) rather than a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon.
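
    As a hedged illustration of the kind of moderated relationship JD-R Theory posits (resources buffering the demands-burnout link), here is a toy regression sketch; the variables, data and coefficients are synthetic, not the study's path model.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic illustration of the JD-R buffer hypothesis: burnout rises with
# demands, and resources weaken that effect via a negative demands x resources
# interaction. All data and effect sizes are made up for illustration.
rng = np.random.default_rng(0)
n = 1007
demands   = rng.normal(size=n)
resources = rng.normal(size=n)
burnout = (0.5 * demands - 0.3 * resources
           - 0.2 * demands * resources + rng.normal(scale=0.8, size=n))

X = sm.add_constant(np.column_stack([demands, resources, demands * resources]))
fit = sm.OLS(burnout, X).fit()
print(fit.params)   # intercept, demands, resources, interaction
```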

  11. Expectancy Theory Modeling

    Science.gov (United States)

    1982-08-01

    ... accomplish the task, (2) the instrumentality of task performance for job outcomes, and (3) the instrumentality of outcomes for need satisfaction. We ... in this discussion: effort, performance, outcomes, and needs. In order to present briefly the conventional approach to the Vroom models, another ... Presumably, this is the final event in the sequence of effort, performance, outcome, and need satisfaction. The actual research reported in expectancy

  12. Asteroid electrostatic instrumentation and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Aplin, K L; Bowles, N E; Urbak, E [Department of Physics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Keane, D; Sawyer, E C, E-mail: k.aplin1@physics.ox.ac.uk [RAL Space, R25, Harwell Oxford, Didcot OX11 0QX (United Kingdom)

    2011-06-23

    Asteroid surface material is expected to become photoelectrically charged, and is likely to be transported through electrostatic levitation. Understanding any movement of the surface material is relevant to proposed space missions to return samples to Earth for detailed isotopic analysis. Motivated by preparations for the Marco Polo sample return mission, we present electrostatic modelling for a real asteroid, Itokawa, for which detailed shape information is available, and verify that charging effects are likely to be significant at the terminator and at the edges of shadow regions for the Marco Polo baseline asteroid, 1999JU3. We also describe the Asteroid Charge Experiment electric field instrumentation intended for Marco Polo. Finally, we find that the differing asteroid and spacecraft potentials on landing could perturb sample collection for the short landing time of 20 min that is currently planned.
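
    A back-of-the-envelope sketch of the levitation threshold the abstract alludes to: equating the electrostatic force on a grain charged to potential phi with its weight gives the largest grain that can lift off. All numerical values below are illustrative assumptions, not results from the paper.

```python
import math

# Order-of-magnitude check of photoelectric dust levitation: a grain of
# radius r carrying the surface-potential charge q = 4*pi*eps0*r*phi
# levitates when q*E exceeds its weight. Solving
# 4*pi*eps0*r*phi*E = (4/3)*pi*r**3*rho*g for r gives the largest
# liftable grain. All numbers are illustrative.
eps0 = 8.854e-12   # vacuum permittivity [F/m]
phi  = 5.0         # grain/surface potential [V]
E    = 10.0        # near-surface field, e.g. near the terminator [V/m]
rho  = 3000.0      # grain density [kg/m^3]
g    = 1e-4        # Itokawa-like surface gravity [m/s^2]

r_max = math.sqrt(3 * eps0 * phi * E / (rho * g))
print(f"largest levitating grain radius ~ {r_max*1e6:.0f} micrometres")
```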

  13. Instrumental traditions and theories of light the uses of instruments in the optical revolution

    CERN Document Server

    Chen, Xiang

    2000-01-01

    An analysis of the optical revolution in the context of early 19th century Britain. Far from merely involving the replacement of one optical theory by another, the revolution also involved substantial changes in instruments and the practices that surrounded them. People's judgements about classification, explanation and evaluation were affected by the way they used such optical instruments as spectroscopes, telescopes, polarisers, photometers, gratings, prisms and apertures. There were two instrumental traditions in this historical period, each of which nurtured a body of practice that exemplified how optical instruments should be operated, and especially how the eye should be used. These traditions functioned just like paradigms, shaping perspectives and even world views. Readership: Scholars and graduate students in the history of science, the history of instruments, the philosophy of science and science studies. Can also be used as a textbook in graduate courses on 19th century physics.

  14. Towards a Transcultural Theory of Democracy for Instrumental Music Education

    Science.gov (United States)

    Tan, Leonard

    2014-01-01

    At present, instrumental music education, defined in this paper as the teaching and learning of music through wind bands and symphony orchestras of Western origin, appears embattled. Among the many criticisms made against instrumental music education, critics claim that bands and orchestras exemplify an authoritarian model of teaching that does…

  15. Modeling students’ instrumental (mis-) use of substances to enhance cognitive performance: Neuroenhancement in the light of job demands-resources theory

    Science.gov (United States)

    2014-01-01

    Background Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called Neuroenhancement (NE), has not yet been integrated into a behavioral theory that is able to predict performance. Job Demands-Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students' self-reported NE can be integrated into JD-R Theory's comprehensive approach to psychological health and performance. Methods 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance) were measured. Path models were used in order to test our data's fit to hypothesized main effects and interactions. Results JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory's health impairment process: Lifestyle drug NE (p model NE as a means to reach an end (i.e. performance enhancement) rather than a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon. PMID:24904687

  16. A hierarchical instrumental decision theory of nicotine dependence.

    Science.gov (United States)

    Hogarth, Lee; Troisi, Joseph R

    2015-01-01

    It is important to characterize the learning processes governing tobacco-seeking in order to understand how best to treat this behavior. Most drug learning theories have adopted a Pavlovian framework wherein the conditioned response is the main motivational process. We favor instead a hierarchical instrumental decision account, wherein expectations about the instrumental contingency between voluntary tobacco-seeking and the receipt of nicotine reward determine the probability of executing this behavior. To support this view, we review titration and nicotine discrimination research showing that internal signals of deprivation/satiation modulate expectations about the current incentive value of smoking, thereby modulating the propensity of this behavior. We also review research on cue-reactivity which has shown that external smoking cues modulate expectations about the probability of the tobacco-seeking response being effective, thereby modulating the propensity of this behavior. Economic decision theory is then considered to elucidate how expectations about the value and probability of the response-nicotine contingency are integrated to form an overall utility estimate for that option, for comparison with qualitatively different, nonsubstitute reinforcers, to determine response selection. As an applied test of this hierarchical instrumental decision framework, we consider how well it accounts for individual liability to smoking uptake and perseveration, pharmacotherapy, cue-extinction therapies, and plain packaging. We conclude that the hierarchical instrumental account is successful in reconciling this broad range of phenomena precisely because it accepts that multiple diverse sources of internal and external information must be integrated to shape the decision to smoke.

  17. Theory, Instrumentation and Applications of Magnetoelastic Resonance Sensors: A Review

    Science.gov (United States)

    Grimes, Craig A.; Roy, Somnath C.; Rani, Sanju; Cai, Qingyun

    2011-01-01

    Thick-film magnetoelastic sensors vibrate mechanically in response to a time-varying magnetic excitation field. The mechanical vibrations of the magnetostrictive magnetoelastic material launch, in turn, a magnetic field by which the sensor can be monitored. Magnetic field telemetry enables contact-less, remote-query operation that has enabled many practical uses of the sensor platform. This paper builds upon a review paper we published in Sensors in 2002 (Grimes, C.A.; et al. Sensors 2002, 2, 294–313), presenting a comprehensive review of the theory, operating principles, instrumentation and key applications of magnetoelastic sensing technology. PMID:22163768
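
    For orientation, the fundamental longitudinal resonance of a free-standing ribbon sensor follows f0 = (1/2L)·sqrt(E/(ρ(1−ν²))); the sketch below evaluates it for typical amorphous-ribbon values, which are assumptions for illustration only.

```python
import math

# Fundamental longitudinal resonance of a free-standing magnetoelastic
# ribbon of length L: f0 = (1 / 2L) * sqrt(E / (rho * (1 - nu**2))).
# Material numbers below are typical amorphous-ribbon values, used only
# as illustrative assumptions.
E   = 100e9     # Young's modulus [Pa]
rho = 7900.0    # density [kg/m^3]
nu  = 0.33      # Poisson's ratio
L   = 37e-3     # ribbon length [m]

f0 = math.sqrt(E / (rho * (1 - nu**2))) / (2 * L)
print(f"f0 ~ {f0/1e3:.1f} kHz")   # roughly 50 kHz for these values
```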

  18. Theory, Instrumentation and Applications of Magnetoelastic Resonance Sensors: A Review

    Directory of Open Access Journals (Sweden)

    Craig A. Grimes

    2011-03-01

    Thick-film magnetoelastic sensors vibrate mechanically in response to a time-varying magnetic excitation field. The mechanical vibrations of the magnetostrictive magnetoelastic material launch, in turn, a magnetic field by which the sensor can be monitored. Magnetic field telemetry enables contact-less, remote-query operation that has enabled many practical uses of the sensor platform. This paper builds upon a review paper we published in Sensors in 2002 (Grimes, C.A.; et al. Sensors 2002, 2, 294-313), presenting a comprehensive review of the theory, operating principles, instrumentation and key applications of magnetoelastic sensing technology.

  19. [Instrument to measure adherence in hypertensive patients: contribution of Item Response Theory].

    Science.gov (United States)

    Rodrigues, Malvina Thaís Pacheco; Moreira, Thereza Maria Magalhaes; Vasconcelos, Alexandre Meira de; Andrade, Dalton Francisco de; Silva, Daniele Braz da; Barbetta, Pedro Alberto

    2013-06-01

    To analyze, by means of Item Response Theory, an instrument to measure adherence to treatment for hypertension. Analytical study with 406 hypertensive patients with associated complications seen in primary care in Fortaleza, CE, Northeastern Brazil, in 2011, using Item Response Theory. The stages were: dimensionality test, calibration of the items, data processing and creation of a scale, analyzed using the graded response model. A study of the dimensionality of the instrument was conducted by analyzing the polychoric correlation matrix and full-information factor analysis. Multilog software was used to calibrate items and estimate the scores. Items relating to drug therapy are the most directly related to adherence, while those relating to drug-free therapy need to be reworked because they have less psychometric information and low discrimination. The independence of items, the small number of levels in the scale and the low explained variance in the adjustment of the models are the main weaknesses of the instrument analyzed. Item Response Theory proved to be a relevant analysis technique because it evaluated respondents for adherence to treatment for hypertension, the level of difficulty of the items and their ability to discriminate between individuals with different levels of adherence, which generates a greater amount of information. The instrument analyzed is limited in measuring adherence to hypertension treatment, according to the Item Response Theory analysis, and needs adjustment. The proper formulation of the items is important in order to accurately measure the desired latent trait.
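
    A small sketch of the graded response model used for this kind of calibration: boundary curves give P(X ≥ k), and category probabilities follow as differences of successive boundaries. The item parameters below are invented for illustration.

```python
import numpy as np

# Samejima's graded response model: boundary probabilities
# P(X >= k) = logistic(a * (theta - b_k)) for ordered thresholds b_1..b_m,
# and category probabilities as differences of successive boundaries.
def grm_probs(theta, a, b):
    """theta: latent trait; a: discrimination; b: ordered thresholds."""
    bound = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))  # P(X >= k), k=1..m
    upper = np.concatenate(([1.0], bound))                      # P(X >= 0) = 1
    lower = np.concatenate((bound, [0.0]))                      # P(X >= m+1) = 0
    return upper - lower                                        # P(X = k), k=0..m

print(grm_probs(theta=0.5, a=1.8, b=[-1.0, 0.2, 1.1]))  # sums to 1
```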

  20. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  1. Model SH intelligent instrument for thickness measuring

    International Nuclear Information System (INIS)

    Liu Juntao; Jia Weizhuang; Zhao Yunlong

    1995-01-01

    The authors introduce the Model SH Intelligent Instrument for thickness measuring, which uses the principle of beta back-scattering, and describe its application range, features, principle of operation, system design, calibration and specifications.

  2. Creating and purifying an observation instrument using the generalizability theory

    Directory of Open Access Journals (Sweden)

    Elena Rodríguez-Naveiras

    2013-12-01

    Quality control of data is one of the most relevant aspects of observational research. Generalizability Theory (GT) provides a method of analysis that allows us to isolate the various sources of measurement error. At the same time, it helps us to determine the extent to which various factors can change, and to analyze their effect on the generalizability coefficient. The work shown here comprises two studies aimed at creating and purifying an observation instrument, the Observation Protocol for Teaching Functions (Protocolo de Funciones Docentes, PROFUNDO, v1 and v2), for behavioral assessment of instructors in a social-affective out-of-school program. Reliability and homogeneity studies were carried out once the instrument had been created and purified. The reliability study was done with the GT method, taking both codes (c) and agents (a) as differentiation facets and generalizing across observers, using a crossed multi-facet design (A × O × C). In the homogeneity study, the generalization facet was codes, using the same design as the reliability study.
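
    As a simplified illustration of the GT machinery, the sketch below estimates variance components and a relative generalizability coefficient for a one-facet targets × observers design; the full PROFUNDO analysis crosses agents, observers and codes, and the data here are synthetic.

```python
import numpy as np

# One-facet generalizability sketch (targets x observers): estimate variance
# components from ANOVA expected mean squares, then compute the relative
# G coefficient for the observed number of observers.
def g_coefficient(scores):
    """scores: (n_targets, n_observers) array of ratings."""
    n_t, n_o = scores.shape
    grand = scores.mean()
    ms_t = n_o * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_t - 1)
    resid = (scores - scores.mean(1, keepdims=True)
             - scores.mean(0, keepdims=True) + grand)
    ms_to = (resid ** 2).sum() / ((n_t - 1) * (n_o - 1))
    var_t = max((ms_t - ms_to) / n_o, 0.0)   # universe-score variance
    return var_t / (var_t + ms_to / n_o)     # relative G for n_o observers

rng = np.random.default_rng(1)
ratings = rng.normal(size=(30, 1)) + rng.normal(scale=0.5, size=(30, 3))
print(round(g_coefficient(ratings), 2))
```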

  3. Derivative instruments a guide to theory and practice

    CERN Document Server

    Eales, Brian

    2003-01-01

    The authors concentrate on the practicalities of each class of derivative, so that readers can apply the techniques in practice. Product descriptions are supported by detailed spreadsheet models, illustrating the techniques employed, some of which are available on the accompanying companion website. This book is ideal reading for derivatives traders, salespersons, financial engineers, risk managers, and other professionals involved to any extent in the application and analysis of OTC derivatives. * Combines theory with valuation to provide overall coverage of the topic area * Pr

  4. Understanding practice change in community pharmacy: a qualitative research instrument based on organisational theory.

    Science.gov (United States)

    Roberts, Alison S; Hopp, Trine; Sørensen, Ellen Westh; Benrimoj, Shalom I; Chen, Timothy F; Herborg, Hanne; Williams, Kylie; Aslani, Parisa

    2003-10-01

    The past decade has seen a notable shift in the practice of pharmacy, with a strong focus on the provision of cognitive pharmaceutical services (CPS) by community pharmacists. The benefits of these services have been well documented, yet their uptake appears to be slow. Various strategies have been developed to overcome barriers to the implementation of CPS, with varying degrees of success, and little is known about the sustainability of the practice changes they produce. Furthermore, the strategies developed are often specific to individual programs or services, and their applicability to other CPS has not been explored. There seems to be a need for a flexible change management model for the implementation and dissemination of a range of CPS, but before it can be developed, a better understanding of the change process is required. This paper describes the development of a qualitative research instrument that may be utilised to investigate practice change in community pharmacy. Specific objectives included gaining knowledge about the circumstances surrounding attempts to implement CPS, and understanding relationships that are important to the change process. Organisational theory provided the conceptual framework for development of the qualitative research instrument, within which two theories were used to give insight into the change process: Borum's theory of organisational change, which categorizes change strategies as rational, natural, political or open; and Social Network Theory, which helps identify and explain the relationships between key people involved in the change process. A semi-structured interview guide was developed, drawing on factors affecting practice change found in the literature that warranted further investigation from the theoretical perspectives of organisational change and social networks. To address the research objectives, the instrument covered four broad themes: roles, experiences, strategies and networks. The qualitative research instrument developed in this study provides a

  5. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…

  6. Modelling the liquidity ratio as macroprudential instrument

    OpenAIRE

    Jan Willem van den End; Mark Kruidhof

    2012-01-01

    The Basel III Liquidity Coverage Ratio (LCR) is a microprudential instrument to strengthen the liquidity position of banks. However, if in extreme scenarios the LCR becomes a binding constraint, the interaction of bank behaviour with the regulatory rule can have negative externalities. We simulate the systemic implications of the LCR by a liquidity stress-testing model, which takes into account the impact of bank reactions on second round feedback effects. We show that a flexible approach of ...

  7. NASA Instrument Cost/Schedule Model

    Science.gov (United States)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
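
    A hedged sketch of the kind of parametric cost-estimating relationship such a model fits: a power law estimated by least squares in log space. The functional form, cost drivers and data below are assumptions for illustration, not NICM's actual equations.

```python
import numpy as np

# Illustrative cost-estimating relationship (CER): Cost = a * Mass^b * Power^c,
# fitted by ordinary least squares on the log-transformed model.
# The instrument database here is synthetic.
rng = np.random.default_rng(2)
mass  = rng.uniform(5, 200, 140)     # kg
power = rng.uniform(5, 150, 140)     # W
cost  = 3.0 * mass**0.7 * power**0.4 * rng.lognormal(sigma=0.3, size=140)  # $M

X = np.column_stack([np.ones_like(mass), np.log(mass), np.log(power)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"Cost ~ {a:.2f} * Mass^{b:.2f} * Power^{c:.2f}")
```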

  8. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd

    2001-01-01

    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  9. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  10. Warped models in string theory

    International Nuclear Information System (INIS)

    Acharya, B.S.; Benini, F.; Valandro, R.

    2006-12-01

    Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)

  11. Laser Speckle Contrast Imaging: theory, instrumentation and applications.

    Science.gov (United States)

    Senarathna, Janaka; Rege, Abhishek; Li, Nan; Thakor, Nitish V

    2013-01-01

    Laser Speckle Contrast Imaging (LSCI) is a wide-field, non-scanning optical technique for observing blood flow. Speckles are produced when coherent light scattered back from biological tissue is diffracted through the limiting aperture of focusing optics. Mobile scatterers cause the speckle pattern to blur; a model can be constructed by inversely relating the degree of blur, termed speckle contrast, to the scatterer speed. In tissue, red blood cells are the main source of moving scatterers. Therefore, blood flow acts as a virtual contrast agent, outlining blood vessels. The spatial resolution (~10 μm) and temporal resolution (10 ms to 10 s) of LSCI can be tailored to the application. Restricted by the penetration depth of light, LSCI can only visualize superficial blood flow. Additionally, due to its non-scanning nature, LSCI is unable to provide depth-resolved images. The simple setup and non-dependence on exogenous contrast agents have made LSCI a popular tool for studying vascular structure and blood flow dynamics. We discuss the theory and practice of LSCI and critically analyze its merit in major areas of application such as retinal imaging, imaging of skin perfusion, as well as imaging of neurophysiology.
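
    The quantity at the heart of LSCI is straightforward to compute: local speckle contrast K = σ/⟨I⟩ over a small sliding window. A minimal sketch, assuming a 7×7 window and a synthetic stand-in frame:

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Spatial speckle contrast K = sigma / mean of the intensity within a small
# sliding window. Lower K means more blurring by moving scatterers,
# i.e. higher flow.
def speckle_contrast(img, win=7):
    img = img.astype(float)
    mean  = uniform_filter(img, size=win)
    mean2 = uniform_filter(img**2, size=win)
    var = np.clip(mean2 - mean**2, 0, None)   # guard against rounding below zero
    return np.sqrt(var) / (mean + 1e-12)

raw = np.random.gamma(shape=2.0, scale=1.0, size=(256, 256))  # stand-in frame
K = speckle_contrast(raw)
```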

  12. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standard Mapping Method (Standaard Karteringsmethode)

  13. Instrumentation

    International Nuclear Information System (INIS)

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment, especially in the field of nuclear safety research, where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of e.g. pressures, temperatures, pressure differences and single-phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serves in many cases as a database for thermal-hydraulic system codes. However, some special instrumentation, such as online concentration measurement for boric acid in the water phase or for non-condensables in the steam atmosphere, as well as flow visualization techniques, was further developed and successfully applied in recent years. Concerning the modeling needs for advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in local instrumentation technology for two-phase flow by the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermohydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated, as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  14. [Health promotion. Instrument development for the application of the theory of planned behavior].

    Science.gov (United States)

    Lee, Y O

    1993-01-01

    The purpose of this article is to describe the operationalization of the Theory of Planned Behavior (TPB). The quest to understand determinants of health behaviors has intensified as evidence accumulates concerning the impact of personal behavior on health. The majority of theory-based research has used the Health Belief Model (HBM). The HBM components have had limited success in explaining health-related behaviors. There are several advantages of the TPB over the HBM. TPB is an expansion of the Theory of Reasoned Action (TRA) with the addition of the construct of perceived behavioral control. The revised model has been shown to yield greater explanatory power than the original TRA for goal-directed behaviors. The process of TPB instrument development is described, using examples from a study of smoking cessation behavior in military smokers, followed by a discussion of reliability and validity issues in operationalizing the TPB. The TPB is a useful model for understanding and predicting health-related behaviors when carefully operationalized. The model holds promise in the development of prescriptive nursing approaches.

  15. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  16. Psyche Mission: Scientific Models and Instrument Selection

    Science.gov (United States)

    Polanskey, C. A.; Elkins-Tanton, L. T.; Bell, J. F., III; Lawrence, D. J.; Marchi, S.; Park, R. S.; Russell, C. T.; Weiss, B. P.

    2017-12-01

    NASA has chosen to explore (16) Psyche with its 14th Discovery-class mission. Psyche is a 226-km diameter metallic asteroid hypothesized to be the exposed core of a planetesimal that was stripped of its rocky mantle by multiple hit-and-run collisions in the early solar system. The spacecraft launch is planned for 2022, with arrival at the asteroid in 2026 for 21 months of operations. The Psyche investigation has five primary scientific objectives: A. Determine whether Psyche is a core, or if it is unmelted material. B. Determine the relative ages of regions of Psyche's surface. C. Determine whether small metal bodies incorporate the same light elements as are expected in the Earth's high-pressure core. D. Determine whether Psyche was formed under conditions more oxidizing or more reducing than Earth's core. E. Characterize Psyche's topography. The mission's task was to select the appropriate instruments to meet these objectives. However, exploring a metal world, rather than one made of ice, rock, or gas, requires development of new scientific models for Psyche to support the selection of the appropriate instruments for the payload. If Psyche is indeed a planetary core, we expect that it should have a detectable magnetic field. However, the strength of the magnetic field can vary by orders of magnitude depending on the formational history of Psyche. The implications of both the extreme low-end and the high-end predictions impact the magnetometer and mission design. For the imaging experiment, what can the team expect for the morphology of a heavily impacted metal body? Efforts are underway to further investigate the differences in crater morphology between high-velocity impacts into metal and rock, to be prepared to interpret the images of Psyche when they are returned. Finally, elemental composition measurements at Psyche using nuclear spectroscopy encompass a new and unexplored phase space of gamma-ray and neutron measurements. We will present some end

  17. Introduction to focused ion beams instrumentation, theory, techniques and practice

    CERN Document Server

    Giannuzzi, Lucille A

    2005-01-01

    The focused ion beam (FIB) instrument has experienced an intensive period of maturation since its inception. Numerous new techniques and applications have been brought to fruition, and over the past few years, the FIB has gained acceptance as more than just an expensive sample preparation tool. It has taken its place among the suite of other instruments commonly available in analytical and forensic laboratories, universities, geological, medical and biological research institutions, and manufacturing plants. Although the utility of the FIB is not limited to the preparation of specimens for subsequent analysis by other analytical techniques, it has revolutionized the area of TEM specimen preparation. The FIB has also been used to prepare samples for numerous other analytical techniques, and offers a wide range of other capabilities. While the mainstream of FIB usage remains within the semiconductor industry, FIB usage has expanded to applications in metallurgy, ceramics, composites, polymers, geology, art, bio...

  18. OPTIMIZATION OF TAX REGIME USING THE INSTRUMENT OF GAME THEORY

    Directory of Open Access Journals (Sweden)

    Igor Yu. Pelevin

    2014-01-01

    The article is devoted to one possible mechanism for the optimization of taxation of agricultural enterprises using game theory. Use of this mechanism allows the application of the most optimal type of taxation, one that would benefit both the taxpayer and the government. The article offers a definition of the tax storage and its possible applications.

  19. The Standard, Power, and Color Model of Instrument Combination in Romantic-Era Symphonic Works

    Directory of Open Access Journals (Sweden)

    Randolph Johnson

    2011-08-01

    The Standard, Power, and Color (SPC) model describes the nexus between musical instrument combination patterns and expressive goals in music. Instruments within each SPC group tend to attract each other and work as a functional unit to create orchestral gestures. Standard instruments establish a timbral groundwork; Power instruments create contrast through loud dynamic climaxes; and Color instruments catch listeners' attention by means of their sparing use. Examples within these three groups include violin (Standard), piccolo (Power), and harp (Color). The SPC theory emerges from analyses of nineteenth-century symphonic works. Multidimensional scaling analysis of instrument combination frequencies maps instrument relationships; hierarchical clustering analysis indicates three SPC groups within the map. The SPC characterization is found to be moderately robust through the results of hypothesis testing: (1) Color instruments are included less often in symphonic works; (2) when Color instruments are included, they perform less often than the average instrument; and (3) Color and non-Color instruments have equal numbers of solo occurrences. Additionally, (4) Power instruments are positively associated with louder dynamic levels; and (5) when Power instruments are present in the musical texture, the pitch range spanned by the entire orchestra does not become more extreme.
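
    A minimal sketch of the analysis pipeline described above: multidimensional scaling of instrument-combination dissimilarities, followed by hierarchical clustering cut into three groups. The co-occurrence counts are invented, not the paper's data.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

# Turn instrument co-occurrence counts into dissimilarities, map instruments
# into 2-D with multidimensional scaling, then cut a hierarchical clustering
# into three groups (cf. the Standard/Power/Color grouping).
instruments = ["violin", "flute", "piccolo", "trombone", "harp"]
co = np.array([[50, 40, 10, 20,  5],
               [40, 50, 15, 18,  6],
               [10, 15, 50, 12,  3],
               [20, 18, 12, 50,  4],
               [ 5,  6,  3,  4, 50]], dtype=float)
dissim = 1.0 - co / co.max()
np.fill_diagonal(dissim, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
groups = fcluster(linkage(coords, method="average"), t=3, criterion="maxclust")
print(dict(zip(instruments, groups)))
```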

  20. Minisuperspace models in histories theory

    International Nuclear Information System (INIS)

    Anastopoulos, Charis; Savvidou, Ntina

    2005-01-01

    We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context

  1. Evaluation of multivariate calibration models transferred between spectroscopic instruments

    DEFF Research Database (Denmark)

    Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas

    2016-01-01

    In a setting where multiple spectroscopic instruments are used for the same measurements it may be convenient to develop the calibration model on a single instrument and then transfer this model to the other instruments. In the ideal scenario, all instruments provide the same predictions for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two
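
    The paper's evaluation idea can be illustrated in a few lines: compare transferred-model predictions between instruments directly, pair by pair, instead of against reference values. The data below are synthetic.

```python
import numpy as np

# Compare predictions across instruments for the same samples: per instrument
# pair, summarise the paired prediction differences (bias and spread) instead
# of comparing against noisy reference values.
rng = np.random.default_rng(3)
true = rng.normal(12, 2, 75)                        # 75 samples
preds = true + rng.normal(0, 0.15, size=(10, 75))   # 10 instruments' predictions
preds[3] += 0.4                                     # one instrument with a bias

ref_instr = 0
for i in range(1, preds.shape[0]):
    d = preds[i] - preds[ref_instr]
    print(f"instrument {i}: bias {d.mean():+.2f}, "
          f"SD of differences {d.std(ddof=1):.2f}")
```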

  2. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Vol. 40, No. 6 (2011), pp. 623-678. ISSN 0308-1079. R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010. Institutional research plan: CEZ:AV0Z10750506. Keywords: multidimensional probability distribution; conditional independence; graphical Markov model; composition of distributions. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.667, year: 2011. http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  3. Superfield theory and supermatrix model

    International Nuclear Information System (INIS)

    Park, Jeong-Hyuck

    2003-01-01

    We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained; namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, to be consistent with T-duality. (author)

  4. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) the new cluster analysis ability, which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  5. Practical aspects of trapped ion mass spectrometry, 4 theory and instrumentation

    CERN Document Server

    March, Raymond E

    2010-01-01

    The expansion of the use of ion trapping in different areas of mass spectrometry and different areas of application indicates the value of a single source of information drawing together diverse inputs. This book provides an account of the theory and instrumentation of mass spectrometric applications and an introduction to ion trapping devices.

  6. Confucian "Creatio in Situ"--Philosophical Resource for a Theory of Creativity in Instrumental Music Education

    Science.gov (United States)

    Tan, Leonard

    2016-01-01

    In this philosophical essay, I propose a theory of creativity for instrumental music education inspired by Confucian "creatio in situ" ("situational creativity"). Through an analysis of three major texts from classical Confucianism--the "Analects," the "Zhongyong" ("Doctrine of the Mean"), and the…

  7. High School Instrumental Music Students' Attitudes and Beliefs regarding Practice: An Application of Attribution Theory

    Science.gov (United States)

    Schatt, Matthew D.

    2011-01-01

    The purpose of this study was to explore high school band students' perspectives of instrumental music practice from within the attribution theory paradigm and to attempt to elucidate the secondary student's attitudes toward practice. High school band students from three Midwestern school districts (N = 218) completed a survey that was used to…

  8. Instrumentation

    International Nuclear Information System (INIS)

    Prieur, G.; Nadi, M.; Hedjiedj, A.; Weber, S.

    1995-01-01

    This second chapter on instrumentation gives a little general consideration to the history and classification of instrumentation, followed by two specific states of the art. The first concerns NMR (block diagram of the instrumentation chain, with details on the magnets, gradients, probes and reception unit). The second concerns precision instrumentation (optical fibre gyrometer and scanning electron microscope) and its data processing tools (programmability, the VXI standard and its history). The chapter ends with future trends in smart sensors and Field Emission Displays. (D.L.). Refs., figs.

  9. Realism, Instrumentalism, and Scientific Symbiosis: Psychological Theory as a search for truth and the discovery of solutions

    NARCIS (Netherlands)

    Cacioppo, J.T.; Semin, G.R.; Berntson, G.G.

    2004-01-01

    Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions

  10. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2000-01-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised

  11. The social conditions of instrumental action: Problems in the sociological understanding of rational choice theory

    Directory of Open Access Journals (Sweden)

    Bruno Sciberras de Carvalho

    2008-01-01

    This article critically analyzes new sociological approaches to rational choice theory which, beyond examining political or economic practices, link the notion of instrumental rationality to social issues and themes. The article begins by highlighting the issue of trust, indicating the functionality of certain social arrangements in collective problem-solving. The paper goes on to demonstrate that problems emerge with the theory when it attempts to explain the feasibility of social norms in impersonal, comprehensive contexts. Thus, the fundamental point that appears to be missing from rational choice theory is the perception that individual decisions and instrumental conduct itself incorporate dispositions that in a sense are beyond the actors' control.

  12. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinite many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is highly enlarged and contains new results and additional sections in the different chapters as well as one new chapter.

  13. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2001-04-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

  14. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2001-01-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor

  15. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Dudas, E [Orsay, LPT (France)

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  16. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Saleur, H.

    1988-01-01

    Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The determination of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.

  17. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
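
    For reference, the Sheth-Tormen mass function used in this halo-model description has a closed-form multiplicity function; the sketch below evaluates it with the standard parameter values, in one common normalisation convention.

```python
import numpy as np

# Sheth-Tormen multiplicity function f(nu) with the standard parameters
# A = 0.3222, p = 0.3, q = 0.707, where nu = delta_c / sigma(M) is the
# peak height. Used as an ingredient of the halo mass function.
def sheth_tormen(nu, A=0.3222, p=0.3, q=0.707):
    qnu2 = q * nu**2
    return A * np.sqrt(2 * qnu2 / np.pi) * (1 + qnu2**-p) * np.exp(-qnu2 / 2)

nu = np.linspace(0.3, 4.0, 5)
print(sheth_tormen(nu))
```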

  18. Halo modelling in chameleon theories

    International Nuclear Information System (INIS)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations

  19. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
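    As a minimal illustration of the report's second objective, the following sketch generates independent sample paths of a zero-mean stationary Gaussian process by the spectral representation method, one common algorithm of the type described above (normalisation conventions vary; the spectrum chosen here is purely hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_gaussian_process(n_times, dt, s_of_w, n_freqs=256, w_max=20.0):
        """Spectral-representation sampling of a zero-mean stationary
        Gaussian process with power spectral density s_of_w."""
        t = np.arange(n_times) * dt
        dw = w_max / n_freqs
        w = (np.arange(n_freqs) + 0.5) * dw             # midpoint frequencies
        amp = np.sqrt(2.0 * s_of_w(w) * dw)             # component amplitudes
        phase = rng.uniform(0.0, 2.0 * np.pi, n_freqs)  # independent random phases
        # Superpose cosines with random phases; the sum tends to a Gaussian
        # process as n_freqs grows (central limit theorem).
        return t, np.sqrt(2.0) * (amp * np.cos(np.outer(t, w) + phase)).sum(axis=1)

    # Hypothetical low-pass spectrum, just for demonstration.
    t, x = sample_gaussian_process(500, 0.02, lambda w: 1.0 / (1.0 + w**2))
    print(x.mean(), x.std())
    ```

    Each call with a fresh random phase vector yields an independent sample path, which can then be fed as input or boundary data to a deterministic simulation code, as the report describes.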

  20. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2002-01-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  1. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2002-04-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  2. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2000-07-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potential of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

  3. Pinna Model for Hearing Instrument Applications

    DEFF Research Database (Denmark)

    Kammersgaard, Nikolaj Peter Iversen; Kvist, Søren Helstrup; Thaysen, Jesper

    2014-01-01

    A novel model of the pinna (outer ear) is presented. This is to increase the understanding of the effect of the pinna on the on-body radiation pattern of an antenna placed inside the ear. Simulations of the model and of a realistically shaped ear are compared to validate the model. The radiation...

  4. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
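    For reference, the Yang-Baxter equation mentioned above reads, in its standard R-matrix form with spectral parameters (generic notation, not tied to this paper's conventions):

    $$ R_{12}(u)\, R_{13}(u+v)\, R_{23}(v) = R_{23}(v)\, R_{13}(u+v)\, R_{12}(u). $$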

  5. Validation of an Instrument for Assessing Conceptual Change with Respect to the Theory of Evolution by Secondary Biology Students

    Science.gov (United States)

    Goff, Kevin David

    This pilot study evaluated the validity of a new quantitative, closed-response instrument for assessing student conceptual change regarding the theory of evolution. The instrument has two distinguishing design features. First, it is designed not only to gauge student mastery of the scientific model of evolution, but also to elicit a trio of deeply intuitive tendencies that are known to compromise many students' understanding: the projection of intentional agency, teleological directionality, and immutable essences onto biological phenomena. Second, in addition to a section of conventional multiple choice questions, the instrument contains a series of items where students may simultaneously endorse both scientifically normative propositions and intuitively appealing yet unscientific propositions, without having to choose between them. These features allow for the hypothesized possibility that the three intuitions are partly innate, themselves products of cognitive evolution in our hominin ancestors, and thus may continue to inform students' thinking even after instruction and conceptual change. The test was piloted with 340 high school students from diverse schools and communities. Confirmatory factor analysis and other statistical methods provided evidence that the instrument already has strong potential for validly distinguishing students who hold a correct scientific understanding from those who do not, but that revision and retesting are needed to render it valid for gauging students' adherence to intuitive misconceptions. Ultimately the instrument holds promise as a tool for classroom intervention studies by conceptual change researchers, for diagnostic testing and data gathering by instructional leaders, and for provoking classroom dialogue and debate by science teachers.

  6. Instruments

    International Nuclear Information System (INIS)

    Buehrer, W.

    1996-01-01

    The present paper provides a basic knowledge of the most commonly used experimental techniques. We discuss the principles and concepts necessary to understand what one is doing when performing an experiment on a given instrument. (author) 29 figs., 1 tab., refs

  7. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  8. Developing and testing a positive theory of instrument choice: Renewable energy policy in the fifty American states

    Science.gov (United States)

    Ciocirlan, Cristina E.

    The environmental economics literature consistently suggests that properly designed and implemented economic incentives are superior to command-and-control regulation in reducing pollution. Economic incentives, such as green taxes, cap-and-trade programs, tax incentives, are able to reduce pollution in a cost-effective manner, provide flexibility to industry and stimulate innovation in cleaner technologies. In the past few decades, both federal and state governments have shown increased use of economic incentives in environmental policy. Some states have embraced them in an active manner, while others have failed to do so. This research uses a three-step analysis. First, it asks why some states employ more economic incentives than others to stimulate consumption of renewable energy by the residential, commercial and industrial sectors. Second, it asks why some states employ stronger incentives than others. And third, it asks why certain states employ certain instruments, such as electricity surcharges, cap-and-trade programs, tax incentives or grants, while others do not. The first two analyses were conducted using factor analysis and multiple regression analysis, while the third analysis employed logistic regression models to analyze the data. Data for all three analyses were obtained from a combination of primary and secondary sources. To address these questions, a theory of instrument choice at the state level, which includes both internal and external determinants of policy-making, was developed and tested. The state level of analysis was chosen. States have proven to be pioneers in designing policies to address greenhouse gases (see, for instance, the recent cap-and-trade legislation passed in California). The theory was operationalized with the help of four models: needs/responsiveness, interest group influence, professionalism/capacity and innovation-and-diffusion. The needs/responsiveness model suggests that states tend to choose more and stronger economic

  9. Galaxy Alignments: Theory, Modelling & Simulations

    Science.gov (United States)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  10. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  11. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults.

    Science.gov (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily

    2018-02-23

    The validity of the CAGE using item response theory (IRT) has not yet been examined in the older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA) and to fit dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed an overall scalability H index of 0.459, indicating a medium-performing instrument. All items were found to be homogeneous, measuring the same construct, and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73, while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% (CC and CA ranged from 92.5% to 94.3%) of the respondents were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. This study provides valuable information on each item in the assessment of the overall severity of alcohol problems and on the precision of the cut-off scores in the older adult population.
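    For context, the 2-parameter logistic model referred to above takes the standard form (generic IRT notation, not specific to this study's parameterisation):

    $$ P(X_{ij}=1 \mid \theta_i) = \frac{1}{1 + \exp\!\left[-a_j(\theta_i - b_j)\right]}, $$

    where θ_i is respondent i's latent trait level, a_j is the discrimination of item j (here ranging from 1.07 to 6.73) and b_j its difficulty (here 0.33 to 2.80).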

  12. Instrumentation

    International Nuclear Information System (INIS)

    Muehllehner, G.; Colsher, J.G.

    1982-01-01

    This chapter reviews the parameters which are important to positron-imaging instruments. It summarizes the options which various groups have explored in designing tomographs and the methods which have been developed to overcome some of the limitations inherent in the technique as well as in present instruments. The chapter is not presented as a defense of positron imaging versus single-photon or other imaging modalities, nor does it contain a description of various existing instruments; rather, it stresses their common properties and problems. Design parameters which are considered are resolution, sampling requirements, sensitivity, methods of eliminating scattered radiation, random coincidences and attenuation. The implementation of these parameters is considered, with special reference to sampling, choice of detector material, detector ring diameter and shielding, and variations in point spread function. Quantitation problems discussed are normalization and attenuation and random corrections. Present developments mentioned are noise reduction through time-of-flight-assisted tomography and signal-to-noise improvements through high intrinsic resolution. Extensive bibliography. (U.K.)
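    As a concrete instance of one of these design parameters, the rate of random coincidences between a pair of detectors is commonly estimated from the singles rates and the coincidence timing window (a textbook relation, not quoted from this chapter):

    $$ R_{\text{random}} = 2\tau\, N_1 N_2, $$

    where τ is the coincidence resolving time and N_1, N_2 are the singles count rates of the two detectors.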

  13. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  14. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available, a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
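    The structure of the f-theory can be sketched as follows (a generic form of the friction term; the one-parameter general models referred to above further constrain the κ coefficients through a characteristic critical viscosity, details not reproduced here):

    $$ \eta = \eta_0 + \kappa_a\, p_a + \kappa_r\, p_r + \kappa_{rr}\, p_r^{2}, $$

    where η₀ is the dilute-gas viscosity and p_a, p_r are the attractive and repulsive pressure contributions delivered by the cubic equation of state.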

  15. Instrumentation of a prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Hessheimer, M.F.; Rightley, M.J.; Matsumoto, T.

    1995-01-01

    A series of static overpressurization tests of scale models of nuclear containment structures is being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the U.S. Nuclear Regulatory Commission. At present, two tests are being planned: a test of a model of a steel containment vessel (SCV) that is representative of an improved, boiling water reactor (BWR) Mark II design; and a test of a model of a prestressed concrete containment vessel (PCCV). This paper discusses plans and the results of a preliminary investigation of the instrumentation of the PCCV model. The instrumentation suite for this model will consist of approximately 2000 channels of data to record displacements, strains in the reinforcing steel, prestressing tendons, concrete, steel liner and liner anchors, as well as pressure and temperature. The instrumentation is being designed to monitor the response of the model during prestressing operations, during Structural Integrity and Integrated Leak Rate testing, and during test to failure of the model. Particular emphasis has been placed on instrumentation of the prestressing system in order to understand the behavior of the prestressing strands at design and beyond-design pressure levels. Current plans are to place load cells at both ends of one third of the tendons in addition to placing strain measurement devices along the length of selected tendons. Strain measurements will be made using conventional bonded foil resistance gages and a wire resistance gage, known as a "Tensmeg" gage, specifically designed for use with seven-wire strand. The results of preliminary tests of both types of gages, in the laboratory and in a simulated model configuration, are reported and plans for instrumentation of the model are discussed.

  16. Developing a theory-based instrument to assess the impact of continuing professional development activities on clinical practice: a study protocol

    Directory of Open Access Journals (Sweden)

    Rousseau Michel

    2011-03-01

    Full Text Available Abstract Background Continuing professional development (CPD) is one of the principal means by which health professionals (i.e. primary care physicians and specialists) maintain, improve, and broaden the knowledge and skills required for optimal patient care and safety. However, the lack of a widely accepted instrument to assess the impact of CPD activities on clinical practice thwarts researchers' comparisons of the effectiveness of CPD activities. Using an integrated model for the study of healthcare professionals' behaviour, our objective is to develop a theory-based, valid, reliable global instrument to assess the impact of accredited CPD activities on clinical practice. Methods Phase 1: We will analyze the instruments identified in a systematic review of factors influencing health professionals' behaviours using criteria that reflect the literature on measurement development and CPD decision-makers' priorities. The outcome of this phase will be an inventory of instruments based on social cognitive theories. Phase 2: Working from this inventory, the most relevant instruments and their related items for assessing the concepts listed in the integrated model will be selected. Through an e-Delphi process, we will verify whether these instruments are acceptable, what aspects need revision, and whether important items are missing and should be added. The outcome of this phase will be a new global instrument integrating the most relevant tools to fit our integrated model of healthcare professionals' behaviour. Phase 3: Two data collections are planned: (1) a test-retest of the new instrument, including item analysis, to assess its reliability; and (2) a study using the instrument before and after CPD activities with a randomly selected control group to explore the instrument's mere-measurement effect. Phase 4: We will conduct individual interviews and focus groups with key stakeholders to identify anticipated barriers and enablers for implementing the

  17. Crisis in Context Theory: An Ecological Model

    Science.gov (United States)

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  18. Study of parental models: building an instrument for their exploration

    Directory of Open Access Journals (Sweden)

    José Francisco Martínez Licona

    2014-08-01

    Full Text Available Objective: This research presents the construction of an attributional questionnaire concerning the different parental models and factors that are involved in family interactions. Method: A mixed methodology was used as a foundation to develop items and the respective pilot tests, which allowed checking the validity and internal consistency of the instrument using expert judgment. Results: An instrument of 36 statements was organized into 12 categories to explore the parental models according to the following factors: parental models, child-rearing patterns, attachment bonds, and the guidelines for success promoted inside family contexts. Analyzing these factors contributes to understanding children's development within the family environment and to identifying opportunities for socio-educational intervention. Conclusion: It is assumed that the family context is as decisive as the school context; therefore, exploring the nature of parental models is required to understand the features and influences that contribute to the development of young people in any social context.

  19. Instrumentation and testing of a prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Hessheimer, M.F.; Pace, D.W.; Klamerus, E.W.

    1997-01-01

    Static overpressurization tests of two scale models of nuclear containment structures - a steel containment vessel (SCV) representative of an improved, boiling water reactor (BWR) Mark II design and a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR) - are being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the U.S. Nuclear Regulatory Commission. This paper discusses plans for instrumentation and testing of the PCCV model. 6 refs., 2 figs., 2 tabs

  20. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  1. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.
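    For orientation, the Thermodynamic Bethe Ansatz equations invoked here have the generic form for a diagonal scattering theory (standard notation; the paper's specific kernels are not reproduced):

    $$ \varepsilon_a(\theta) = m_a R \cosh\theta - \sum_b \int \frac{d\theta'}{2\pi}\, \phi_{ab}(\theta - \theta')\, \ln\!\left(1 + e^{-\varepsilon_b(\theta')}\right), $$

    with kernels φ_ab(θ) = −i d ln S_ab(θ)/dθ built from the two-particle S-matrix.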

  2. Reconstructing bidimensional scalar field theory models

    International Nuclear Information System (INIS)

    Flores, Gabriel H.; Svaiter, N.F.

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtained two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ²ln²(θ²) model and the U(θ) = θ²cos²(ln(θ²)) model, respectively. (author)

  3. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters provide a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  4. Gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Witten, E.

    1989-01-01

    Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)

  5. Cluster model in reaction theory

    International Nuclear Information System (INIS)

    Adhikari, S.K.

    1979-01-01

    A recent work by Rosenberg on cluster states in reaction theory is reexamined and generalized to include energies above the threshold for breakup into four composite fragments. The problem of elastic scattering between two interacting composite fragments is reduced to an equivalent two-particle problem with an effective potential to be determined by extremum principles. For energies above the threshold for breakup into three or four composite fragments effective few-particle potentials are introduced and the problem is reduced to effective three- and four-particle problems. The equivalent three-particle equation contains effective two- and three-particle potentials. The effective potential in the equivalent four-particle equation has two-, three-, and four-body connected parts and a piece which has two independent two-body connected parts. In the equivalent three-particle problem we show how to include the effect of a weak three-body potential perturbatively. In the equivalent four-body problem an approximate simple calculational scheme is given when one neglects the four-particle potential the effect of which is presumably very small

  6. Cassini Radar EQM Model: Instrument Description and Performance Status

    Science.gov (United States)

    Borgarelli, L.; Faustini, E. Zampolini; Im, E.; Johnson, W. T. K.

    1996-01-01

    The spacecraft of the Cassini Mission is planned to be launched towards Saturn in October 1997. The mission is designed to study the physical structure and chemical composition of Titan. The results of the tests performed on the Cassini radar engineering qualification model (EQM) are summarized. The approach followed in the verification and evaluation of the performance of the radio frequency subsystem EQM is presented. The results show that the instrument satisfies the relevant mission requirements.

  7. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

    Full Text Available Our paper centres on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the basics of institutional economics. We propose scientific ideas concerning new, innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalizing our results and maximising the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  8. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by...
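    As background, the classical Warner design (the archetypal RR scheme, not necessarily the specific scheme used in this paper) asks the sensitive question with probability p and its negation with probability 1−p, so the observed probability of a "yes" relates to the true prevalence π by

    $$ P(\text{yes}) = p\,\pi + (1-p)(1-\pi), $$

    which lets π be estimated even though the interviewer never knows which question any individual answered.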

  9. Axial vibrations of brass wind instrument bells and their acoustical influence: Theory and simulations.

    Science.gov (United States)

    Kausel, Wilfried; Chatziioannou, Vasileios; Moore, Thomas R; Gorman, Britta R; Rokni, Michelle

    2015-06-01

    Previous work has demonstrated that structural vibrations of brass wind instruments can audibly affect the radiated sound. Furthermore, these broadband effects are not explainable by assuming perfect coincidence of the frequency of elliptical structural modes with air column resonances. In this work a mechanism is proposed that has the potential to explain the broadband influences of structural vibrations on acoustical characteristics such as input impedance, transfer function, and radiated sound. The proposed mechanism involves the coupling of axial bell vibrations to the internal air column. The acoustical effects of such axial bell vibrations have been studied by extending an existing transmission line model to include the effects of a parasitic flow into vibrating walls, as well as distributed sound pressure sources due to periodic volume fluctuations in a duct with oscillating boundaries. The magnitude of these influences in typical trumpet bells, as well as in a complete instrument with an unbraced loop, has been studied theoretically. The model results in predictions of input impedance and acoustical transfer function differences that are approximately 1 dB for straight instruments and significantly higher when coiled tubes are involved or when very thin brass is used.
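    For context, the transmission-line description that the authors extend represents each rigid-walled cylindrical segment of the air column by the standard lossless two-port relation (textbook form; the paper's additions of parasitic wall flow and distributed pressure sources are not shown):

    $$ \begin{pmatrix} p_1 \\ U_1 \end{pmatrix} = \begin{pmatrix} \cos kL & j Z_c \sin kL \\ \dfrac{j}{Z_c} \sin kL & \cos kL \end{pmatrix} \begin{pmatrix} p_2 \\ U_2 \end{pmatrix}, \qquad Z_c = \frac{\rho c}{S}, $$

    where p and U are acoustic pressure and volume flow, k the wavenumber, L the segment length and S its cross-sectional area.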

  10. An instrumental electrode model for solving EIT forward problems.

    Science.gov (United States)

    Zhang, Weida; Li, David

    2014-10-01

    An instrumental electrode model (IEM) capable of describing the performance of electrical impedance tomography (EIT) systems in the MHz frequency range has been proposed. Compared with the commonly used Complete Electrode Model (CEM), which assumes ideal front-end interfaces, the proposed model considers the effects of non-ideal components in the front-end circuits. This introduces an extra boundary condition in the forward model and offers more accurate modelling of EIT systems. We have demonstrated its performance using simple geometric structures and compared the results with the CEM and full Maxwell methods. The IEM can provide a significantly more accurate approximation than the CEM in the MHz frequency range, where the full Maxwell methods are favoured over the quasi-static approximation. The improved electrode model will facilitate the future characterization and front-end design of real-world EIT systems.
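    For reference, the Complete Electrode Model that the IEM builds on imposes the following standard conditions on the potential u (generic form; the IEM's extra boundary condition describing the front-end circuitry is not reproduced here):

    $$ \nabla\cdot(\sigma \nabla u) = 0 \ \text{in}\ \Omega, \qquad u + z_\ell\, \sigma \frac{\partial u}{\partial n} = U_\ell \ \text{on}\ e_\ell, \qquad \int_{e_\ell} \sigma \frac{\partial u}{\partial n}\, dS = I_\ell, $$

    together with σ ∂u/∂n = 0 on the boundary away from the electrodes, where z_ℓ is the contact impedance of electrode e_ℓ and U_ℓ, I_ℓ its voltage and injected current.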

  11. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
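    As a minimal illustration of the kind of decentralized computation involved, the following sketch performs exact marginalization by message passing on a three-node chain of binary variables, a toy stand-in for the junction tree algorithm (which reduces to this kind of belief propagation on tree-structured models); all tables here are hypothetical:

    ```python
    import numpy as np

    # Toy chain X1 - X2 - X3 of binary variables, e.g. three neighbouring sensors.
    psi12 = np.array([[0.9, 0.1],
                      [0.2, 0.8]])   # hypothetical compatibility table for (X1, X2)
    psi23 = np.array([[0.7, 0.3],
                      [0.4, 0.6]])   # hypothetical compatibility table for (X2, X3)
    phi1 = np.array([0.5, 0.5])      # local evidence at X1 (uninformative)
    phi3 = np.array([0.2, 0.8])      # local evidence at X3 (e.g. a sensor reading)

    # Forward message from X1 into X2 (sum over X1), and
    # backward message from X3 into X2 (sum over X3).
    m12 = psi12.T @ phi1
    m32 = psi23 @ phi3

    # The belief (unnormalized marginal) at X2 is the product of incoming messages.
    belief2 = m12 * m32
    print(belief2 / belief2.sum())   # exact marginal P(X2 | evidence)
    ```

    Each message depends only on a node's neighbours, which is what allows the computation to be distributed across low-power sensor nodes.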

  12. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit

  13. Self Modeling: Expanding the Theories of Learning

    Science.gov (United States)

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  14. Security Theorems via Model Theory

    Directory of Open Access Journals (Sweden)

    Joshua Guttman

    2009-11-01

    Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications ∀x (φ ⇒ ∃y ψ), where x and y stand for tuples of variables. Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If φ is the antecedent of a security goal, then there is a skeleton A_φ such that, for every skeleton B, φ is satisfied in B iff there is a homomorphism from A_φ to B. (2) A protocol enforces ∀x (φ ⇒ ∃y ψ) iff every realized homomorphic image of A_φ satisfies ψ. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_φ. If ψ holds in each of these shapes, then the goal holds.

  15. Realism, instrumentalism, and scientific symbiosis: psychological theory as a search for truth and the discovery of solutions.

    Science.gov (United States)

    Cacioppo, John T; Semin, Gün R; Berntson, Gary G

    2004-01-01

    Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions and solving problems in a given domain. These philosophical perspectives have different strengths and weaknesses and have been regarded as incommensurate: Scientific realism fosters theoretical rigor, verifiability, parsimony, and debate, whereas scientific instrumentalism fosters theoretical innovation, synthesis, generativeness, and scope. The authors review the evolution of scientific realism and instrumentalism in psychology and propose that the categorical distinction between the 2 is overstated as a prescription for scientific practice. The authors propose that the iterative deployment of these 2 perspectives, just as the iterative application of inductive and deductive reasoning in science, may promote more rigorous, integrative, cumulative, and useful scientific theories.

  16. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  17. An instrument based on protection motivation theory to predict Chinese adolescents' intention to engage in protective behaviors against schistosomiasis.

    Science.gov (United States)

    Xiao, Han; Peng, Minjin; Yan, Hong; Gao, Mengting; Li, Jingjing; Yu, Bin; Wu, Hanbo; Li, Shiyue

    2016-01-01

    Further advancement in schistosomiasis prevention requires new tools to assess protection motivation and to promote innovative intervention programs. This study aimed to develop and evaluate an instrument based on the Protection Motivation Theory (PMT) to predict protective behavior intention against schistosomiasis among adolescents in China. We developed the Schistosomiasis PMT Scale around the two appraisal pathways of protection motivation: the threat appraisal pathway and the coping appraisal pathway. Data from a large sample of middle school students (n = 2238, 51% male, mean age 13.13 ± 1.10) recruited in Hubei, China were used to evaluate the validity and reliability of the scale. The final scale contains 18 items with seven sub-constructs. The Cronbach's alpha coefficient for the entire instrument was 0.76, and for the seven sub-constructs of severity, vulnerability, intrinsic reward, extrinsic reward, response efficacy, self-efficacy and response cost it was 0.56, 0.82, 0.75, 0.80, 0.90, 0.72 and 0.70, respectively. The construct validity analysis revealed that the one-level, seven-sub-construct model fitted the data well (GFI = 0.98, CFI = 0.98, RMSEA = 0.03, Chi-sq/df = 3.90), supporting the use of the scale to assess protection motivation in schistosomiasis prevention and control. Further studies are needed to develop more effective intervention programs for schistosomiasis prevention.
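    For reference, the Cronbach's alpha reliability coefficients reported above are computed by the standard formula for a k-item (sub)scale:

    $$ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_{\text{total}}^{2}}\right), $$

    where σ²_i is the variance of item i and σ²_total the variance of the summed score.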

  18. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  19. On the algebraic theory of kink sectors: Application to quantum field theory models and collision theory

    International Nuclear Information System (INIS)

    Schlingemann, D.

    1996-10-01

    Several two-dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon and the φ⁴₂-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which are sufficient conditions a pair of vacuum states has to fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory which includes, for example, the P(φ)₂-models. We identify a large class of vacuum states, including the vacua of the P(φ)₂-models, the Yukawa₂-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)

  20. Finite Difference Time Domain Modeling at USA Instruments, Inc.

    Science.gov (United States)

    Curtis, Richard

    2003-10-01

    Due to the competitive nature of the commercial MRI industry, it is essential for the financial health of a participating company to innovate new coil designs and bring product to market rapidly in response to ever-changing market conditions. However, the technology of MRI coil design is still early in its stage of development and its principles are still evolving. As a result, it is not always possible to know the relevant electromagnetic effects of a given design, since the interaction of coil elements is complex and often counter-intuitive. Even if the effects are known qualitatively, the quantitative results are difficult to obtain. At USA Instruments, Inc., the acquisition of the XFDTD electromagnetic simulation tool from REMCOM, Inc., has been helpful in determining the electromagnetic performance characteristics of existing coil designs in the prototype stage, before the coils are released for production. In the ideal case, a coil design would be modeled earlier, at the conceptual stage, so that only good designs make it to the prototyping stage and the electromagnetic characteristics are better understood very early in the design process, before the testing stage has begun. This paper is a brief overview of using FDTD modeling for MRI coil design at USA Instruments, Inc., and shows some of the highlights of recent FDTD modeling efforts on birdcage coils, a staple of the MRI coil design portfolio.
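    As a minimal illustration of the FDTD method itself (a textbook one-dimensional Yee scheme in free space, in normalized units; this is a generic sketch unrelated to any proprietary coil model):

    ```python
    import numpy as np

    # 1D FDTD (Yee scheme) in free space, normalized so that c = 1 and the
    # Courant number c*dt/dx = 1 (the "magic time step" in one dimension).
    nx, nt = 200, 400
    ez = np.zeros(nx)       # electric field E_z at integer grid points
    hy = np.zeros(nx - 1)   # magnetic field H_y at half-integer grid points

    for n in range(nt):
        # Update H from the spatial difference (curl) of E.
        hy += ez[1:] - ez[:-1]
        # Update E from the spatial difference (curl) of H (interior points;
        # the ends act as simple reflecting boundaries, no ABC for brevity).
        ez[1:-1] += hy[1:] - hy[:-1]
        # Soft Gaussian source injected at the grid centre.
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)

    print(ez.max())  # the pulse has propagated outward from the source
    ```

    Production tools such as XFDTD apply the same leapfrog update in three dimensions with material parameters, absorbing boundaries and detailed coil geometry.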

  1. Introduction to zeolite theory and modelling

    NARCIS (Netherlands)

    Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.

    2001-01-01

    A review. Some of the recent advances in zeolite theory and modelling are presented. In particular, the current status of computational chemistry in Brønsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo simulation techniques are discussed.

  2. Prospect Theory in the Heterogeneous Agent Model

    Czech Academy of Sciences Publication Activity Database

    Polach, J.; Kukačka, Jiří

    (2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords: Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf

  3. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    The advancement of knowledge and understanding of turbulence theory is addressed. Specific problems include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion, which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulations.

  4. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.)

  5. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  6. The dynamic model of choosing an external funding instrument

    Directory of Open Access Journals (Sweden)

    Irena HONKOVA

    2015-06-01

    Full Text Available Making a decision about using a specific funding source is one of the most important tasks of financial management. The utilization of external sources offers numerous advantages, yet staying aware of diverse funding options is not easy for financial managers. Today it is crucial to quickly identify an optimum possibility and to make sure that all relevant criteria have been considered and no variant has been omitted. Over the long term it is also necessary to consider the category of time, as changes made today do not affect only the current variables but also have a significant impact on the future. This article aims to identify the most suitable model of choosing external funding sources that would describe the dynamics involved. The first part of the paper considers the theoretical background of external funding instruments and of decision criteria. The making of financial decisions is a process consisting of weighing the most suitable variants, selecting the best variant, and controlling the implementation of accepted proposals. The second part analyses the results of the research - the decisive weights of the criteria. A model of the principal criterion, the Weighted Average Cost of Capital (dynamic WACC model), is then created. Finally, the Dynamic Model of Choosing an External Funding Instrument is constructed. The resulting decision-making model facilitates the modelling of changes over time, because it is crucial to know what future consequences lie in decisions made in today's turbulent world. Each variant features possible negative and positive changes of varying extent. The possibility to simulate these changes can illustrate an optimal variant to a decision-maker.
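    For reference, the WACC criterion named above is conventionally computed as (the standard corporate-finance form, not necessarily the paper's exact notation):

    $$ \text{WACC} = \frac{E}{E+D}\, r_E + \frac{D}{E+D}\, r_D\, (1-t), $$

    where E and D are the values of equity and debt, r_E and r_D their required rates of return, and t the corporate tax rate.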

  7. Development of a simple 12-item theory-based instrument to assess the impact of continuing professional development on clinical behavioral intentions.

    Directory of Open Access Journals (Sweden)

    France Légaré

    Full Text Available Decision-makers in organizations providing continuing professional development (CPD) have identified the need for routine assessment of its impact on practice. We sought to develop a theory-based instrument for evaluating the impact of CPD activities on health professionals' clinical behavioral intentions. Our multipronged study had four phases. (1) We systematically reviewed the literature for instruments that used socio-cognitive theories to assess healthcare professionals' clinically-oriented behavioral intentions and/or behaviors; we extracted items relating to the theoretical constructs of an integrated model of healthcare professionals' behaviors and removed duplicates. (2) A committee of researchers and CPD decision-makers selected a pool of items relevant to CPD. (3) An international group of experts (n = 70) reached consensus on the most relevant items using electronic Delphi surveys. (4) We created a preliminary instrument with the items found most relevant and assessed its factorial validity, internal consistency and reliability (weighted kappa) over a two-week period among 138 physicians attending a CPD activity. Out of 72 potentially relevant instruments, 47 were analyzed. Of the 1218 items extracted from these, 16% were discarded as improperly phrased and 70% discarded as duplicates. Mapping the remaining items onto the constructs of the integrated model of healthcare professionals' behaviors yielded a minimum of 18 and a maximum of 275 items per construct. The partnership committee retained 61 items covering all seven constructs. Two iterations of the Delphi process produced consensus on a provisional 40-item questionnaire. Exploratory factorial analysis following test-retest resulted in a 12-item questionnaire. Cronbach's coefficients for the constructs varied from 0.77 to 0.85. A 12-item theory-based instrument for assessing the impact of CPD activities on health professionals' clinical behavioral intentions showed adequate validity and reliability.

  8. Flyover Modeling of Planetary Pits - Undergraduate Student Instrument Project

    Science.gov (United States)

    Bhasin, N.; Whittaker, W.

    2015-12-01

    On the surface of the Moon and Mars there are hundreds of skylights, which are collapsed holes that are believed to lead to underground caves. This research uses vision, inertial, and LIDAR sensors to build a high-resolution model of a skylight as a landing vehicle flies overhead. We design and fabricate a pit-modeling instrument to accomplish this task, implement software, and demonstrate sensing and modeling capability on a suborbital reusable launch vehicle flying over a simulated pit. Future missions on other planets and moons will explore pits and caves, led by the technology developed by this research. The sensor software utilizes modern graph-based optimization techniques to build 3D models using camera, LIDAR, and inertial data. The modeling performance was validated with a test flyover of a planetary skylight analog structure on the Masten Xombie sRLV. The trajectory profile closely follows that of autonomous planetary powered descent, including translational and rotational dynamics as well as shock and vibration. A hexagonal structure made of shipping containers provides a terrain feature that serves as an appropriate analog for the rim and upper walls of a cylindrical planetary skylight. The skylight analog floor, walls, and rim are modeled in elevation with a 96% coverage rate at 0.25 m² resolution. The inner skylight walls have 5.9 cm² color image resolution and the rims 6.7 cm², with measurement precision superior to 1 m. The multidisciplinary student team included students of all experience levels, with backgrounds in robotics, physics, computer science, systems, and mechanical and electrical engineering. The team was committed to authentic scientific experimentation, and defined specific instrument requirements and measurable experiment objectives to verify successful completion. This work was made possible by the NASA Undergraduate Student Instrument Project Educational Flight Opportunity 2013 program. Additional support was provided by the sponsorship of an

  9. Application of a model of instrumental conditioning to mobile robot control

    Science.gov (United States)

    Saksida, Lisa M.; Touretzky, D. S.

    1997-09-01

    Instrumental conditioning is a psychological process whereby an animal learns to associate its actions with their consequences. This type of learning is exploited in animal training techniques such as 'shaping by successive approximations,' which enables trainers to gradually adjust the animal's behavior by giving strategically timed reinforcements. While this is similar in principle to reinforcement learning, the real phenomenon includes many subtle effects not considered in the machine learning literature. In addition, a good deal of domain information is utilized by an animal learning a new task; it does not start from scratch every time it learns a new behavior. For these reasons, it is not surprising that mobile robot learning algorithms have yet to approach the sophistication and robustness of animal learning. A serious attempt to model instrumental learning could prove fruitful for improving machine learning techniques. In the present paper, we develop a computational theory of shaping at a level appropriate for controlling mobile robots. The theory is based on a series of mechanisms for 'behavior editing,' in which pre-existing behaviors, either innate or previously learned, can be dramatically changed in magnitude, shifted in direction, or otherwise manipulated so as to produce new behavioral routines. We have implemented our theory on Amelia, an RWI B21 mobile robot equipped with a gripper and color video camera. We provide results from training Amelia on several tasks, all of which were constructed as variations of one innate behavior, object-pursuit.
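
    As a toy illustration of shaping by successive approximations, the sketch below reinforces any trial action that improves on the best attempt so far, gradually pulling behavior toward a target. This is a deliberate simplification under assumed parameters, not the paper's behavior-editing model, which operates on structured pre-existing behaviors.

```python
import random

# Minimal shaping analogy: the "trainer" reinforces (retains) any trial
# that comes closer to the target behavior than the current best.
def train(target=10.0, episodes=300, step=1.0, seed=0):
    rng = random.Random(seed)
    action = 0.0                        # current behavior parameter
    best = abs(action - target)
    for _ in range(episodes):
        trial = action + rng.uniform(-step, step)   # explore nearby behavior
        if abs(trial - target) < best:              # reinforced: keep it
            action, best = trial, abs(trial - target)
    return action

print(round(train(), 2))   # converges near the target, ~10.0
```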

  10. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  11. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Full Text Available Abstract Background Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes

  12. A dynamical theory for the Rishon model

    International Nuclear Information System (INIS)

    Harari, H.; Seiberg, N.

    1980-09-01

    We propose a composite model for quarks and leptons based on an exact SU(3)_C × SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C × SU(2)_L × SU(2)_R × U(1)_{B-L} gauge theory emerges at the composite level. The theory is 'natural', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several 'technicolor' mechanisms are automatically present. (Author)

  13. Polyacetylene and relativistic field-theory models

    International Nuclear Information System (INIS)

    Bishop, A.R.; Campbell, D.K.; Fesser, K.

    1981-01-01

    Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the φ⁴ model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed.

  14. Instrumental and ethical aspects of experimental research with animal models

    Directory of Open Access Journals (Sweden)

    Mirian Watanabe

    2014-02-01

    Full Text Available Experimental animal models offer insight into physiology, the pathogenesis of disease and the action of drugs that is directly related to quality nursing care. This integrative review describes the current state of the instrumental and ethical aspects of experimental research with animal models, including the main recommendations of ethics committees that focus on animal welfare, and raises questions about the impact of their findings on nursing care. Data show that, in Brazil, progress in the ethics of using animals for scientific purposes was consolidated with Law No. 11.794/2008, which establishes ethical procedures attending to health, genetic and experimental parameters. The application of ethics in the handling of animals for scientific and educational purposes, and the obtaining of consistent and quality data, bring unquestionable contributions to the nurse, as they offer a basis for relating pathophysiological mechanisms to the clinical presentation of the patient.

  15. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    … This is a new way of using the measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x-measurements … A new general formula is given for how to correct the least squares regression coefficient when a different number of replicated x-measurements is used for prediction than for calibration. It is shown that the correction should be applied when the number of replicates in prediction is less than …
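
    The abstract does not reproduce the paper's general correction formula, but the classical special case it generalizes can be sketched: averaging k replicate x-measurements leaves residual error variance σ_e²/k, which attenuates the least squares slope by the reliability ratio. The data below are simulated, not the paper's fish measurements.

```python
import numpy as np

# Classical attenuation correction for measurement error with k replicates
# (textbook single-slope case; the paper's formula is more general).
rng = np.random.default_rng(1)
n, k = 200, 3                 # samples, replicates per sample
sigma_x, sigma_e = 2.0, 1.5   # true-signal and measurement-error SDs
beta = 0.8                    # true regression slope

x_true = rng.normal(0, sigma_x, n)
y = beta * x_true + rng.normal(0, 0.1, n)
x_obs = x_true[:, None] + rng.normal(0, sigma_e, (n, k))   # k noisy replicates
x_bar = x_obs.mean(axis=1)    # averaging shrinks error variance to sigma_e^2/k

b_naive = np.cov(x_bar, y)[0, 1] / np.var(x_bar, ddof=1)   # attenuated slope
lam = sigma_x**2 / (sigma_x**2 + sigma_e**2 / k)           # reliability ratio
b_corrected = b_naive / lam

print(f"naive={b_naive:.3f}, corrected={b_corrected:.3f}, true={beta}")
```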

  16. Motivation for Instrument Education: A Study from the Perspective of Expectancy-Value and Flow Theories

    Science.gov (United States)

    Burak, Sabahat

    2014-01-01

    Problem Statement: In the process of instrument education, students being unwilling (lacking motivation) to play an instrument or to practise is a problem that educators frequently face. Recognizing the factors motivating the students will yield useful results for instrument educators in terms of developing correct teaching methods and approaches.…

  17. Working memory: theories, models, and controversies.

    Science.gov (United States)

    Baddeley, Alan

    2012-01-01

    I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.

  18. Effective field theory and the quark model

    International Nuclear Information System (INIS)

    Durand, Loyal; Ha, Phuoc; Jaczko, Gregory

    2001-01-01

    We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections.

  19. Topos models for physics and topos theory

    International Nuclear Information System (INIS)

    Wolters, Sander

    2014-01-01

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos

  20. Prospects for advanced RF theory and modeling

    International Nuclear Information System (INIS)

    Batchelor, D. B.

    1999-01-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF modeling to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics

  1. A Membrane Model from Implicit Elasticity Theory

    Science.gov (United States)

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
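
    For readers unfamiliar with Fung's law, the 1-D exponential stress-strain relation it rests on is easy to state. The sketch below uses illustrative parameters and is not the paper's four-parameter biaxial membrane model.

```python
import numpy as np

# 1-D form behind Fung's law: stiffness grows linearly with stress,
# dT/de = a*(T + c), giving T(e) = c*(exp(a*e) - 1) with T(0) = 0.
# Parameters c (stress scale) and a (nonlinearity) are illustrative.
def fung_stress(strain, c=10.0, a=12.0):
    return c * np.expm1(a * strain)

strains = np.linspace(0.0, 0.2, 5)
print(np.round(fung_stress(strains), 1))   # rapid stiffening at larger strain
```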

  2. Attribution models and the Cooperative Game Theory

    OpenAIRE

    Cano Berlanga, Sebastian; Vilella, Cori

    2017-01-01

    The current paper studies the attribution model used by Google Analytics. More precisely, we use the Cooperative Game Theory to propose a fair distribution of the revenues among the considered channels, in order to facilitate the cooperation and to guarantee stability. We define a transferable utility convex cooperative game from the observed frequencies and we use the Shapley value to allocate the revenues among the different channels. Furthermore, we evaluate the impact of an advertising...
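
    The abstract defines a convex TU game from observed channel frequencies and allocates revenue by the Shapley value. The game construction itself is not given, so the sketch below computes Shapley shares for a hypothetical three-channel characteristic function.

```python
from itertools import combinations
from math import factorial

# Hypothetical revenue (characteristic function) for each coalition of channels.
channels = ["search", "display", "email"]
v = {
    (): 0, ("search",): 60, ("display",): 30, ("email",): 10,
    ("display", "search"): 100, ("email", "search"): 80,
    ("display", "email"): 50, ("display", "email", "search"): 140,
}

def key(coal):
    return tuple(sorted(coal))

def shapley(player, players, v):
    """Average marginal contribution of `player` over all join orders."""
    n = len(players)
    others = [p for p in players if p != player]
    total = 0.0
    for r in range(n):
        for coal in combinations(others, r):
            w = factorial(r) * factorial(n - r - 1) / factorial(n)
            total += w * (v[key(coal + (player,))] - v[key(coal)])
    return total

alloc = {p: shapley(p, channels, v) for p in channels}
print(alloc)                  # shares sum to v(grand coalition) = 140
print(sum(alloc.values()))
```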

  3. Exploratory and confirmatory factor analysis of the Adolescent Motivation to Cook Questionnaire: A Self-Determination Theory instrument.

    Science.gov (United States)

    Miketinas, Derek; Cater, Melissa; Bailey, Ariana; Craft, Brittany; Tuuri, Georgianna

    2016-10-01

    Increasing adolescents' motivation and competence to cook may improve diet quality and reduce the risk for obesity and chronic diseases. The objective of this study was to develop an instrument to measure adolescents' intrinsic motivation to prepare healthy foods and the four psychological needs that facilitate motivation identified by the Self Determination Theory (SDT). Five hundred ninety-three high school students (62.7% female) were recruited to complete the survey. Participants indicated to what extent they agreed or disagreed with 25 statements pertaining to intrinsic motivation and perceived competence to cook, and their perceived autonomy support, autonomy, and relatedness to teachers and classmates. Data were analyzed using exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and internal consistency reliability. EFA returned a five-factor structure explaining 65.3% of the variance; and CFA revealed that the best model fit was a five-factor structure (χ² = 524.97 (df = 265); Comparative Fit Index = 0.93; RMSEA = 0.056; and SRMR = 0.04). The sub-scales showed good internal consistency (Intrinsic Motivation: α = 0.94; Perceived Competence: α = 0.92; Autonomy Support: α = 0.94; Relatedness: α = 0.90; and Autonomy: α = 0.85). These results support the application of the Adolescent Motivation to Cook Questionnaire to measure adolescents' motivation and perceived competence to cook, autonomy support by their instructor, autonomy in the classroom, and relatedness to peers. Further studies are needed to investigate whether this instrument can measure change in cooking intervention programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
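
    As one concrete piece of the psychometrics reported above, Cronbach's alpha has a short closed form. The sketch below computes it on simulated item responses; the study's actual data are not available here.

```python
import numpy as np

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(0, 1, (500, 1))            # common trait driving all items
items = latent + rng.normal(0, 0.7, (500, 4))  # 4 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")  # high, roughly 0.85-0.90
```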

  4. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  5. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    Science.gov (United States)

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  6. Out- and insourcing, an analysis model for use of instrumented techniques

    DEFF Research Database (Denmark)

    Bang, Henrik Peter; Grønbæk, Niels; Larsen, Claus Richard

    2017-01-01

    We sketch an outline of a model for analyzing the use of ICT-tools, in particular CAS, in teaching designs employed by ‘generic’ teachers. Our model uses the business economics concepts out- and insourcing as metaphors within the dialectics of tool and content in planning of teaching. Outsourcing is done in order to enhance outcome through external partners. The converse concept of insourcing refers to internal sourcing. We shall adhere to the framework of the anthropological theory of the didactic, viewing out- and insourcing primarily as decisions about the technology component of praxeologies. We use the model on a concrete example from Danish upper secondary mathematics to uncover what underlies teachers’ decisions (deliberate or colloquial) on incorporating instrumented approaches.

  7. Exploring mouthfeel in model wines: Sensory-to-instrumental approaches.

    Science.gov (United States)

    Laguna, Laura; Sarkar, Anwesha; Bryant, Michael G; Beadling, Andrew R; Bartolomé, Begoña; Victoria Moreno-Arribas, M

    2017-12-01

    Wine creates a group of oral-tactile stimulations not related to taste or aroma, such as astringency or fullness, better known as mouthfeel. During wine consumption, mouthfeel is affected by ethanol content, phenolic compounds and their interactions with the oral components. Mouthfeel arises through changes in the salivary film when wine is consumed. In order to understand the role of each wine component, eight different model wines with/without ethanol (8%), glycerol (10 g/L) and commercial tannins (1 g/L) were described using a trained panel. Descriptive analysis techniques were used to train the panel and measure the intensity of the mouthfeel attributes. Alongside, the suitability of different instrumental techniques (rheology, particle size, tribology and microstructure, using Transmission Electron Microscopy (TEM)) to measure wine mouthfeel sensation was investigated. Panelists discriminated samples based on their tactile-related components (ethanol, glycerol and tannins) at the levels found naturally in wine. Higher scores were found for all sensory attributes in the samples containing ethanol. Sensory astringency was associated mainly with the addition of tannins to the wine model, and glycerol did not seem to play a discriminating role at the levels found in red wines. Visual viscosity was correlated with instrumental viscosity (R=0.815, p=0.014). The hydrodynamic diameter of saliva showed an increase in the presence of tannins (almost 2.5- to 3-fold). However, the presence of ethanol or glycerol decreased the hydrodynamic diameter. These results were related to the sensory astringency and earthiness as well as to the formation of nano-complexes as observed by TEM. Rheologically, the most viscous samples were those containing glycerol or tannins. Tribology results showed that at a boundary lubrication regime, differences in traction coefficient were due to the presence of glycerol. However, no differences in traction coefficients were observed in presence

  8. Finite Unification: Theory, Models and Predictions

    CERN Document Server

    Heinemeyer, S; Zoupanos, G

    2011-01-01

    All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...

  9. Sound production in recorder-like instruments : II. a simulation model

    NARCIS (Netherlands)

    Verge, M.P.; Hirschberg, A.; Causse, R.

    1997-01-01

    A simple one-dimensional representation of recorder-like instruments, which can be used for sound synthesis by physical modeling of flute-like instruments, is presented. This model combines the effects on the instrument's sound production of the jet oscillations, vortex shedding at the edge of the

  10. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  11. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  12. NASA Instrument Cost Model for Explorer-Like Mission Instruments (NICM-E)

    Science.gov (United States)

    Habib-Agahi, Hamid; Fox, George; Mrozinski, Joe; Ball, Gary

    2013-01-01

    NICM-E is a cost estimating relationship that supplements the traditional NICM System Level CERs for instruments flown on NASA Explorer-like missions that have the following three characteristics: 1) fly on Class C missions, 2) major development led and performed by universities or research foundations, and 3) have significant level of inheritance.

  13. σ-models and string theories

    International Nuclear Information System (INIS)

    Randjbar-Daemi, S.

    1987-01-01

    The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix

  14. Modeling of Zakat in the capital structure theory | Sanusi | Journal of ...

    African Journals Online (AJOL)

    Islamic financial instruments are subject to taxes and zakat for Muslim shareholders and debt holders. Therefore, it is important to investigate the implementation of corporate taxes and corporate zakat in capital structure compositions. In order to model corporate zakat in terms of conventional capital structure theories, this ...

  15. A Transcultural Theory of Thinking for Instrumental Music Education: Philosophical Insights from Confucius and Dewey

    Science.gov (United States)

    Tan, Leonard

    2016-01-01

    In music education, thinking is often construed in terms of acquiring conceptual knowledge of musical elements. Research has found, however, that instrumental music educators have largely neglected conceptual teaching and learning. This begs the following questions: What is the nature of thinking in instrumental music education? How should…

  16. Instrument Response Modeling and Simulation for the GLAST Burst Monitor

    International Nuclear Information System (INIS)

    Kippen, R. M.; Hoover, A. S.; Wallace, M. S.; Pendleton, G. N.; Meegan, C. A.; Fishman, G. J.; Wilson-Hodge, C. A.; Kouveliotou, C.; Lichti, G. G.; Kienlin, A. von; Steinle, H.; Diehl, R.; Greiner, J.; Preece, R. D.; Connaughton, V.; Briggs, M. S.; Paciesas, W. S.; Bhat, P. N.

    2007-01-01

    The GLAST Burst Monitor (GBM) is designed to provide wide field of view observations of gamma-ray bursts and other fast transient sources in the energy range 10 keV to 30 MeV. The GBM is composed of several unshielded and uncollimated scintillation detectors (twelve NaI and two BGO) that are widely dispersed about the GLAST spacecraft. As a result, reconstructing source locations, energy spectra, and temporal properties from GBM data requires detailed knowledge of the detectors' response to both direct radiation as well as that scattered from the spacecraft and Earth's atmosphere. This full GBM instrument response will be captured in the form of a response function database that is derived from computer modeling and simulation. The simulation system is based on the GEANT4 Monte Carlo radiation transport simulation toolset, and is being extensively validated against calibrated experimental GBM data. We discuss the architecture of the GBM simulation and modeling system and describe how its products will be used for analysis of observed GBM data. Companion papers describe the status of validating the system

  17. An instrument dedicated for modelling of pulmonary radiotherapy

    International Nuclear Information System (INIS)

    Niezink, Anne G.H.; Dollekamp, Nienke J.; Elzinga, Harriet J.; Borger, Denise; Boer, Eduard J.H.; Ubbels, Jan F.; Woltman-van Iersel, Marleen; Leest, Annija H.D. van der; Beijert, Max; Groen, Harry J.M.; Kraan, Jan; Hiltermann, Thijo J.N.; Wekken, Anthonie J. van der; Putten, John W.G. van; Rutgers, Steven R.; Pieterman, Remge M.; Hosson, Sander M. de; Roenhorst, Anke W.J.; Langendijk, Johannes A.; Widder, Joachim

    2015-01-01

    Background and purpose: Radiotherapy plays a pivotal role in lung cancer treatment. Selection of patients for new (radio)therapeutic options aiming at improving outcomes requires reliable and validated prediction models. We present the implementation of a prospective platform for evaluation and development of lung radiotherapy (proPED-LUNG) as an instrument enabling multidimensional predictive modelling. Materials and methods: ProPED-LUNG was designed to comprise relevant baseline and follow up data of patients receiving pulmonary radiotherapy with curative intent. Patient characteristics, diagnostic and staging information, treatment parameters including full dose–volume-histograms, tumour control, survival, and toxicity are scored. Besides physician-rated data, a range of patient-rated data regarding symptoms and health-related quality-of-life are collected. Results: After 18 months of accrual, 315 patients have been included (accrual rate, 18 per month). Of the first hundred patients included, 70 received conformal (chemo)radiotherapy and 30 underwent stereotactic radiotherapy. Compliance at 3 and 6 months follow-up was 96–100% for patient-rated, and 81–94% for physician-rated assessments. For data collection, 0.4 FTE were allocated in a 183 FTE department (0.2%). Conclusions: ProPED-LUNG is feasible with high compliance rates and yields a large amount of high quality prospective disease-related, treatment-related, patient- and physician-rated data which can be used to evaluate new developments in pulmonary radiotherapy

  18. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the econo… …are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.

  19. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the econo… …parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.

  20. Quantum integrable models of field theory

    International Nuclear Information System (INIS)

    Faddeev, L.D.

    1979-01-01

    Fundamental features of the classical method of the inverse problem have been formulated in a form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are examined briefly. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be placed, to an extent, within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives of applying the inverse problem method are shown.

  1. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.

  2. Economic contract theory tests models of mutualism.

    Science.gov (United States)

    Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E

    2010-09-07

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.

  3. Magnetic flux tube models in superstring theory

    CERN Document Server

    Russo, Jorge G

    1996-01-01

    Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to 'Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0 = \sqrt{2\alpha'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0, q > R/2\alpha' there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR=2n, n=integer). At the special points qR=2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...

  4. Group theory for unified model building

    International Nuclear Information System (INIS)

    Slansky, R.

    1981-01-01

    The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)^w × U(1)^w × SU(3)^c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E_6, SO_10, SU_6, F_4, SO_9, SO_5, SO_8, SO_7, SU_4, E_7, E_8, SU_8, SO_14, SO_18, SO_22, and for completeness, SU_3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
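
    The simplest instance of the tensor-product tabulations described above is the SU(2) Clebsch-Gordan series, sketched below. This is only a minimal illustration; the report itself covers far larger groups and their branching rules.

```python
from fractions import Fraction

# SU(2) Clebsch-Gordan series: j1 (x) j2 = |j1-j2| (+) ... (+) (j1+j2),
# with irreps labeled by spin j (dimension 2j+1).
def su2_tensor(j1, j2):
    lo, hi = abs(j1 - j2), j1 + j2
    series, j = [], lo
    while j <= hi:
        series.append(j)
        j += 1
    return series

print(su2_tensor(Fraction(1, 2), Fraction(1, 2)))   # [0, 1]: singlet + triplet
print(su2_tensor(1, Fraction(1, 2)))                # [1/2, 3/2]
```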

  5. A matrix model from string field theory

    Directory of Open Access Journals (Sweden)

    Syoji Zeze

    2016-09-01

    Full Text Available We demonstrate that a Hermitian matrix model can be derived from level truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. Effective potential for the scalar is evaluated both for finite and large N. Increase of potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.

  6. On low rank classical groups in string theory, gauge theory and matrix models

    International Nuclear Information System (INIS)

    Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun

    2004-01-01

    We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature

  7. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
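
    The dissertation's moral-behavior equations are not reproduced in the abstract, but the qualitative behavior it describes (convergence to one value versus irregular fluctuation) is the standard behavior of a nonlinear feedback map. A minimal sketch using the logistic map as the illustration:

```python
# Iterate x_{t+1} = r * x_t * (1 - x_t): the next "behavior level" depends
# nonlinearly on the current one, a simple feedback loop.
def iterate(r: float, x0: float = 0.5, n: int = 50) -> list:
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print(iterate(2.8)[-3:])   # converges to a single fixed point (~0.643)
print(iterate(3.2)[-3:])   # settles into a 2-cycle (values alternate)
print(iterate(3.9)[-3:])   # chaotic: values fluctuate irregularly
```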

  8. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K. Miller; D.A. Petti; J.T. Maki

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  9. PRAGMATIST MODEL OF DECISION-MAKING IN LAW: FROM THE INSTRUMENTAL MENTALISM TO THE COMMUNICATIVE INTERSUBJECTIVITY

    Directory of Open Access Journals (Sweden)

    Mário Cesar da Silva Andrade

    2015-12-01

    Full Text Available This paper aimed to evaluate the method of rational decision-making derived from the philosophy of Kant as a foundational paradigm of public decisions and, more specifically, of legal decisions. Based on the communicative action theory of Jürgen Habermas, the question is whether the transcendental model of decision-making meets democratic demands. Methodologically, the qualitative research was based on doctrinal sources about the theme, promoting a legal and critical analysis. Habermas' communicative bias raises the hypothesis that Kant's transcendental method, which so influenced the theory of justice and law, entails the adoption of an objective posture by the decision maker, something incompatible with the need for broad participation and the intersubjectivity prescribed by democracy. It was concluded that the public decision-making process must overcome the transcendental, decisionistic and instrumental models, adopting a pragmatic model, which is more intersubjective and communicative, and therefore more consistent with the participatory bias of democracy.

  10. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  11. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    Science.gov (United States)

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  12. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2017-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  13. Theory and modelling of nanocarbon phase stability.

    Energy Technology Data Exchange (ETDEWEB)

    Barnard, A. S.

    2006-01-01

    The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.

  14. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2015-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  15. Dynamic Models of Instruments Using Rotating Unbalanced Masses

    Science.gov (United States)

    Hung, John Y.; Gallaspy, Jason M.; Bishop, Carlee A.

    1998-01-01

    The motion of telescopes, satellites, and other flight bodies has been controlled by various means in the past. For example, gimbal mounted devices can use electric motors to produce pointing and scanning motions. Reaction wheels, control moment gyros, and propellant-charged reaction jets are other technologies that have also been used. Each of these methods has its advantages, but all actuator systems used in a flight environment face the challenges of minimizing weight, reducing energy consumption, and maximizing reliability. Recently, Polites invented and patented the Rotating Unbalanced Mass (RUM) device as a means for generating scanning motion on flight experiments. RUM devices together with traditional servomechanisms have been successfully used to generate various scanning motions: linear, raster, and circular. The basic principle can be described: a RUM rotating at constant angular velocity exerts a cyclic centrifugal force on the instrument or main body, thus producing a periodic scanning motion. A system of RUM devices exerts no reaction forces on the main body, requires very little energy to rotate the RUMs, and is simple to construct. These are significant advantages over electric motors, reaction wheels, and control moment gyroscopes. Although the RUM device very easily produces scanning motion, an auxiliary control system has been required to maintain the proper orientation, or pointing, of the main body. It has been suggested that RUM devices can be used to control pointing dynamics, as well as generate the desired periodic scanning motion. The idea is that the RUM velocity will not be kept constant, but will vary over the period of one RUM rotation, so that the changing angular velocity produces a centrifugal force having time-varying magnitude and direction. The scope of this ongoing research project is to study the pointing control concept, and recommend a direction of study for advanced pointing control using only RUM devices. This
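
    The constant-rate case described above is easy to quantify: an unbalanced mass m at radius r spinning at angular rate ω exerts a force of magnitude m·r·ω² that rotates with the mass. The sketch below uses illustrative values, not parameters from Polites' hardware.

```python
import math

# Cyclic centrifugal force from a rotating unbalanced mass (illustrative values).
m, r, omega = 0.5, 0.1, 2 * math.pi * 1.0   # kg, m, rad/s (1 rev/s)

def rum_force(t: float) -> tuple:
    """Force components on the main body at time t for a constant spin rate.
    Magnitude m*r*omega^2 rotates with the mass, tracing the scan pattern."""
    f = m * r * omega**2
    return f * math.cos(omega * t), f * math.sin(omega * t)

for t in (0.0, 0.25, 0.5, 0.75):            # quarter-revolution samples
    fx, fy = rum_force(t)
    print(f"t={t:.2f}s  Fx={fx:+.2f} N  Fy={fy:+.2f} N")
```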

  16. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problems are closely related subjects, since any computing method devised for ...

  17. Hosotani model in closed string theory

    International Nuclear Information System (INIS)

    Shiraishi, Kiyoshi.

    1988-11-01

    The Hosotani mechanism in closed string theory with current algebra symmetry is described by the (old covariant) operator method. We compare the gauge symmetry breaking mechanism in a string theory which has SU(2) symmetry with the one in an equivalent compactified closed string theory. We also investigate the difference between the Hosotani mechanism and the Higgs mechanism in closed string theories by calculation of a four-point amplitude of 'Higgs' bosons at tree level. (author)

  18. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x_t, is embedded within the larger set of m+k candidate variables, (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.

  19. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  20. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  1. Irreducible integrable theories form tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  2. Warped Linear Prediction of Physical Model Excitations with Applications in Audio Compression and Instrument Synthesis

    Science.gov (United States)

    Glass, Alexis; Fukudome, Kimitoshi

    2004-12-01

    A sound recording of a plucked string instrument is encoded and resynthesized using two stages of prediction. In the first stage of prediction, a simple physical model of a plucked string is estimated and the instrument excitation is obtained. The second stage of prediction compensates for the simplicity of the model in the first stage by encoding either the instrument excitation or the model error using warped linear prediction. These two methods of compensation are compared with each other and with single-stage warped linear prediction; adjustments are introduced, and their applications to instrument synthesis and MPEG-4 audio compression within the structured audio format are discussed.
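
    As a rough illustration of the two-stage idea, here is a minimal Python sketch (a crude string-loop predictor, with plain unwarped LP standing in for the warped LP stage; the signal and all parameters are invented):

    ```python
    import numpy as np

    def lpc(x, order):
        """LP coefficients via the autocorrelation (Yule-Walker) method."""
        r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        return np.linalg.solve(R, r[1:order + 1])

    # Stage 1: predict each sample from one string period L earlier;
    # the residual approximates the instrument excitation.
    fs, f0 = 44100, 440.0
    L = int(round(fs / f0))
    x = np.random.randn(8 * L)               # stand-in for a recorded tone
    excitation = x[L:] - 0.99 * x[:-L]

    # Stage 2: ordinary LP on the excitation (the paper uses *warped* LP,
    # i.e., LP on an allpass-warped frequency axis, omitted here).
    a = lpc(excitation, order=10)
    pred = np.convolve(excitation, np.concatenate(([0.0], a)))[:len(excitation)]
    residual = excitation - pred
    print("stage-2 residual energy ratio:",
          np.sum(residual**2) / np.sum(excitation**2))
    ```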

  3. 2PI effective action for the SYK model and tensor field theories

    Science.gov (United States)

    Benedetti, Dario; Gurau, Razvan

    2018-05-01

    We discuss the two-particle irreducible (2PI) effective action for the SYK model and for tensor field theories. For the SYK model the 2PI effective action reproduces the bilocal reformulation of the model without using replicas. In general tensor field theories the 2PI formalism is the only way to obtain a bilocal reformulation of the theory, and as such is a precious instrument for the identification of soft modes and for possible holographic interpretations. We compute the 2PI action for several models, and push it up to fourth order in the 1/N expansion for the model proposed by Witten in [1], uncovering a one-loop structure in terms of an auxiliary bilocal action.
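
    For orientation, the generic structure of the 2PI effective action (a standard form from the literature; conventions vary, and this is not the specific expression computed in the paper) is:

    ```latex
    \Gamma[G] \;=\; \frac{i}{2}\,\operatorname{Tr}\ln G^{-1}
              \;+\; \frac{i}{2}\,\operatorname{Tr}\!\left(G_{0}^{-1} G\right)
              \;+\; \Gamma_{2}[G] \;+\; \text{const},
    ```

    where G_0 is the free propagator, G the full (bilocal) two-point function, and Gamma_2[G] the sum of two-particle-irreducible vacuum diagrams; the stationarity condition dGamma/dG = 0 gives the Schwinger-Dyson equation for G, which for the SYK model reduces at leading order in 1/N to the familiar melonic equations.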

  4. Instrumental variables estimation under a structural Cox model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Nørbo Sørensen, Ditte; Vansteelandt, Stijn

    2017-01-01

    Instrumental variable (IV) analysis is an increasingly popular tool for inferring the effect of an exposure on an outcome, as witnessed by the growing number of IV applications in epidemiology, for instance. The majority of IV analyses of time-to-event endpoints are, however, dominated by heurist...

  5. From Landau's hydrodynamical model to field theory models of multiparticle production: a tribute to Peter

    International Nuclear Information System (INIS)

    Cooper, F.

    1996-01-01

    We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations.

  6. Chaos Theory as a Model for Managing Issues and Crises.

    Science.gov (United States)

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  7. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  8. A Leadership Identity Development Model: Applications from a Grounded Theory

    Science.gov (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  9. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  10. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  11. Domain Theory, Its Models and Concepts

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt

    2014-01-01

    Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contribution...

  12. Flute-like musical instruments: A toy model investigated through numerical continuation

    Science.gov (United States)

    Terrien, Soizic; Vergez, Christophe; Fabre, Benoît

    2013-07-01

    Self-sustained musical instruments (bowed string, woodwind and brass instruments) can be modelled by nonlinear lumped dynamical systems. Among these instruments, flutes and flue organ pipes have the particularity of being modelled as delay dynamical systems. In this paper, such a system, a toy model of flute-like instruments, is studied using numerical continuation. Equilibrium and periodic solutions are explored with respect to the blowing pressure, with focus on the evolution of amplitude and frequency along the different solution branches, as well as "jumps" between periodic solution branches. The influence of a second model parameter (namely the inharmonicity) on the behaviour of the system is addressed. It is shown that harmonicity plays a key role in the presence of hysteresis or quasiperiodic regimes. Throughout the paper, experimental results on a real instrument are presented to illustrate various phenomena and allow some qualitative comparisons with numerical results.
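
    To make the delay-dynamical character concrete, here is a minimal Python sketch of a flute-like toy system (the feedback law and every parameter are illustrative assumptions, not the paper's model): a lossy one-pole filter closed by a saturating, delayed negative feedback. For sufficient gain p the equilibrium destabilizes and a periodic regime with period of roughly twice the delay appears.

    ```python
    import numpy as np

    fs = 10000                  # sample rate, Hz
    tau = 0.01                  # feedback (jet) delay, s
    D = int(tau * fs)           # delay in samples
    a, p = 0.9, 2.0             # filter coefficient, feedback gain

    u = np.zeros(fs)            # one second of the acoustic variable
    u[:D] = 1e-3 * np.random.randn(D)   # small noise seeds the oscillation

    for n in range(D, fs):
        # lossy smoothing plus saturating delayed negative feedback
        u[n] = a * u[n - 1] - (1.0 - a) * np.tanh(p * u[n - D])

    # crude period estimate from zero crossings of the last 0.2 s
    tail = u[-2000:]
    crossings = np.where(np.diff(np.sign(tail)) != 0)[0]
    print("observed period ~", 2 * np.mean(np.diff(crossings)) / fs, "s")
    ```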

  13. The contribution of process tracing to theory-based evaluations of complex aid instruments

    DEFF Research Database (Denmark)

    Beach, Derek; Schmitt, Johannes

    2015-01-01

    studies in demanding settings. For the specific task of evaluating the governance effectiveness of budget support interventions, we developed a more fine-grained causal mechanism for a subset of the comprehensive program theory of budget support. Moreover, based on the informal use of Bayesian logic, we remedy some of the problems at hand in much case-study research and increase the inferential leverage in complex within-case evaluation studies.

  14. From theory based policy evaluation to SMART Policy Design: Lessons learned from 20 ex-post evaluations of energy efficiency instruments

    International Nuclear Information System (INIS)

    Harmelink, Mirjam; Harmsen, Robert; Nilsson, Lars

    2007-01-01

    This article presents the results of an in-depth ex-post analysis of 20 energy efficiency policy instruments applied across different sectors and countries. Within the AID-EE project, we reconstructed and analysed the implementation process of energy efficiency policy instruments with the aim of identifying key factors behind successes and failures. The analysis was performed using a uniform methodology called 'theory based policy evaluation'. With this method the whole implementation process is assessed with the aim of identifying: (i) the main hurdles in each step of the implementation process, (ii) key success factors for different types of instruments and (iii) the key indicators that need to be monitored to enable a sound evaluation of the energy efficiency instruments. Our analysis shows that: energy efficiency policies often lack quantitative targets and clear timeframes; policy instruments often have multiple and/or unclear objectives; the need for monitoring information often does not have priority in the design phase; for most instruments, monitoring information is collected on a regular basis, but this information is often insufficient to determine the impact on energy saving, cost-effectiveness and target achievement of an instrument; and monitoring and verification of actual energy savings have a relatively low priority for most of the analyzed instruments. There is no such thing as the 'best' policy instrument. However, typical circumstances in which to apply different types of instruments, and generic characteristics that determine success or failure, can be identified. Based on the assessments and the experience from applying theory based policy evaluation ex-post, we suggest that it should already be used in the policy formulation and design phase of instruments. We conclude that making policy theory an integral and mandated part of the policy process would facilitate more efficient and effective energy efficiency instruments.

  15. Big bang models in string theory

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes Pleinlaan 2, B-1050 Brussels (Belgium)

    2006-11-07

    These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.

  16. Neutron and synchrotron radiation for condensed matter studies. Volume 1: theory, instruments and methods

    International Nuclear Information System (INIS)

    Baruchel, J.; Hodeau, J.L.; Lehmann, M.S.; Regnard, J.R.; Schlenker, C.

    1993-01-01

    This book provides the basic information required by a research scientist wishing to undertake studies using neutrons or synchrotron radiation at a Large Facility. These lecture notes result from 'HERCULES', a course that has been held in Grenoble since 1991 to train young scientists in these fields. They cover the production of neutrons and synchrotron radiation and describe all aspects of instrumentation. In addition, this work outlines the basics of the various fields of research pursued at these Large Facilities. It consists of a series of chapters written by experts in the particular fields. While following a progression and constituting a lecture course on neutron and x-ray scattering, these chapters can also be read independently. This first volume will be followed by two further volumes concerned with the applications to solid state physics and chemistry, and to biology and soft condensed matter properties

  17. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  18. Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory

    International Nuclear Information System (INIS)

    Chung, S.; Tye, S.H.

    1993-01-01

    The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to the (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory.

  19. Spatial data modelling and maximum entropy theory

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2005-01-01

    Roč. 51, č. 2 (2005), s. 80-83 ISSN 0139-570X Institutional research plan: CEZ:AV0Z10750506 Keywords : spatial data classification * distribution function * error distribution Subject RIV: BD - Theory of Information

  20. Electroweak theory and the Standard Model

    CERN Multimedia

    CERN. Geneva; Giudice, Gian Francesco

    2004-01-01

    There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.

  1. Statistical Learning Theory: Models, Concepts, and Results

    OpenAIRE

    von Luxburg, Ulrike; Schoelkopf, Bernhard

    2008-01-01

    Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.

  2. Using and Developing Measurement Instruments in Science Education: A Rasch Modeling Approach. Science & Engineering Education Sources

    Science.gov (United States)

    Liu, Xiufeng

    2010-01-01

    This book meets a demand in the science education community for a comprehensive and introductory measurement book in science education. It describes measurement instruments reported in refereed science education research journals, and introduces the Rasch modeling approach to developing measurement instruments in common science assessment domains,…

  3. Glass Durability Modeling, Activated Complex Theory (ACT)

    International Nuclear Information System (INIS)

    CAROL, JANTZEN

    2005-01-01

    atomic ratios is shown to represent the structural effects of the glass on the dissolution and the formation of activated complexes in the glass leached layer. This provides two different methods by which a linear glass durability model can be formulated: one based on the quasi-crystalline mineral species in a glass and one based on cation ratios in the glass; both are related to the activated complexes on the surface by the law of mass action. The former would allow a new Thermodynamic Hydration Energy Model to be developed based on the hydration of the quasi-crystalline mineral species if all the pertinent thermodynamic data were available. Since the pertinent thermodynamic data are not available, the quasi-crystalline mineral species and the activated complexes can be related to cation ratios in the glass by the law of mass action. The cation ratio model can, thus, be used by waste form producers to formulate durable glasses based on fundamental structural and activated complex theories. Moreover, a glass durability model based on atomic ratios simplifies HLW glass process control in that the measured ratios of only a few waste components and glass formers can be used to predict complex HLW glass performance with a high degree of accuracy, e.g., R² ≈ 0.97.

  4. Development and evaluation of a social cognitive theory-based instrument to assess correlations for physical activity among people with spinal cord injury.

    Science.gov (United States)

    Wilroy, Jereme; Turner, Lori; Birch, David; Leaver-Dunn, Deidre; Hibberd, Elizabeth; Leeper, James

    2018-01-01

    People with spinal cord injury (SCI) are more susceptible to sedentary lifestyles because of the displacement of physical functioning and numerous barriers. Benefits of physical activity for people with SCI include physical fitness, functional capacity, social integration and psychological well-being. The purpose of this study was to develop and test a social cognitive theory-based instrument aimed at predicting physical activity among people with SCI. An instrument was developed through the utilization and modification of previous items from the literature, an expert panel review, and cognitive interviewing, and tested among a sample of the SCI population using a cross-sectional design. Statistical analysis included descriptives, correlations, multiple regression, and exploratory factor analysis. The physical activity outcome variable was significantly and positively correlated with self-regulatory efficacy (r = 0.575), task self-efficacy (r = 0.491), self-regulation (r = 0.432), social support (r = 0.284), and outcome expectations (r = 0.247). Internal consistency for the constructs ranged from 0.82 to 0.96. Construct reliability values for self-regulation (0.95), self-regulatory efficacy (0.96), task self-efficacy (0.94), social support (0.84), and outcome expectations (0.92) each exceeded the 0.70 a priori criterion. The factor analysis was conducted to seek modifications of the current instrument to improve validity and reliability. The data provided support for the convergent validity of the five-factor SCT model. This study provides direction for further development of a valid and reliable instrument for predicting physical activity among people with SCI. Copyright © 2017 Elsevier Inc. All rights reserved.
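
    The internal consistencies reported above are Cronbach's alpha values. For reference, a minimal Python sketch of the computation on invented item scores (the study's actual item data are not shown here):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    # Hypothetical 5-item subscale scored by 8 respondents (illustrative only)
    scores = np.array([
        [4, 5, 4, 4, 5],
        [2, 2, 3, 2, 2],
        [5, 5, 5, 4, 5],
        [3, 3, 2, 3, 3],
        [4, 4, 5, 4, 4],
        [1, 2, 1, 2, 1],
        [3, 4, 3, 3, 4],
        [5, 4, 5, 5, 5],
    ])
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```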

  5. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD) and Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, and the practical skills needed to apply this understanding in real-world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bézier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithography theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  6. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes' set theoretical predicate...

  7. Efficacy of an extended theory of planned behaviour model for predicting caterers' hand hygiene practices.

    Science.gov (United States)

    Clayton, Deborah A; Griffith, Christopher J

    2008-04-01

    The main aim of this study was to determine the factors which influence caterers' hand hygiene practices using social cognitive theory. One hundred and fifteen food handlers from 29 catering businesses were observed carrying out 31,050 food preparation actions in their workplace. Caterers subsequently completed the Hand Hygiene Instrument (HHI), which ascertained attitudes towards hand hygiene using constructs from the Theory of Planned Behaviour (TPB) and the Health Belief Model. The TPB provided a useful framework for understanding caterers' implementation of hand hygiene practices, explaining 34% of the variance in hand hygiene malpractices (p …); … behavioural control and intention (p …) … food safety culture.

  8. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  9. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is discussed. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.

  10. A 'theory of everything'? [Extending the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate 'theory of everything'. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  11. Neutron Star Models in Alternative Theories of Gravity

    Science.gov (United States)

    Manolidis, Dimitrios

    We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two-body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant-density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.

  12. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  13. A QCD Model Using Generalized Yang-Mills Theory

    International Nuclear Information System (INIS)

    Wang Dianfu; Song Heshan; Kou Lina

    2007-01-01

    Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.

  14. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  15. Instrument evaluation no. 11. ESI nuclear model 271 C contamination monitor

    International Nuclear Information System (INIS)

    Burgess, P.H.; Iles, W.J.

    1978-06-01

    The various radiations encountered in radiological protection cover a wide range of energies, and radiation measurements have to be carried out under an equally broad spectrum of environmental conditions. This report is one of a series intended to give information on the performance characteristics of radiological protection instruments, to assist in the selection of appropriate instruments for a given purpose, to help interpret the results obtained with such instruments, and, in particular, to identify the likely sources and magnitude of errors that might be associated with measurements in the field. The radiation, electrical and environmental characteristics of radiation protection instruments are considered, together with those aspects of the construction which make an instrument convenient for routine use. To provide consistent criteria for instrument performance, the range of tests performed on any particular class of instrument, the test methods and the criteria of acceptable performance are based broadly on the appropriate Recommendations of the International Electrotechnical Commission. The radiations used in the tests are, in general, selected from the range of reference radiations for instrument calibration being drawn up by the International Standards Organisation. Normally, each report deals with the capabilities and limitations of one model of instrument, and no direct comparison with other instruments intended for similar purposes is made, since the significance of particular performance characteristics largely depends on the radiations and environmental conditions in which the instrument is to be used. The results quoted here have all been obtained from tests on instruments in routine production, with the appropriate measurements being made by the NRPB. This report deals with the ESI Nuclear Model 271 C, a general-purpose contamination monitor comprising a GM tube connected by a coiled extensible cable to a ratemeter.

  16. The discriminating capacity of a measuring instrument: Revisiting Bloom (1942’s theory and formula

    Directory of Open Access Journals (Sweden)

    Louis Laurencelle

    2014-04-01

    "Discriminating capacity" is defined as a property of a test, measuring device or scholastic exam which enables us to segregate and categorize objects or people according to their measured values. The concept, anticipated by Bloom and derived here from Ferguson's index of classificatory power, is developed upon three bases: the probability of categorizing an object (or person) in its proper measuring interval; the sufficient length of measuring intervals; and the number of efficacious intervals in an empirical or theoretical distribution of measures. Expressed as a function of the reliability coefficient of a measuring device, discriminating capacity appears as a new tool in the conceptual apparatus of classical test theory.
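
    One ingredient of this development, the probability of categorizing a person in the proper measuring interval, can be approximated from classical test theory. A minimal Python sketch (an illustrative normal-error approximation, not Laurencelle's exact formula):

    ```python
    from math import erf, sqrt

    def p_proper_interval(reliability, interval_width):
        """P(observed score within +/- w/2 of the true score), using the
        standard error of measurement SEM = sigma*sqrt(1 - r) with sigma = 1;
        interval_width is expressed in SD units of the score scale."""
        sem = sqrt(1.0 - reliability)
        z = (interval_width / 2.0) / sem
        return erf(z / sqrt(2.0))  # P(|N(0, sem^2)| < w/2)

    for r in (0.70, 0.80, 0.90, 0.95):
        print(r, round(p_proper_interval(r, 0.5), 3))
    ```

    Higher reliability shrinks the SEM, so narrower intervals can be discriminated with the same probability of correct categorization, which is the general direction of the relationship the paper formalizes.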

  17. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  18. Developing evaluation instrument based on CIPP models on the implementation of portfolio assessment

    Science.gov (United States)

    Kurnia, Feni; Rosana, Dadan; Supahar

    2017-08-01

    This study aimed to develop an evaluation instrument constructed by the CIPP model on the implementation of portfolio assessment in science learning. This study used the research and development (R&D) method, adapting the 4-D model to the development of a non-test instrument, with the evaluation instrument constructed by the CIPP model. CIPP is the abbreviation of Context, Input, Process, and Product. The techniques of data collection were interviews, questionnaires, and observations. The data collection instruments were: 1) interview guidelines for the analysis of the problems and the needs, 2) a questionnaire to see the level of accomplishment of the portfolio assessment instrument, and 3) observation sheets for teacher and student to dig up responses to the portfolio assessment instrument. The data obtained were quantitative data from several validators. The validators consisted of two lecturers as the evaluation experts, two practitioners (science teachers), and three colleagues. This paper shows the results of content validity obtained from the validators and the analysis of the data obtained by using Aiken's V formula. The results of this study show that the evaluation instrument based on the CIPP model is appropriate for evaluating the implementation of portfolio assessment instruments. Based on the judgments of the experts, practitioners, and colleagues, the Aiken's V coefficient was between 0.86 and 1.00, which means that the instrument is valid and can be used in the limited trial and operational field trial.
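
    For reference, Aiken's V for a single item is V = Σ(r_i − l) / (n·(c − 1)), where the n raters score on a scale with c categories whose lowest value is l. A minimal Python sketch with invented ratings:

    ```python
    def aikens_v(ratings, lo=1, hi=5):
        """Aiken's V content-validity coefficient for one item.
        ratings: scores given by the raters on a lo..hi scale.
        Returns a value in [0, 1]; higher means stronger agreement."""
        n = len(ratings)
        s = sum(r - lo for r in ratings)
        return s / (n * (hi - lo))

    # Hypothetical: seven validators (2 experts, 2 practitioners, 3 colleagues)
    # rating one CIPP-based item on a 1-5 relevance scale.
    print(round(aikens_v([5, 4, 5, 5, 4, 5, 4]), 2))  # -> 0.89
    ```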

  19. Non-linear σ-models and string theories

    International Nuclear Information System (INIS)

    Sen, A.

    1986-10-01

    The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs

  20. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp

    2011-01-01

    We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  1. Use of the mathematical modelling method for the investigation of dynamic characteristics of acoustical measuring instruments

    Science.gov (United States)

    Vasilyev, Y. M.; Lagunov, L. F.

    1973-01-01

    A schematic diagram of a noise-measuring device is presented that uses pulse-expansion modeling, according to the peak or any other measured values, to obtain instrument readings with very low noise error.

  2. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing and introduces pioneers of quantum computing and quantum simulations of quantum spin systems. The second part is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking 'Nuclear' physics and the continuum limit of D-theory models. (nowak)

  3. Strategies to Enhance Online Learning Teams. Team Assessment and Diagnostics Instrument and Agent-based Modeling

    Science.gov (United States)

    2010-08-12

    Strategies to Enhance Online Learning Teams: Team Assessment and Diagnostics Instrument and Agent-based Modeling. Tristan E. Johnson, Ph.D. [Report documentation page, August 2010; no abstract recoverable.]

  4. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  5. The Self-Perception Theory vs. a Dynamic Learning Model

    OpenAIRE

    Swank, Otto H.

    2006-01-01

    Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...

  6. Ares I Scale Model Acoustic Tests Instrumentation for Acoustic and Pressure Measurements

    Science.gov (United States)

    Vargas, Magda B.; Counter, Douglas D.

    2011-01-01

    The Ares I Scale Model Acoustic Test (ASMAT) was a development test performed at the Marshall Space Flight Center (MSFC) East Test Area (ETA) Test Stand 116. The test article included a 5% scale Ares I vehicle model and tower mounted on the Mobile Launcher. Acoustic and pressure data were measured by approximately 200 instruments located throughout the test article. There were four primary ASMAT instrument suites: ignition overpressure (IOP), lift-off acoustics (LOA), ground acoustics (GA), and spatial correlation (SC). Each instrumentation suite incorporated different sensor models which were selected based upon measurement requirements. These requirements included the type of measurement, exposure to the environment, instrumentation check-outs and data acquisition. The sensors were attached to the test article using different mounts and brackets dependent upon the location of the sensor. This presentation addresses the observed effect of the sensors and mounts on the acoustic and pressure measurements.

  7. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the six social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition, and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  8. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  9. Model instruments of effective segmentation of the fast food market

    Directory of Open Access Journals (Sweden)

    Mityaeva Tetyana L.

    2013-03-01

    The article presents the results of optimisation step-type calculations of the economic effectiveness of fast food promotion, with consideration of key parameters for assessing the efficiency of the segmentation marketing strategy. The article justifies the development of a mathematical model on the basis of 3D presentations and a three-dimensional system of management variables. Modern applied mathematical packages allow the formation and analysis of links among variables not only in one-dimensional and two-dimensional arrays but also in three-dimensional ones; moreover, the more links and parameters are taken into account, the more adequate and adaptive the modelling results become and, as a result, the more informative and strategically valuable they are. The article shows modelling possibilities that allow taking into account strategies and reactions in forming the marketing strategy under conditions of entering fast food market segments.

  10. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented.
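
    A minimal Python sketch of the modeling idea (invented traces; the study's actual state encoding and error measure may differ): estimate a first-order Markov chain from a play trace and score how predictable further play is by its average surprisal, which should be low for routinized behaviour:

    ```python
    import numpy as np

    def train_markov(seq, n_states):
        """First-order Markov transition matrix estimated from a trace."""
        counts = np.ones((n_states, n_states))      # Laplace smoothing
        for a, b in zip(seq, seq[1:]):
            counts[a, b] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    def avg_surprisal(seq, P):
        """Mean bits per action of a trace under the trained model."""
        return -np.mean([np.log2(P[a, b]) for a, b in zip(seq, seq[1:])])

    rng = np.random.default_rng(0)
    routine = [0, 1, 2] * 50                        # highly practiced loop
    erratic = list(rng.integers(0, 3, 150))         # exploratory play
    P = train_markov(routine, 3)
    print(avg_surprisal(routine, P), avg_surprisal(erratic, P))
    ```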

  11. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  12. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  13. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  14. Models of Regge behaviour in an asymptotically free theory

    International Nuclear Information System (INIS)

    Polkinghorne, J.C.

    1976-01-01

    Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that φ³ theory in six dimensions has an essential singularity at l = -1. The extension to gauge theories is discussed. (Auth.)

  15. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession, its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simplistic, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Mobile Music, Sensors, Physical Modeling, and Digital Fabrication: Articulating the Augmented Mobile Instrument

    Directory of Open Access Journals (Sweden)

    Romain Michon

    2017-12-01

    Two concepts are presented, extended, and unified in this paper: mobile device augmentation towards musical instrument design, and the concept of hybrid instruments. The first consists of using mobile devices at the heart of novel musical instruments. Smartphones and tablets are augmented with passive and active elements that can take part in the production of sound (e.g., resonators, exciters, etc.), add new affordances to the device, or change its global aesthetics and shape. Hybrid instruments combine physical/acoustical and “physically informed” virtual/digital elements. Recent progress in physical modeling of musical instruments and digital fabrication is exploited to treat instrument parts in a multidimensional way, allowing any physical element to be substituted with a virtual one and vice versa (as long as it is physically possible). A wide range of tools to design mobile hybrid instruments is introduced and evaluated. Aesthetic and design considerations when making such instruments are also presented through a series of examples.

  17. Model instruments of effective segmentation of the fast food market

    OpenAIRE

    Mityaeva Tetyana L.

    2013-01-01

    The article presents results of optimisation step-type calculations of economic effectiveness of promotion of fast food with consideration of key parameters of assessment of efficiency of the marketing strategy of segmentation. The article justifies development of a mathematical model on the bases of 3D-presentations and three-dimensional system of management variables. The modern applied mathematical packages allow formation not only of one-dimensional and two-dimensional arrays and analyse ...

  18. Frequency-Zooming ARMA Modeling for Analysis of Noisy String Instrument Tones

    Directory of Open Access Journals (Sweden)

    Paulo A. A. Esquef

    2003-09-01

    This paper addresses model-based analysis of string instrument sounds. In particular, it reviews the application of autoregressive (AR) modeling to sound analysis/synthesis purposes. Moreover, a frequency-zooming autoregressive moving average (FZ-ARMA) modeling scheme is described. The performance of the FZ-ARMA method on modeling the modal behavior of isolated groups of resonance frequencies is evaluated for both synthetic and real string instrument tones immersed in background noise. We demonstrate that the FZ-ARMA modeling is a robust tool to estimate the decay time and frequency of partials of noisy tones. Finally, we discuss the use of the method in synthesis of string instrument sounds.
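
    As a toy version of the decay-time estimation (plain AR modeling of one synthetic partial, without the frequency-zooming step; all parameters invented):

    ```python
    import numpy as np

    fs = 44100
    f0, tau = 440.0, 0.4         # hypothetical partial: 440 Hz, 0.4 s decay
    t = np.arange(int(0.5 * fs)) / fs
    x = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t) \
        + 1e-4 * np.random.randn(len(t))            # tone + background noise

    # AR(2) fit via the Yule-Walker equations
    r = np.correlate(x, x, "full")[len(x) - 1:len(x) + 2]
    a = np.linalg.solve([[r[0], r[1]], [r[1], r[0]]], r[1:3])
    poles = np.roots([1.0, -a[0], -a[1]])

    rho = np.abs(poles[0])                   # pole radius encodes the decay
    tau_est = -1.0 / (fs * np.log(rho))
    f_est = abs(fs * np.angle(poles[0]) / (2 * np.pi))
    print(f"decay ~ {tau_est:.2f} s, frequency ~ {f_est:.1f} Hz")
    ```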

  19. Extended Nambu models: Their relation to gauge theories

    Science.gov (United States)

    Escobar, C. A.; Urrutia, L. F.

    2017-05-01

    Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant, and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.

  20. Linear control theory for gene network modeling.

    Science.gov (United States)

    Shin, Yong-Jun; Bleris, Leonidas

    2010-09-16

    Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through several case studies, including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
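
    A minimal sketch of the kind of analysis described (our illustration, not the paper's code): two first-order "gene expression" stages in cascade, closed with unity negative feedback, analyzed with SciPy's transfer-function tools:

```python
import numpy as np
from scipy import signal

# Toy cascade of two first-order stages; gains and time constants are
# our assumed values, not the paper's.
num1, den1 = [1.0], [1.0, 1.0]        # G1(s) = 1/(s+1)
num2, den2 = [2.0], [1.0, 3.0]        # G2(s) = 2/(s+3)

# Cascade: G = G1*G2 (polynomial multiplication of numerators/denominators)
num = np.polymul(num1, num2)
den = np.polymul(den1, den2)

# Unity negative feedback: T = G / (1 + G)
den_cl = np.polyadd(den, num)
t, y = signal.step(signal.TransferFunction(num, den_cl))
print("closed-loop steady state ~", y[-1])   # approaches 2/(3+2) = 0.4
```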

  1. Polling models : from theory to traffic intersections

    NARCIS (Netherlands)

    Boon, M.A.A.

    2011-01-01

    The subject of the present monograph is the study of polling models, which are queueing models consisting of multiple queues, cyclically attended by one server. Polling models originated in the late 1950s, but did not receive much attention until the 1980s when an abundance of new applications arose
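
    For readers new to polling systems, here is a toy simulation (ours; the monograph itself is analytical) of a two-queue cyclic polling model with exhaustive service and switchover times:

```python
import numpy as np

# Tiny polling simulation; all rates and times are illustrative
# assumptions, not examples from the monograph.
rng = np.random.default_rng(0)
lam = np.array([0.3, 0.2])      # Poisson arrival rates per unit time
service, switchover = 1.0, 0.5  # deterministic, for simplicity
q = np.zeros(2, dtype=int)
t, served, station = 0.0, 0, 0

while t < 10_000:
    if q[station] > 0:                      # exhaustive service discipline
        q += rng.poisson(lam * service)     # arrivals during the service
        q[station] -= 1; served += 1; t += service
    else:
        q += rng.poisson(lam * switchover)  # arrivals during switchover
        station = (station + 1) % 2         # server cycles to next queue
        t += switchover

print("throughput ~", served / t)           # ~ lam.sum() = 0.5 when stable
```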

  2. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
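
    The flavor of the fluid-analogy approach can be conveyed with a one-inventory toy model (our sketch; the published model couples several inventories): an SCT construct such as self-efficacy fills from an input and drains with a time constant.

```python
# Illustrative sketch only; names, gain, and time constant are ours.
tau, g, dt = 5.0, 0.8, 0.1
x, xs = 0.0, []                      # x: "self-efficacy" inventory level
for k in range(400):
    u = 1.0 if k >= 50 else 0.0      # step intervention input at t = 5
    x += dt * (g * u - x) / tau      # first-order fluid-inventory dynamics
    xs.append(x)
print("steady-state level ~", xs[-1])   # approaches g*u = 0.8
```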

  3. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models that have a larger symmetry than the conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied, and the relations between the classical and quantum W algebras are specified. The bosonization method is presented, and the sl(2)/sl(2) topological model is studied. The bosonization of the partition functions of different models is described. A program of classification of rational theories is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.

  4. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    Science.gov (United States)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Learning that provides opportunities for the optimal development of students' creative skills can be achieved by implementing collaborative learning. Learners are challenged to compete, work independently to bring either individual or group excellence, and master the learning material. A virtual laboratory is used as the medium for Instrumental Analytical Chemistry lectures (Vis, UV-Vis, AAS, etc.) through computer simulation applications, and serves as a substitute for the laboratory when the equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures and to test the effectiveness of this design, adapting the Dick & Carey and Hannafin & Peck models. The development steps of this model are: needs analysis, design of collaborative-creative learning, virtual laboratory media using Macromedia Flash, formative evaluation, and a test of the learning model's effectiveness. The development stages of the collaborative-creative learning model are: apperception, exploration, collaboration, creation, evaluation, feedback. The collaborative-creative learning model using virtual laboratory media can be used to improve the quality of learning in the classroom and to overcome the limitation of lab instruments for real instrumental analysis. Formative test results show that the collaborative-creative learning model developed meets the requirements. The effectiveness test comparing students' pretest and posttest scores was significant at the 95% confidence level (the t statistic exceeded the tabled t value). It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.

  5. TSI Model 3936 Scanning Mobility Particle Spectrometer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Kuang, C. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-02-01

    The Model 3936 Scanning Mobility Particle Spectrometer (SMPS) measures the size distribution of aerosols ranging from 10 nm up to 1000 nm. The SMPS uses a bipolar aerosol charger to keep particles within a known charge distribution. Charged particles are classified according to their electrical mobility, using a long-column differential mobility analyzer (DMA). Particle concentration is measured with a condensation particle counter (CPC). The SMPS is well-suited for applications including: nanoparticle research, atmospheric aerosol studies, pollution studies, smog chamber evaluations, engine exhaust and combustion studies, materials synthesis, filter efficiency testing, nucleation/condensation studies, and rapidly changing aerosol systems.
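
    The classification principle the DMA relies on can be summarized by the standard electrical-mobility relation (a textbook formula, added here for reference; it is not spelled out in the handbook record):

```latex
Z_p = \frac{n e \, C_c(d_p)}{3 \pi \mu \, d_p} ,
```

    where $n$ is the number of elementary charges $e$ carried by the particle, $C_c$ is the Cunningham slip correction, $\mu$ is the gas viscosity, and $d_p$ is the particle diameter; scanning the DMA voltage selects successive mobility (hence size) bins, which the CPC then counts.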

  6. Three level constraints on conformal field theories and string models

    International Nuclear Information System (INIS)

    Lewellen, D.C.

    1989-05-01

    Simple tree level constraints for conformal field theories which follow from the requirement of crossing symmetry of four-point amplitudes are presented, and their utility for probing general properties of string models is briefly illustrated and discussed. 9 refs

  7. Nematic elastomers: from a microscopic model to macroscopic elasticity theory.

    Science.gov (United States)

    Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette

    2008-05-01

    A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.

  8. Item response theory and structural equation modelling for ordinal data: Describing the relationship between KIDSCREEN and Life-H.

    Science.gov (United States)

    Titman, Andrew C; Lancaster, Gillian A; Colver, Allan F

    2016-10-01

    Both item response theory and structural equation models are useful in the analysis of ordered categorical responses from health assessment questionnaires. We highlight the advantages and disadvantages of the item response theory and structural equation modelling approaches to modelling ordinal data, from within a community health setting. Using data from the SPARCLE project focussing on children with cerebral palsy, this paper investigates the relationship between two ordinal rating scales, the KIDSCREEN, which measures quality-of-life, and Life-H, which measures participation. Practical issues relating to fitting models, such as non-positive definite observed or fitted correlation matrices, and approaches to assessing model fit are discussed. Item response theory models allow properties such as the conditional independence of particular domains of a measurement instrument to be assessed. When, as with the SPARCLE data, the latent traits are multidimensional, structural equation models generally provide a much more convenient modelling framework. © The Author(s) 2013.
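
    To make the IRT side concrete, a minimal graded response model (our own sketch, not the authors' code) maps a latent trait value to category probabilities for one ordinal item:

```python
import numpy as np

# Graded response model: P(X >= k) follows a 2PL curve at each threshold;
# category probabilities are adjacent differences. Parameters are toy values.
def grm_probs(theta, a, b):
    """P(X = k | theta) for one item with discrimination a and ordered
    thresholds b[0] < b[1] < ... (K-1 thresholds => K categories)."""
    b = np.asarray(b, dtype=float)
    p_ge = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    p_ge = np.concatenate(([1.0], p_ge, [0.0]))
    return p_ge[:-1] - p_ge[1:]

print(grm_probs(theta=0.5, a=1.8, b=[-1.0, 0.0, 1.2]))  # sums to 1
```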

  9. Models and error analyses of measuring instruments in accountability systems in safeguards control

    International Nuclear Information System (INIS)

    Dattatreya, E.S.

    1977-05-01

    Essentially three types of measuring instruments are used in plutonium accountability systems: (1) the bubblers, for measuring the total volume of liquid in the holding tanks; (2) coulometers, titration apparatus and calorimeters, for measuring the concentration of plutonium; and (3) spectrometers, for measuring isotopic composition. These three classes of instruments are modeled and analyzed. Finally, the uncertainty in the estimation of total plutonium in the holding tank is determined.
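
    The final uncertainty combination rests on first-order error propagation; for a simplified two-factor version of the models described, with tank inventory $m = V c$ (volume times concentration),

```latex
\left(\frac{\sigma_m}{m}\right)^2 \approx
\left(\frac{\sigma_V}{V}\right)^2 + \left(\frac{\sigma_c}{c}\right)^2 ,
```

    with an isotopic-composition term entering the same way when the total is converted to a specific isotope. (This is the standard propagation formula, added here for reference.)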

  10. Validation of self-directed learning instrument and establishment of normative data for nursing students in taiwan: using polytomous item response theory.

    Science.gov (United States)

    Cheng, Su-Fen; Lee-Hsieh, Jane; Turton, Michael A; Lin, Kuan-Chia

    2014-06-01

    Little research has investigated the establishment of norms for nursing students' self-directed learning (SDL) ability, recognized as an important capability for professional nurses. An item response theory (IRT) approach was used to establish norms for SDL abilities valid for the different nursing programs in Taiwan. The purposes of this study were (a) to use IRT with a graded response model to reexamine the SDL instrument, or the SDLI, originally developed by this research team using confirmatory factor analysis and (b) to establish SDL ability norms for the four different nursing education programs in Taiwan. Stratified random sampling with probability proportional to size was used. A minimum of 15% of students from the four different nursing education degree programs across Taiwan was selected. A total of 7,879 nursing students from 13 schools were recruited. The research instrument was the 20-item SDLI developed by Cheng, Kuo, Lin, and Lee-Hsieh (2010). IRT with the graded response model was used with a two-parameter logistic model (discrimination and difficulty) for the data analysis, calculated using MULTILOG. Norms were established using percentile rank. Analysis of item information and test information functions revealed that 18 items exhibited very high discrimination and two items had high discrimination. The test information function was higher in this range of scores, indicating greater precision in the estimate of nursing student SDL. Reliability fell between .80 and .94 for each domain and the SDLI as a whole. The total information function shows that the SDLI is appropriate for all nursing students, except for the top 2.5%. SDL ability norms were established for each nursing education program and for the nation as a whole. IRT is shown to be a potent and useful methodology for scale evaluation. The norms for SDL established in this research will provide practical standards for nursing educators and students in Taiwan.

  11. Soliton excitations in a class of nonlinear field theory models

    International Nuclear Information System (INIS)

    Makhan'kov, V.G.; Fedyanin, V.K.

    1985-01-01

    Results of an investigation of nonlinear field theory models with a Lagrangian are described. The theory includes models both with a stable zero vacuum (ε=1) and with a condensate (ε=-1, i.e., broken symmetry). Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied. PLS form factors are calculated. The statistical mechanics of solitons is built and their dynamic structure factors are calculated.

  12. Two-matrix models and c =1 string theory

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong Chuansheng

    1994-05-01

    We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)

  13. Planar N = 4 gauge theory and the Hubbard model

    International Nuclear Information System (INIS)

    Rej, Adam; Serban, Didina; Staudacher, Matthias

    2006-01-01

    Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model
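
    For reference (added by us; the record does not spell it out), the Hubbard model in question is the standard one,

```latex
H = -t \sum_{\langle i,j \rangle, \sigma}
\left( c_{i\sigma}^{\dagger} c_{j\sigma} + \mathrm{h.c.} \right)
+ U \sum_{i} n_{i\uparrow} n_{i\downarrow} ,
```

    with hopping amplitude $t$ and on-site repulsion $U$.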

  14. A multi-species exchange model for fully fluctuating polymer field theory simulations.

    Science.gov (United States)

    Düchs, Dominik; Delaney, Kris T; Fredrickson, Glenn H

    2014-11-07

    Field-theoretic models have been used extensively to study the phase behavior of inhomogeneous polymer melts and solutions, both in self-consistent mean-field calculations and in numerical simulations of the full theory capturing composition fluctuations. The models commonly used can be grouped into two categories, namely, species models and exchange models. Species models involve integrations of functionals that explicitly depend on fields originating both from species density operators and their conjugate chemical potential fields. In contrast, exchange models retain only linear combinations of the chemical potential fields. In the two-component case, development of exchange models has been instrumental in enabling stable complex Langevin (CL) simulations of the full complex-valued theory. No comparable stable CL approach has yet been established for field theories of the species type. Here, we introduce an extension of the exchange model to an arbitrary number of components, namely, the multi-species exchange (MSE) model, which greatly expands the classes of soft material systems that can be accessed by the complex Langevin simulation technique. We demonstrate the stability and accuracy of the MSE-CL sampling approach using numerical simulations of triblock and tetrablock terpolymer melts, and tetrablock quaterpolymer melts. This method should enable studies of a wide range of fluctuation phenomena in multiblock/multi-species polymer blends and composites.

  15. Scattering and short-distance properties in field theory models

    International Nuclear Information System (INIS)

    Iagolnitzer, D.

    1987-01-01

    The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap, such as typically P(φ) in dimension 2, φ^4 in dimension 3 and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non renormalizable) theories that might be defined in a similar way via phase-space analysis.

  16. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13-dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  17. Consumer preference models: fuzzy theory approach

    Science.gov (United States)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  18. Narrative theories as computational models: reader-oriented theory and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, P.

    1983-12-01

    In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory share many common interests and problems and both studies might benefit from an exchange of ideas. 11 references.

  19. The Modeling and Complexity of Dynamical Systems by Means of Computation and Information Theories

    Directory of Open Access Journals (Sweden)

    Robert Logozar

    2011-12-01

    Full Text Available We present the modeling of dynamical systems and the finding of their complexity indicators by the use of concepts from computation and information theories, within the framework of J. P. Crutchfield's theory of ε-machines. A short formal outline of the ε-machines is given. In this approach, dynamical systems are analyzed directly from the time series that is received from a properly adjusted measuring instrument. The binary strings are parsed through the parse tree, within which morphologically and probabilistically unique subtrees or morphs are recognized as system states. The outline and precise interrelation of the information-theoretic entropies and complexities emanating from the model are given. The paper also serves as a theoretical foundation for the future presentation of the DSA program that implements ε-machine modeling up to the stochastic finite automata level.
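
    The entropies in question are block entropies of the measured symbol sequence; a minimal computation (our sketch, not the DSA program) looks like this:

```python
import math
from collections import Counter

# Block entropies H(L) of a binary series: the raw information-theoretic
# quantities an epsilon-machine reconstruction starts from.
def block_entropy(s, L):
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

series = "0110101101101011" * 50        # toy "measurement" string
for L in (1, 2, 3, 4):
    print(L, round(block_entropy(series, L), 3))
```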

  20. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  1. Membrane models and generalized Z2 gauge theories

    International Nuclear Information System (INIS)

    Lowe, M.J.; Wallace, D.J.

    1980-01-01

    We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in a 1/n expansion. The potential relationships of these models to generalized Z_2 gauge theories are indicated. (orig.)

  2. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  3. Linear control theory for gene network modeling.

    Directory of Open Access Journals (Sweden)

    Yong-Jun Shin

    Full Text Available Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through several case studies, including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.

  4. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  5. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.

  6. Model Engine Performance Measurement From Force Balance Instrumentation

    Science.gov (United States)

    Jeracki, Robert J.

    1998-01-01

    A large scale model representative of a low-noise, high bypass ratio turbofan engine was tested for acoustics and performance in the NASA Lewis 9- by 15-Foot Low-Speed Wind Tunnel. This test was part of NASA's continuing Advanced Subsonic Technology Noise Reduction Program. The low tip speed fan, nacelle, and an un-powered core passage (with core inlet guide vanes) were simulated. The fan blades and hub are mounted on a rotating thrust and torque balance. The nacelle, bypass duct stators, and core passage are attached to a six component force balance. The two balance forces, when corrected for internal pressure tares, measure the total thrust-minus-drag of the engine simulator. Corrected for scaling and other effects, it is basically the same force that the engine supports would feel, operating at similar conditions. A control volume is shown and discussed, identifying the various force components of the engine simulator thrust and definitions of net thrust. Several wind tunnel runs with nearly the same hardware installed are compared, to identify the repeatability of the measured thrust-minus-drag. Other wind tunnel runs, with hardware changes that affected fan performance, are compared to the baseline configuration, and the thrust and torque effects are shown. Finally, a thrust comparison between the force balance and nozzle gross thrust methods is shown, and both yield very similar results.

  7. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  8. Baldrige Theory into Practice: A Generic Model

    Science.gov (United States)

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer-centric system. The Malcolm Baldrige Quality Award (MBQA) model is one of the latest additions to the pull-based models. The purpose of this paper is to develop a generic framework for the MBQA that can be used by…

  9. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, there has been considerable work recently on consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
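
    As a baseline for comparison, the classical linear two-stage least squares (2SLS) estimator that the paper's generalized least squares tests extend can be sketched as follows (the toy data, SNP effects, and confounding structure are our assumptions):

```python
import numpy as np

# Toy Mendelian-randomization setup: 3 SNP instruments Z, exposure X,
# outcome Y, unmeasured confounder U. True causal effect is 0.5.
rng = np.random.default_rng(0)
n = 5000
Z = rng.binomial(2, 0.3, size=(n, 3)).astype(float)
U = rng.normal(size=n)
X = Z @ np.array([0.3, 0.2, 0.4]) + U + rng.normal(size=n)
Y = 0.5 * X + U + rng.normal(size=n)

# Stage 1: regress X on Z; Stage 2: regress Y on the fitted X
Z1 = np.column_stack((np.ones(n), Z))
Xhat = Z1 @ np.linalg.lstsq(Z1, X, rcond=None)[0]
D = np.column_stack((np.ones(n), Xhat))
beta = np.linalg.lstsq(D, Y, rcond=None)[0]
print("2SLS causal effect estimate ~", beta[1])   # near 0.5
```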

  10. Optimal transportation networks models and theory

    CERN Document Server

    Bernot, Marc; Morel, Jean-Michel

    2009-01-01

    The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It is mathematically proven that the majority of these fit within the simple model sketched in this volume.

  11. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    Directory of Open Access Journals (Sweden)

    Paul Branscum

    2016-06-01

    Full Text Available Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of the child's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine the nutrition-related behaviors that mothers found most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis) and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables and SSB.

  12. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    Science.gov (United States)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes, which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and have to ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored

  13. The Relevance of Using Mathematical Models in Macroeconomic Policies Theory

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-11-01

    Full Text Available The article presents an overview of the principal mathematical models – starting with the work of Theil, Hansen and Tinbergen – and their results, used to analyse and design macroeconomic policies. In the modeling field, changes are very fast both in the theoretical aspects of modeling the many problems of macroeconomic policy and in the practical use of different policy models. The article points out the problems of static and dynamic theory used in macro-policy modeling.

  14. The Relevance of Using Mathematical Models in Macroeconomic Policies Theory

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-09-01

    Full Text Available The article presents an overview of the principal mathematical models – starting with the work of Theil, Hansen and Tinbergen – and their results, used to analyse and design macroeconomic policies. In the modeling field, changes are very fast both in the theoretical aspects of modeling the many problems of macroeconomic policy and in the practical use of different policy models. The article points out the problems of static and dynamic theory used in macro-policy modeling.

  15. Fire and Heat Spreading Model Based on Cellular Automata Theory

    Science.gov (United States)

    Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.

    2018-05-01

    The distinctive feature of the proposed model of fire and heat spreading in premises is its reduced computational complexity, achieved by using cellular automata with probabilistic transition rules. The possibilities and prospects of using this model in practice are noted. The proposed model has a simple mechanism of integration with agent-based evacuation models. The joint use of these models could improve floor plans and reduce the time needed to evacuate premises during fires.
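
    A toy probabilistic cellular automaton conveys the mechanism (the neighborhood, spread probability, and one-step burn time below are our assumptions, not the paper's parameters):

```python
import numpy as np

# Minimal probabilistic CA for fire spread on a 50x50 grid.
rng = np.random.default_rng(1)
EMPTY, BURNING, BURNT = 0, 1, 2
grid = np.zeros((50, 50), dtype=int)
grid[25, 25] = BURNING
p_spread = 0.35                          # per-neighbor ignition probability

for step in range(60):
    nxt = grid.copy()
    for i, j in np.argwhere(grid == BURNING):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < 50 and 0 <= nj < 50 and grid[ni, nj] == EMPTY:
                if rng.random() < p_spread:
                    nxt[ni, nj] = BURNING
        nxt[i, j] = BURNT                # a cell burns for one time step
    grid = nxt

print("burnt cells:", int((grid == BURNT).sum()))
```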

  16. Matrix model as a mirror of Chern-Simons theory

    International Nuclear Information System (INIS)

    Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun

    2004-01-01

    Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)

  17. The development and psychometric testing of a theory-based instrument to evaluate nurses' perception of clinical reasoning competence.

    Science.gov (United States)

    Liou, Shwu-Ru; Liu, Hsiu-Chen; Tsai, Hsiu-Min; Tsai, Ying-Huang; Lin, Yu-Ching; Chang, Chia-Hao; Cheng, Ching-Yu

    2016-03-01

    The purpose of the study was to develop and psychometrically test the Nurses Clinical Reasoning Scale. Clinical reasoning is an essential skill for providing safe and quality patient care. Identifying pre-graduates' and nurses' needs and designing training courses to improve their clinical reasoning competence becomes a critical task. However, there is no instrument focusing on clinical reasoning in the nursing profession. A cross-sectional design was used. This study included the development of the scale, a pilot study that preliminarily tested the readability and reliability of the developed scale, and a main study that implemented and tested the psychometric properties of the developed scale. The Nurses Clinical Reasoning Scale was developed based on the Clinical Reasoning Model. The scale includes 15 items rated on a five-point Likert scale. Data were collected from 2013 to 2014. Two hundred and fifty-one participants comprising clinical nurses and nursing pre-graduates completed and returned the questionnaires in the main study. The instrument was tested for internal consistency and test-retest reliability. Its validity was tested with content, construct and known-groups validity. One factor emerged from the factor analysis. The known-groups validity was confirmed. The Cronbach's alpha for the entire instrument was 0.9. The reliability and validity of the Nurses Clinical Reasoning Scale were supported. The scale is a useful tool and can be easily administered for the self-assessment of the clinical reasoning competence of clinical nurses and future baccalaureate nursing graduates. Study limitations and further recommendations are discussed. © 2015 John Wiley & Sons Ltd.

  18. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  19. Nonconvex Model of Material Growth: Mathematical Theory

    Science.gov (United States)

    Ganghoffer, J. F.; Plotnikov, P. I.; Sokolowski, J.

    2018-06-01

    The model of volumetric material growth is introduced in the framework of finite elasticity. The new results obtained for the model are presented with complete proofs. The state variables include the deformations, temperature and the growth factor matrix function. The existence of global in time solutions for the quasistatic deformations boundary value problem coupled with the energy balance and the evolution of the growth factor is shown. The mathematical results can be applied to a wide class of growth models in mechanics and biology.

  20. Solid mechanics theory, modeling, and problems

    CERN Document Server

    Bertram, Albrecht

    2015-01-01

    This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.

  1. Modeling workplace bullying using catastrophe theory.

    Science.gov (United States)

    Escartin, J; Ceja, L; Navarro, J; Zapf, D

    2013-10-01

    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
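
    For reference (our addition, following the canonical form rather than this record's text), a cusp catastrophe model places the equilibria of the behavioral outcome $x$ at the roots of

```latex
\frac{\partial V}{\partial x} = x^{3} - \beta x - \alpha = 0 ,
\qquad
V(x) = \tfrac{1}{4}x^{4} - \tfrac{1}{2}\beta x^{2} - \alpha x ,
```

    where the asymmetry factor $\alpha$ and bifurcation factor $\beta$ are built from the predictors (here, psychosocial safety climate and victimization); sudden jumps in $x$ occur where roots appear or vanish.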

  2. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    Science.gov (United States)

    Gagnier, Donald; Hayner, Rick; Nosek, Thomas; Roza, Michael; Hendershot, James E.; Razzaghi, Andrea I.

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric scientific instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments and the Aura spacecraft bus electronics. Aura is one of NASA s Earth Observatory System missions. The test was designed to evaluate the complex interfaces in the command and data handling subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during the flight integration phase of the observatory can cause significant cost and schedule impacts. The tests successfully revealed problems and led to their resolution before the full-up integration phase, saving significant cost and schedule. This approach could be beneficial for future environmental satellite programs involving the integration of multiple, complex scientific instruments onto a spacecraft bus.

  3. TYCHO Brahe's Empiric Methods, His Instruments, His Sudden Escape from Denmark and a New Theory About His Death

    Science.gov (United States)

    Thykier, C.

    1992-07-01

    Tycho Brahe (1546-1601) was born a nobleman, the son of Otto Brahe, a member of the Royal Danish Council. Very early he developed a great interest in science and especially astronomy. In 1575 Tycho visited the learned Prince Wilhelm II in Kassel. Here he was inspired by the famous instrument maker Burgi to build new precise astronomical instruments, and on Wilhelm's recommendation King Frederic II of Denmark gave Tycho the island Hven (which at that time belonged to Denmark) as an entailed estate. At 26 years old, Tycho became famous for his work DE NOVA STELLA on the supernova that brightened up in 1572; since this phenomenon kept its position fixed among the stars, it immediately invalidated the Aristotelian dogma of the invariability of the fixed-star world. In 1577 Tycho observed the great comet and followed its celestial motion by means of a quadrant and a sextant. He came to the conclusion that the comet's orbit lay far out among the planets, in contradiction to the Aristotelian dogma of the crystal spheres for the planets. However, Tycho's great contribution to science was his construction of the observatory buildings Uraniborg and Stjerneborg ("Star Castle") with their equipment of ancient sighting instruments, and his use of these instruments, without telescopes, for observations of the planets over a period of almost 20 years. Tycho's work is collected in the 15-volume OPERA OMNIA edited by J. L. E. Dreyer. Tycho also mapped Hven correctly, and he triangulated both sides of Oresund relative to Hven. When Tycho moved to Prague in 1599 he lived there for a couple of years and met Kepler, who became his assistant and collaborator. Kepler was the one who analyzed Tycho's material and derived the Keplerian laws for the motions of the planets. On this basis Newton derived the law of gravitation. Tycho Brahe has been considered the father of modern empirical science. In 1596 he was accused of negligence of his administrative duties and several other things.

  4. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game-theoretical tools, highlighting situations where a location decision is faced by several decision makers, leading to a game-theoretical framework with non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored, and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working on the theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  5. Electrorheological fluids modeling and mathematical theory

    CERN Document Server

    Růžička, Michael

    2000-01-01

    This is the first book to present a model, based on rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDE systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and a nontrivial nonlinear right-hand side, as well as the first ever results for parabolic systems with non-standard growth conditions, are given. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.

  6. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory.

    Science.gov (United States)

    Komashie, Alexander; Mousavi, Ali; Clarkson, P John; Young, Terry

    2015-01-01

    This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting times. Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed using queuing theory. This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely experience and efficiency in care delivery, and leading to a more holistic approach to designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation.
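
    A back-of-envelope M/M/1 special case (our simplification; the paper's model is more general) shows how waiting and service times enter such satisfaction models:

```python
# M/M/1 queue: arrival rate lam, service rate mu (both per hour, assumed).
lam, mu = 4.0, 5.0
rho = lam / mu                 # server utilization
Wq = rho / (mu - lam)          # mean waiting time in queue (hours)
W = 1.0 / (mu - lam)           # mean time in system (hours)
print(f"utilization={rho:.2f}, Wq={Wq*60:.1f} min, W={W*60:.1f} min")
```

    Patient satisfaction would then be modeled as a decreasing function of Wq, and staff satisfaction as a function of the service time 1/mu, which is the coupling the effective satisfaction level model formalizes.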

  7. Density functional theory and multiscale materials modeling

    Indian Academy of Sciences (India)

    One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.

  8. Results of the first tests of the SIDRA satellite-borne instrument breadboard model

    International Nuclear Information System (INIS)

    Dudnik, O.V.; Kurbatov, E.V.; Avilov, A.M.; Titov, K.G.; Prieto, M; Sanchez, S.; Spassky, A.V.; Sylwester, J.; Gburek, S.; Podgorski, P.

    2013-01-01

    In this work, the results of the calibration of the solid-state detectors and electronic channels of the SIDRA satellite borne energetic charged particle spectrometer-telescope breadboard model are presented. The block schemes and experimental equipment used to conduct the thermal vacuum and electromagnetic compatibility tests of the assemblies and modules of the compact satellite equipment are described. The results of the measured thermal conditions of operation of the signal analog and digital processing critical modules of the SIDRA instrument prototype are discussed. Finally, the levels of conducted interference generated by the instrument model in the primary vehicle-borne power circuits are presented.

  9. Toda theories, W-algebras, and minimal models

    International Nuclear Information System (INIS)

    Mansfield, P.; Spence, B.

    1991-01-01

    We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)

  10. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g., cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  11. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to their invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  12. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  13. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...

  14. An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    International Nuclear Information System (INIS)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-01-01

    We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  15. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  16. Perturbation theory instead of large scale shell model calculations

    International Nuclear Information System (INIS)

    Feldmeier, H.; Mankos, P.

    1977-01-01

    Results of large scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory. The comparison shows that perturbation theory provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation theory treatment in an SU(3) basis including 2ℏω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.) [de

  17. Scaling theory of depinning in the Sneppen model

    International Nuclear Information System (INIS)

    Maslov, S.; Paczuski, M.

    1994-01-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥(d) and ν_⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor.
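
    To make the extremal dynamics behind the "gap" equation concrete, here is a minimal sketch of a Sneppen-style self-organized depinning update; the lattice size, number of updates, and the SOS-constraint bookkeeping are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Sketch of Sneppen extremal dynamics: always advance the site with the
# weakest random pinning force, then restore the solid-on-solid constraint.
rng = np.random.default_rng(0)
L, steps = 256, 10_000
h = np.zeros(L, dtype=int)      # interface heights
f = rng.random(L)               # random pinning forces
gap = 0.0                       # running "gap": largest extremal force so far

for _ in range(steps):
    i = int(np.argmin(f))       # extremal site: weakest pinning force
    gap = max(gap, f[i])        # the gap equation tracks this envelope
    h[i] += 1
    f[i] = rng.random()
    stack = [i]                 # restore |h[j] - h[k]| <= 1 for neighbours
    while stack:
        j = stack.pop()
        for k in ((j - 1) % L, (j + 1) % L):
            if h[j] - h[k] > 1:
                h[k] += 1
                f[k] = rng.random()
                stack.append(k)

print(f"gap after {steps} updates: {gap:.3f}")  # approaches the threshold from below
```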

  18. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.J.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
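
    As a rough illustration of the FST analysis step described above, the sketch below evaluates a Kirkwood-Buff-style fluctuation integral from a radial distribution function; the g(r) is synthetic and the truncation radius is an assumption, since in practice both would come from the MD trajectory.

```python
import numpy as np

# Kirkwood-Buff integral G_ij = 4*pi * int (g_ij(r) - 1) r^2 dr, truncated
# at a finite radius; the toy g(r) mimics a single first-shell peak.
r = np.linspace(0.01, 2.0, 400)                   # separation, nm
g = 1.0 + 0.3 * np.exp(-(r - 0.35) ** 2 / 0.005)  # synthetic RDF

integrand = (g - 1.0) * r**2
# trapezoidal rule written out explicitly
G_ij = 4.0 * np.pi * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))
print(f"Kirkwood-Buff integral G_ij = {G_ij:.4f} nm^3")
```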

  19. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  20. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate p-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  1. A decision model for financial assurance instruments in the upstream petroleum sector

    International Nuclear Information System (INIS)

    Ferreira, Doneivan; Suslick, Saul; Farley, Joshua; Costanza, Robert; Krivov, Sergey

    2004-01-01

    The main objective of this paper is to deepen the discussion regarding the application of financial assurance instruments, bonds, in the upstream oil sector. This paper will also attempt to explain the current choice of instruments within the sector. The concepts of environmental damages and internalization of environmental and regulatory costs will be briefly explored. Bonding mechanisms are presently being adopted by several governments with the objective of guaranteeing the availability of funds for end-of-leasing operations. Regulators are mainly concerned with the prospect of inheriting liabilities from lessees. Several forms of bonding instruments currently available were identified and a new instrument classification was proposed. Ten commonly used instruments were selected and analyzed under the perspective of both regulators and industry (surety, paid-in and periodic-payment collateral accounts, letters of credit, self-guarantees, investment grade securities, real estate collaterals, insurance policies, pools, and special funds). A multiattribute value function model was then proposed to examine current instrument preferences. Preliminary simulations confirm the current scenario where regulators are likely to require surety bonds, letters of credit, and periodic payment collateral account tools
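
    To illustrate the kind of multiattribute value function such a decision model uses, here is a hypothetical scoring sketch; the attributes, weights, and instrument scores are invented for the example and do not reproduce the paper's calibration.

```python
# Additive multiattribute value model: each bonding instrument is scored on
# normalized attributes (0-1, oriented so higher is better) and weighted
# from one stakeholder's perspective.
weights = {"liquidity": 0.4, "cost_to_lessee": 0.3, "default_protection": 0.3}

instruments = {
    "surety bond":      {"liquidity": 0.8, "cost_to_lessee": 0.7, "default_protection": 0.6},
    "letter of credit": {"liquidity": 0.9, "cost_to_lessee": 0.5, "default_protection": 0.8},
    "self-guarantee":   {"liquidity": 0.3, "cost_to_lessee": 0.9, "default_protection": 0.2},
}

def value(scores: dict) -> float:
    """Weighted sum of single-attribute values."""
    return sum(weights[a] * s for a, s in scores.items())

# Rank instruments by overall value, best first.
for name, scores in sorted(instruments.items(), key=lambda kv: -value(kv[1])):
    print(f"{name:16s} V = {value(scores):.2f}")
```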

  2. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.

    2011-01-01

    In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model (FVDM). The linear stability condition of the new model is obtained by using the linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of the negative velocity that occurs at small sensitivity coefficient λ in the FVDM by adjusting the coefficient of the optimal velocity difference, which shows that collisions can disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The effects of the optimal velocity difference can avoid the disadvantage of negative velocity.
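
    For orientation, here is a minimal sketch of the full velocity difference (FVD) car-following law that the Letter builds on, dv/dt = a[V(Δx) − v] + λΔv; the OVDM, as the abstract describes it, replaces the raw velocity-difference term with one driven by the optimal velocity difference. The tanh optimal-velocity function and all parameter values below are standard illustrative choices, not the paper's calibration.

```python
import numpy as np

def V_opt(dx, v_max=2.0, h_c=4.0):
    """Optimal velocity as a function of headway dx (Bando-type form)."""
    return 0.5 * v_max * (np.tanh(dx - h_c) + np.tanh(h_c))

N, dt = 50, 0.1                        # number of cars and time step
a, lam = 1.0, 0.5                      # sensitivity and velocity-difference gain
x = np.arange(N, 0, -1) * 5.0          # positions, leader first, 5 m headways
v = np.full(N, V_opt(5.0))             # start at the optimal velocity

for _ in range(2000):
    dx = np.empty(N); dv = np.empty(N)
    dx[1:] = x[:-1] - x[1:]            # headway to the car ahead
    dv[1:] = v[:-1] - v[1:]            # relative velocity to the car ahead
    dx[0], dv[0] = 5.0, 0.0            # virtual leader keeps constant headway
    acc = a * (V_opt(dx) - v) + lam * dv
    v = np.maximum(v + acc * dt, 0.0)  # crude guard against negative velocity
    x += v * dt
```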

  3. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  4. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, including basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction to flame emission spectrometry and plasma emission spectrometry; and other topics such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental analysis methods such as radiochemistry.

  5. Instrumental analysis

    International Nuclear Information System (INIS)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-01

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, including basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction to flame emission spectrometry and plasma emission spectrometry; and other topics such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental analysis methods such as radiochemistry.

  6. New earth system model for optical performance evaluation of space instruments.

    Science.gov (United States)

    Ryu, Dongok; Kim, Sug-Whan; Breault, Robert P

    2017-03-06

    In this study, a new global earth system model is introduced for evaluating the optical performance of space instruments. Simultaneous imaging and spectroscopic results are provided using this global earth system model with fully resolved spatial, spectral, and temporal coverage of sub-models of the Earth. The sun sub-model is a Lambertian scattering sphere with a 6-h scale and 295 lines of solar spectral irradiance. The atmospheric sub-model has a 15-layer three-dimensional (3D) ellipsoid structure. The land sub-model uses spectral bidirectional reflectance distribution functions (BRDF) defined by a semi-empirical parametric kernel model. The ocean is modeled with the ocean spectral albedo after subtracting the total integrated scattering of the sun-glint scatter model. A hypothetical two-mirror Cassegrain telescope with a 300-mm-diameter aperture and a 21.504 mm × 21.504 mm focal-plane imaging instrument is designed. The simulated image results are compared with observational data from HRI-VIS measurements during the EPOXI mission for approximately 24 h from UTC Mar. 18, 2008. The defocus mapping and edge spread function (ESF) measurement results show that the distance between the primary and secondary mirror increases by 55.498 μm from the diffraction-limited condition. The shift of the focal plane is determined to be 5.813 mm shorter than that of the defocused focal plane, and this result is confirmed through the estimation of point spread function (PSF) measurements. This study shows that the earth system model combined with an instrument model is a powerful tool that can greatly help the development phase of instrument missions.

  7. Applications of generalizability theory and their relations to classical test theory and structural equation modeling.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-03-01

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
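
    As a concrete illustration of the G-theory machinery the article surveys, the following sketch estimates variance components for a one-facet persons-by-items design and computes a generalizability coefficient; the data are simulated, and this is the simplest possible design rather than any of the article's applications.

```python
import numpy as np

# One-facet (persons x items) G-study via expected mean squares.
rng = np.random.default_rng(1)
n_p, n_i = 200, 10
person = rng.normal(0, 1.0, (n_p, 1))               # true person effects
item = rng.normal(0, 0.5, (1, n_i))                 # item effects
X = person + item + rng.normal(0, 0.8, (n_p, n_i))  # observed scores

ms_p = n_i * X.mean(axis=1).var(ddof=1)             # persons mean square
ms_i = n_p * X.mean(axis=0).var(ddof=1)             # items mean square
resid = X - X.mean(1, keepdims=True) - X.mean(0, keepdims=True) + X.mean()
ms_pi = (resid**2).sum() / ((n_p - 1) * (n_i - 1))  # interaction/error MS

var_pi = ms_pi                                      # sigma^2(pi,e)
var_p = (ms_p - ms_pi) / n_i                        # sigma^2(p)
var_i = (ms_i - ms_pi) / n_p                        # sigma^2(i)

# Generalizability coefficient for relative decisions with n_i items.
g_coef = var_p / (var_p + var_pi / n_i)
print(f"s2_p={var_p:.3f} s2_i={var_i:.3f} s2_pi={var_pi:.3f} G={g_coef:.3f}")
```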

  8. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    Science.gov (United States)

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  9. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  10. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  11. M-theory model-building and proton stability

    International Nuclear Information System (INIS)

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory

  12. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  13. Theory to practice: the humanbecoming leading-following model.

    Science.gov (United States)

    Ursel, Karen L

    2015-01-01

    Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory-to-practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden students' understanding of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.

  14. Theory of Time beyond the standard model

    International Nuclear Information System (INIS)

    Poliakov, Eugene S.

    2008-01-01

    A frame of non-uniform time is discussed. A concept of 'flow of time' is presented. The principle of time relativity, in analogy with the Galilean principle of relativity, is set out. The equivalence principle is stated as follows: the outcome of non-uniform time in an inertial frame of reference is equivalent to the outcome of a fictitious gravity force external to the frame of reference. Thus it is the flow of time that causes gravity rather than mass. The latter claim is compared to experimental data, achieving precision of up to 0.0003%. It is shown that the law of energy conservation is inapplicable to frames of non-uniform time. A theoretical model of a physical entity (point mass, photon) travelling in a field of non-uniform time is considered. A generalized law that allows the flow of time to replace classical energy conservation is introduced on the basis of the experiment of Pound and Rebka. It is shown that a linear dependence of the flow of time on the spatial coordinate conforms to the inverse square law of universal gravitation and Keplerian mechanics. Momentum is shown to still be conserved

  15. Standard Model theory calculations and experimental tests

    International Nuclear Information System (INIS)

    Cacciari, M.; Hamel de Monchenault, G.

    2015-01-01

    To present knowledge, all the physics at the Large Hadron Collider (LHC) can be described in the framework of the Standard Model (SM) of particle physics. Indeed the newly discovered Higgs boson with a mass close to 125 GeV seems to confirm the predictions of the SM. Thus, besides looking for direct manifestations of the physics beyond the SM, one of the primary missions of the LHC is to perform ever more stringent tests of the SM. This requires not only improved theoretical developments to produce testable predictions and provide experiments with reliable event generators, but also sophisticated analysis techniques to overcome the formidable experimental environment of the LHC and perform precision measurements. In the first section, we describe the state of the art of the theoretical tools and event generators that are used to provide predictions for the production cross sections of the processes of interest. In section 2, inclusive cross section measurements with jets, leptons and vector bosons are presented. Examples of differential cross sections, charge asymmetries and the study of lepton pairs are proposed in section 3. Finally, in section 4, we report studies on the multiple production of gauge bosons and constraints on anomalous gauge couplings.

  16. Developing Learning Model Based on Local Culture and Instrument for Mathematical Higher Order Thinking Ability

    Science.gov (United States)

    Saragih, Sahat; Napitupulu, E. Elvis; Fauzi, Amin

    2017-01-01

    This research aims to develop a student-centered learning model based on local culture and an instrument for mathematical higher order thinking of junior high school students in the frame of the 2013 Curriculum in North Sumatra, Indonesia. The subjects of the research are seventh graders, taken proportionally at random from three public…

  17. Realization of computer-controlled CAMAC model through the technology of virtual instrument

    International Nuclear Information System (INIS)

    Le Yi; Li Cheng; Liao Juanjuan; Zhou Xin

    1997-01-01

    The author introduces the virtual instrument system and the basic features of its typical software development platform, and shows the system's superiority and suitability for physics experiments through the example of the CAMAC module ADC2249A, which is often used in nuclear physics experiments

  18. Cost prediction model for various payloads and instruments for the Space Shuttle Orbiter

    Science.gov (United States)

    Hoffman, F. E.

    1984-01-01

    The objectives of this study were: (1) to develop a cost prediction model for various payload classes of instruments and experiments for the Space Shuttle Orbiter; and (2) to show the implications of various payload classes on the cost of reliability analysis, quality assurance, environmental design requirements, documentation, parts selection, and other reliability enhancing activities.

  19. A mathematical model for describing the mechanical behaviour of root canal instruments.

    Science.gov (United States)

    Zhang, E W; Cheung, G S P; Zheng, Y F

    2011-01-01

    The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the (geometry of) loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as a function of these geometric parameters, the modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending conditions by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that mathematical models are a feasible method to analyse the mechanical properties and predict the stress and deformation of root canal instruments during root canal preparation. Mathematical and numerical models can be a suitable way to examine mechanical behaviour as a criterion for instrument design and to predict the stress and strain experienced by endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.
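
    For a feel of the quantities involved, here is a back-of-envelope sketch of maximum bending and torsional stresses for an equivalent circular cross-section at a point along the taper; the dimensions, loads, and the circular-section simplification are hypothetical, whereas the paper derives these quantities for the actual non-circular file geometries.

```python
import math

# Local diameter from tip size plus taper, then sigma = M c / I and
# tau = T r / J for a solid circular section.
d_tip, taper, L_pos = 0.25e-3, 0.06, 5e-3   # tip diameter (m), taper, position (m)
M_bend, T_torque = 2.0e-3, 3.0e-3           # applied moment and torque (N*m)

d = d_tip + taper * L_pos                   # local diameter at distance L_pos
I = math.pi * d**4 / 64                     # second moment of area (bending)
J = math.pi * d**4 / 32                     # polar moment of inertia (torsion)

sigma_max = M_bend * (d / 2) / I            # maximum bending stress
tau_max = T_torque * (d / 2) / J            # maximum torsional shear stress
print(f"d={d*1e3:.3f} mm  sigma={sigma_max/1e6:.0f} MPa  tau={tau_max/1e6:.0f} MPa")
```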

  20. The Scanning Theremin Microscope: A Model Scanning Probe Instrument for Hands-On Activities

    Science.gov (United States)

    Quardokus, Rebecca C.; Wasio, Natalie A.; Kandel, S. Alex

    2014-01-01

    A model scanning probe microscope, designed using similar principles of operation to research instruments, is described. Proximity sensing is done using a capacitance probe, and a mechanical linkage is used to scan this probe across surfaces. The signal is transduced as an audio tone using a heterodyne detection circuit analogous to that used in…

  1. Understanding and Measuring Evaluation Capacity: A Model and Instrument Validation Study

    Science.gov (United States)

    Taylor-Ritzler, Tina; Suarez-Balcazar, Yolanda; Garcia-Iriarte, Edurne; Henry, David B.; Balcazar, Fabricio E.

    2013-01-01

    This study describes the development and validation of the Evaluation Capacity Assessment Instrument (ECAI), a measure designed to assess evaluation capacity among staff of nonprofit organizations that is based on a synthesis model of evaluation capacity. One hundred and sixty-nine staff of nonprofit organizations completed the ECAI. The 68-item…

  2. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

    The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should instead start from a theory in which those terms are already built in. Fortunately, there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Although it is by construction completely gauge invariant, it also contains some unphysical terms linear in the gauge field. We were able to get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as the gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. We were thus able to start the first few loop computations, which show the expected behavior. The next step is to show renormalizability of the model, and some hints in this direction are also given. (author) [de

  3. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
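
    As a toy illustration of the foundational quantities listed above, the sketch below computes Shannon entropy for two hypothetical neural response distributions; the distributions and their interpretation (a sharper tuning curve carries more usable information per response) are invented for the example.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability outcomes are dropped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

sharp_tuning = [0.70, 0.15, 0.10, 0.05]  # well-tuned response: low entropy
flat_tuning  = [0.30, 0.25, 0.25, 0.20]  # degraded gain control: high entropy

print(f"H(sharp) = {entropy(sharp_tuning):.2f} bits")
print(f"H(flat)  = {entropy(flat_tuning):.2f} bits")
```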

  4. A spatial Mankiw-Romer-Weil model: Theory and evidence

    OpenAIRE

    Fischer, Manfred M.

    2009-01-01

    This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...

  5. Reservoir theory, groundwater transit time distributions, and lumped parameter models

    International Nuclear Information System (INIS)

    Etcheverry, D.; Perrochet, P.

    1999-01-01

    The relation between groundwater residence times and transit times is given by the reservoir theory. It allows one to calculate theoretical transit time distributions in a deterministic way, analytically or on numerical models. Two analytical solutions validate the piston-flow and exponential models for simple conceptual flow systems. A numerical solution of a hypothetical regional groundwater flow shows that lumped parameter models could be applied in some cases to large-scale, heterogeneous aquifers. (author)
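
    For reference, the two lumped-parameter models validated analytically can be written as transit time distributions with mean transit time τ; this is the standard textbook form and is stated here as an assumption about the paper's notation:

```latex
g_{\mathrm{piston}}(t) = \delta(t - \tau), \qquad
g_{\mathrm{exp}}(t) = \frac{1}{\tau}\, e^{-t/\tau}, \qquad
\tau = \frac{V}{Q},
```

    where V is the mobile water volume of the reservoir and Q its steady discharge.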

  6. Theory of compressive modeling and simulation

    Science.gov (United States)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach limited by computing power and turnaround time. We suggest a third approach, which we call (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) underlying CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. CM&S based on MFE can generalize LCNN to 2nd order as a nonlinear augmented LCNN. For example, during sunset we can avoid a reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon: with CM&S we can use a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with filters into two vector components (8-10 μm and 10-12 μm) and used LCNN to find, pixel by pixel, the map of emissive-equivalent Planck radiation sources (EPRS). Then we up-shifted consistently, according to the de-mixed source map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to passive millimeter wave (PMMW) imaging, suffering less blur from scattering by dusty smoke and enjoying the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computation in the sense of super-saving CS: measuring one and getting one's neighborhood free.

  7. Consistent constraints on the Standard Model Effective Field Theory

    International Nuclear Information System (INIS)

    Berthier, Laure; Trott, Michael

    2016-01-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include as an illustrative example.

  8. Effective potential in Lorentz-breaking field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)

    2017-12-15

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  9. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  10. Effective potential in Lorentz-breaking field theory models

    International Nuclear Information System (INIS)

    Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.

    2017-01-01

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  11. Methods and Models of Market Risk Stress-Testing of the Portfolio of Financial Instruments

    Directory of Open Access Journals (Sweden)

    Alexander M. Karminsky

    2015-01-01

    Full Text Available Amid instability of financial markets and the macroeconomic situation, the necessity of improving bank risk-management instruments arises. The new economic reality defines the need to search for more advanced approaches to estimating banks' vulnerability to exceptional but plausible events. Stress-testing belongs to such instruments. The paper reviews and compares models for market risk stress-testing of portfolios of different financial instruments. The topic is highly relevant because stress-testing is becoming an integral part of anticrisis risk-management amid macroeconomic instability and the appearance of new risks, together with close interest in the problem of risk aggregation. The paper outlines the notion of stress-testing, covers the goals and functions of stress-tests, and gives the main criteria for classifying market risk stress-tests. It also discusses special aspects of scenario analysis. The novelty of the research lies in the elaboration of a programme of aggregated complex multifactor stress-testing of portfolio risk based on scenario analysis. The paper highlights modern Russian and foreign models of stress-testing, both on a solo basis and complex. It lays emphasis on the results of stress-testing and the revaluation of positions for all three complex models: the Central Bank's methodology for stress-testing portfolio risk, a model relying on correlation analysis, and a copula model. The models of stress-testing on a solo basis differ for each financial instrument: a parametric StressVaR model is applicable to stress-testing shares and options; a model based on the "Greek" indicators is used for options; and for eurobonds a regional factor model is used. Finally, some theoretical recommendations on managing the market risk of the portfolio are given.
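
    As a schematic of the revaluation step that all of these stress-testing models share, the sketch below applies a single shock scenario to a portfolio with linear sensitivities; the risk factors, sensitivities, and shock sizes are invented, and the paper's complex models (correlation-based and copula) concern how such scenarios are generated and aggregated rather than this mechanical step.

```python
import numpy as np

# Single-scenario stress test: revalue each book under a shock vector
# applied to the risk factors it is exposed to.
factors = ["equity_index", "fx_rate", "ir_10y"]
shock = np.array([-0.30, 0.15, 0.02])   # crisis scenario: -30% equities, etc.

# Sensitivities: portfolio value change per unit factor move (currency units).
sensitivities = np.array([
    [120.0,   0.0,   -5.0],   # equity book
    [ 10.0, -80.0,    0.0],   # FX book
    [  0.0,   0.0, -400.0],   # bond book (duration exposure)
])

pnl_by_book = sensitivities @ shock
print("P&L by book:", pnl_by_book)
print("Stressed portfolio P&L:", pnl_by_book.sum())
```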

  12. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

    After the release of new international functional safety standards like IEC 61508, people care more about the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique to assess the reliability measures of safety instrumented systems, but it is error-prone and time-consuming to create Markov models manually. This paper presents a new technique to automatically create Markov models for reliability assessment of safety instrumented systems. Many safety related factors, such as failure modes, self-diagnostics, restoration, common cause and voting, are included in the Markov models. A framework is generated first based on voting, failure modes and self-diagnostics. Then, repairs and common-cause failures are incorporated into the framework to build a complete Markov model. Eventual simplification of the Markov models can be done by state merging. Examples given in this paper show how explosively the size of a Markov model increases as the system becomes a little more complicated, as well as the advantage of automatic creation of Markov models
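
    To make the construction concrete, here is a small hand-built example of the kind of Markov model such a tool generates, for a hypothetical 1oo2 voting architecture with a beta-factor common-cause split; the failure, common-cause, and restoration rates are invented for illustration and the state space is deliberately minimal.

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = both channels OK, 1 = one failed, 2 = both failed (dangerous).
lam_du = 2e-6     # dangerous undetected failure rate per channel (1/h)
beta = 0.05       # common-cause fraction of lam_du
mu = 1.0 / 8760   # restoration rate via proof test (1/h)

lam_i = (1 - beta) * lam_du               # independent failure part
Q = np.array([                            # generator matrix, rows sum to zero
    [-(2*lam_i + beta*lam_du), 2*lam_i,        beta*lam_du],
    [mu,                       -(mu + lam_i),  lam_i      ],
    [mu,                       0.0,            -mu        ],
])

p0 = np.array([1.0, 0.0, 0.0])            # start with both channels healthy
t = 8760.0                                # one year of operation
p_t = p0 @ expm(Q * t)                    # state probabilities at time t
print(f"P(system failed dangerous) after 1 year ~ {p_t[2]:.2e}")
```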

  13. A Method for Modeling the Virtual Instrument Automatic Test System Based on the Petri Net

    Institute of Scientific and Technical Information of China (English)

    MA Min; CHEN Guang-ju

    2005-01-01

    Virtual instruments are playing an important role in automatic test systems. This paper introduces the composition of a virtual instrument automatic test system, taking as an example a VXIbus-based test software platform developed by the CAT lab of UESTC. A method to model this system based on Petri nets is then proposed. Through this method, we can analyze the test task scheduling to prevent deadlock or resource conflicts. Finally, the paper analyzes the feasibility of this method.

  14. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

    The goal of this lecture is to present a unifying view of the exactly soluble models. There exist several reasons arguing in favor of the 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering; some 1+1 d models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relations (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPRs is given, with promising generalizations to FCRs

  15. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy

  16. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  17. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...
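
    As a taste of the material such a course covers, here are the standard M/M/1 steady-state results coded directly from the textbook formulas; the arrival and service rates are arbitrary example values.

```python
# M/M/1 queue: Poisson arrivals at rate lam, exponential service at rate mu.
lam, mu = 4.0, 5.0
assert lam < mu, "queue is stable only when lambda < mu"

rho = lam / mu             # server utilization
L_sys = rho / (1 - rho)    # mean number of customers in system
W_sys = L_sys / lam        # mean time in system (Little's law: L = lambda * W)
Wq = W_sys - 1 / mu        # mean waiting time in queue

print(f"rho={rho:.2f}  L={L_sys:.2f}  W={W_sys:.2f}  Wq={Wq:.2f}")
```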

  18. Development of a Symptom-Based Patient-Reported Outcome Instrument for Functional Dyspepsia: A Preliminary Conceptual Model and an Evaluation of the Adequacy of Existing Instruments.

    Science.gov (United States)

    Taylor, Fiona; Reasner, David S; Carson, Robyn T; Deal, Linda S; Foley, Catherine; Iovin, Ramon; Lundy, J Jason; Pompilus, Farrah; Shields, Alan L; Silberg, Debra G

    2016-10-01

    The aim was to document, from the perspective of the empirical literature, the primary symptoms of functional dyspepsia (FD), evaluate the extent to which existing questionnaires target those symptoms, and, finally, identify any missing evidence that would impact the questionnaires' use in regulated clinical trials to assess treatment efficacy claims intended for product labeling. A literature review was conducted to identify the primary symptoms of FD and existing symptom-based FD patient-reported outcome (PRO) instruments. Following a database search, abstracts were screened and articles were retrieved for review. The primary symptoms of FD were organized into a conceptual model and the PRO instruments were evaluated for conceptual coverage as well as compared against evidentiary requirements presented in the FDA's PRO Guidance for Industry. Fifty-six articles and 16 instruments assessing FD symptoms were reviewed. Concepts listed in the Rome III criteria for FD (n = 7), those assessed by existing FD instruments (n = 34), and symptoms reported by patients in published qualitative research (n = 6) were summarized in the FD conceptual model. Except for vomiting, all of the identified symptoms from the published qualitative research reports were also specified in the Rome III criteria. Only three of the 16 instruments, the Dyspepsia Symptom Severity Index (DSSI), Nepean Dyspepsia Index (NDI), and Short-Form Nepean Dyspepsia Index (SF-NDI), measure all seven FD symptoms defined by the Rome III criteria. Among these three, each utilizes a 2-week recall period and 5-point Likert-type scale, and had evidence of patient involvement in development. Despite their coverage, when these instruments were evaluated in light of regulatory expectations, several issues jeopardized their potential qualification for substantiation of a labeling claim. No existing PRO instruments that measured all seven symptoms adhered to the regulatory principles necessary to support product

  19. Numerical tools for musical instruments acoustics: analysing nonlinear physical models using continuation of periodic solutions

    OpenAIRE

    Karkar , Sami; Vergez , Christophe; Cochelin , Bruno

    2012-01-01

    We propose a new approach based on numerical continuation and bifurcation analysis for the study of physical models of instruments that produce self-sustained oscillation. Numerical continuation consists in following how a given solution of a set of equations is modified when one (or several) parameters of these equations are allowed to vary. Several physical models (clarinet, saxophone, and violin) are formulated as nonlinear dynamical systems, whose periodic solution...

  20. A simple model explaining super-resolution in absolute optical instruments

    Science.gov (United States)

    Leonhardt, Ulf; Sahebdivan, Sahar; Kogan, Alex; Tyc, Tomáš

    2015-05-01

    We develop a simple, one-dimensional model for super-resolution in absolute optical instruments that is able to describe the interplay between sources and detectors. Our model explains the subwavelength sensitivity of a point detector to a point source reported in previous computer simulations and experiments (Miñano 2011 New J. Phys. 13 125009; Miñano 2014 New J. Phys. 16 033015).

  1. Development of a Conceptual Model and Survey Instrument to Measure Conscientious Objection to Abortion Provision.

    Directory of Open Access Journals (Sweden)

    Laura Florence Harris

    Full Text Available Conscientious objection to abortion, clinicians' refusal to perform legal abortions because of their religious or moral beliefs, has been the subject of increasing debate among bioethicists, policymakers, and public health advocates in recent years. Conscientious objection policies are intended to balance reproductive rights and clinicians' beliefs. However, in practice, clinician objection can act as a barrier to abortion access, impinging on reproductive rights and increasing unsafe abortion and related morbidity and mortality. There is little information about conscientious objection from a medical or public health perspective. A quantitative instrument is needed to assess the prevalence of conscientious objection and to provide insight on its practice. This paper describes the development of a survey instrument to measure conscientious objection to abortion provision. A literature review and in-depth formative interviews with stakeholders in Colombia were used to develop a conceptual model of conscientious objection. This model led to the development of a survey, which was piloted, and then administered, in Ghana. The model posits three domains of conscientious objection that form the basis for the survey instrument: (1) beliefs about abortion and conscientious objection; (2) actions related to conscientious objection and abortion; and (3) self-identification as a conscientious objector. The instrument is intended to be used to assess prevalence among clinicians trained to provide abortions, and to gain insight on how conscientious objection is practiced in a variety of settings. Its results can inform more effective and appropriate strategies to regulate conscientious objection.

  2. Suprathermal ions in the solar wind from the Voyager spacecraft: Instrument modeling and background analysis

    International Nuclear Information System (INIS)

    Randol, B M; Christian, E R

    2015-01-01

    Using publicly available data from the Voyager Low Energy Charged Particle (LECP) instruments, we investigate the form of the solar wind ion suprathermal tail in the outer heliosphere inside the termination shock. This tail has a commonly observed form in the inner heliosphere, that is, a power law with a particular spectral index. The Voyager spacecraft have taken data beyond 100 AU, farther than any other spacecraft. However, during extended periods of time, the data appears to be mostly background. We have developed a technique to self-consistently estimate the background seen by LECP due to cosmic rays using data from the Voyager cosmic ray instruments and a simple, semi-analytical model of the LECP instruments

  3. Combining climate and energy policies: synergies or antagonism? Modeling interactions with energy efficiency instruments

    International Nuclear Information System (INIS)

    Lecuyer, Oskar; Bibas, Ruben

    2012-01-01

    In addition to the already present Climate and Energy package, the European Union (EU) plans to include a binding target to reduce energy consumption. We analyze the rationales the EU invokes to justify such an overlap and develop a minimal common framework to study interactions arising from the combination of instruments reducing emissions, promoting renewable energy (RE) production and reducing energy demand through energy efficiency (EE) investments. We find that although all instruments tend to reduce GHG emissions, and although a price on carbon also tends to give the right incentives for RE and EE, the combination of more than one instrument leads to significant antagonisms regarding major objectives of the policy package. The model allows us to show in a single framework, and to quantify, the antagonistic effects of the joint promotion of RE and EE. We also show and quantify the effects of this joint promotion on the ETS permit price, on the wholesale market price and on energy production levels. (authors)

  4. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
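
    A minimal sketch of the game-theoretic ingredient: a symmetric two-driver lane-change game with invented payoffs capturing the time-saved-versus-collision-risk trade-off. The paper couples driver-driver interactions of this kind to a two-lane traffic model; these particular payoff numbers are illustrative assumptions.

```python
import numpy as np

# Rows: driver A (Yield, Push); columns: driver B (Yield, Push).
# Pushing into the gap pays off only if the other driver yields.
payoff_A = np.array([[1,  0],
                     [3, -5]])
payoff_B = payoff_A.T            # symmetric game

# Best responses: A's best reply to each of B's columns, B's to each of A's rows.
best_A = payoff_A.argmax(axis=0)  # vs B-Yield -> Push (1); vs B-Push -> Yield (0)
best_B = payoff_B.argmax(axis=1)
print("A's best responses:", best_A, "| B's best responses:", best_B)
# The pure Nash equilibria (Push, Yield) and (Yield, Push) mirror the
# asymmetric merging behaviour such hawk-dove-type interactions produce.
```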

  5. Comparison of potential models through heavy quark effective theory

    International Nuclear Information System (INIS)

    Amundson, J.F.

    1995-01-01

    I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations, in addition to isolating the source of the differences between the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions.

  6. Model building with a dynamical volume element in gravity, particle theory and theories of extended object

    International Nuclear Information System (INIS)

    Guendelman, E.

    2004-01-01

    Full Text: The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: (1) 4-D theories describing gravity and matter fields, (2) parametrization invariant theories of extended objects, and (3) higher dimensional theories including gravity and matter fields. In case (1), a large number of new effects appear: (i) spontaneous breaking of scale invariance associated with integration of degrees of freedom related to the measure; (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter; (iii) cosmic coincidence between dark energy and dark matter is natural; (iv) quintessence scenarios with automatic decoupling of the quintessence scalar from ordinary matter, but not from dark matter, are obtained. In case (2), for theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension, (ii) string models of non-abelian confinement, and (iii) the possibility of new Weyl invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. In case (3), in brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to construct naturally models where only the extra dimensions get curved and the 4-D observable space-time remains flat.

  7. Theory of positive disintegration as a model of adolescent development.

    Science.gov (United States)

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost half a century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent behaviour of the early, middle and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.

  8. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
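
    A structural model of this shape could be specified in Python with the semopy package (a lavaan-style SEM library). The sketch below is illustrative only: every construct, indicator name, and the file name are hypothetical placeholders for the study's survey items, and this is not the authors' actual analysis:

        import pandas as pd
        import semopy

        # lavaan-style model description; all variable names are hypothetical.
        # "=~" defines the measurement model, "~" the structural regressions.
        desc = """
        trust      =~ t1 + t2 + t3
        ease       =~ e1 + e2 + e3
        usefulness =~ u1 + u2 + u3
        intention  =~ i1 + i2
        ease       ~ trust + selfeff
        usefulness ~ trust + ease
        intention  ~ ease + usefulness
        """

        # selfeff is assumed to be an observed composite self-efficacy score
        # present as a column in the (hypothetical) indicator data set.
        data = pd.read_csv("telehealth_survey.csv")
        model = semopy.Model(desc)
        model.fit(data)
        print(model.inspect())  # path estimates, standard errors, p-values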

  9. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  10. Modelling machine ensembles with discrete event dynamical system theory

    Science.gov (United States)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
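
    A minimal Python sketch of the local-model formalism just described; the states, event alphabet, transition map, and the "welder" submachine are invented for illustration, not drawn from the paper:

        from dataclasses import dataclass, field

        @dataclass
        class LocalModel:
            """A DEDS local model: states, event alphabet, transitions, timing."""
            states: set
            alphabet: set
            initial: str
            # (state, event) -> next state; a partial function, per the DEDS definition
            delta: dict = field(default_factory=dict)
            # event -> time required for the event to occur
            duration: dict = field(default_factory=dict)

            def step(self, state, event):
                if (state, event) not in self.delta:
                    raise ValueError(f"event {event!r} not enabled in state {state!r}")
                return self.delta[(state, event)], self.duration.get(event, 0.0)

        # Hypothetical welding submachine in an in-space construction ensemble.
        welder = LocalModel(
            states={"idle", "positioning", "welding"},
            alphabet={"move", "weld", "done"},
            initial="idle",
            delta={("idle", "move"): "positioning",
                   ("positioning", "weld"): "welding",
                   ("welding", "done"): "idle"},
            duration={"move": 5.0, "weld": 12.0, "done": 0.0},
        )

        state, clock = welder.initial, 0.0
        for ev in ["move", "weld", "done"]:
            state, dt = welder.step(state, ev)
            clock += dt
        print(state, clock)  # -> idle 17.0

    A global model would compose several such local models and restrict their parallel operation by shared-resource constraints; a supervisory controller then schedules enabled events against the timing data.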

  11. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of the magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of the three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.

  12. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar field, with coupling functions covering the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum, where each fixed-momentum fiber Hamiltonian is the sum of a free part and a linear coupling term given by a Segal field operator. The fiber Hamiltonians …

  13. Cohomological gauge theory, quiver matrix models and Donaldson-Thomas theory

    NARCIS (Netherlands)

    Cirafici, M.; Sinkovics, A.; Szabo, R.J.

    2009-01-01

    We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques.

  14. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems' properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems' properties that are required to model transitions to sustainability.

  15. Excellence in Physics Education Award: Modeling Theory for Physics Instruction

    Science.gov (United States)

    Hestenes, David

    2014-03-01

    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  16. A Model of Statistics Performance Based on Achievement Goal Theory.

    Science.gov (United States)

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed.

  17. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions …

  18. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the …

  19. Two-dimensional models in statistical mechanics and field theory

    International Nuclear Information System (INIS)

    Koberle, R.

    1980-01-01

    Several features of two-dimensional models in statistical mechanics and field theory, such as lattice quantum chromodynamics, Z(N), Gross-Neveu and CP(N-1) models, are discussed. The problems of confinement and dynamical mass generation are also analyzed. (L.C.)

  20. The early years of string theory: The dual resonance model

    International Nuclear Information System (INIS)

    Ramond, P.

    1987-10-01

    This paper reviews the early quantum mechanical history of the dual resonance model, an early string theory. The contents of this paper are as follows: historical review, the Veneziano amplitude, the operator formalism, the ghost story, and the string story.

  1. Interacting bosons model and relation with BCS theory

    International Nuclear Information System (INIS)

    Diniz, R.

    1990-01-01

    The Nambu mechanism for BCS theory is extended with the inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of this model are discussed. (author)

  2. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords: Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  3. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model, and Self-Determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic …

  4. A Proposed Model of Jazz Theory Knowledge Acquisition

    Science.gov (United States)

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  5. S matrix theory of the massive Thirring model

    International Nuclear Information System (INIS)

    Berg, B.

    1980-01-01

    The S matrix theory of the massive Thirring model, describing the exact quantum scattering of solitons and their bound states, is reviewed. Treated are: factorization equations and their solution, bound states, generalized Jost functions and Levinson's theorem, scattering of bound states, 'virtual' and anomalous thresholds. (orig.)

  6. Using SAS PROC MCMC for Item Response Theory Models

    Science.gov (United States)

    Ames, Allison J.; Samonte, Kelli

    2015-01-01

    Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
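
    To make the Bayesian IRT idea concrete, here is a minimal two-parameter logistic (2PL) model sketched in Python with PyMC rather than SAS PROC MCMC; the response matrix is randomly generated as a stand-in for real test data, and priors are illustrative choices:

        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(1)
        Y = rng.integers(0, 2, size=(200, 10))  # hypothetical persons-by-items 0/1 data

        n_persons, n_items = Y.shape
        with pm.Model() as irt_2pl:
            theta = pm.Normal("theta", mu=0.0, sigma=1.0, shape=n_persons)  # ability
            b = pm.Normal("b", mu=0.0, sigma=1.0, shape=n_items)            # difficulty
            a = pm.LogNormal("a", mu=0.0, sigma=0.5, shape=n_items)         # discrimination
            # P(correct) = logistic(a_j * (theta_i - b_j))
            p = pm.math.invlogit(a * (theta[:, None] - b))
            pm.Bernoulli("obs", p=p, observed=Y)
            trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

        # Posterior means of the item difficulties.
        print(trace.posterior["b"].mean(dim=("chain", "draw")).values)

    The same model transliterates directly into PROC MCMC's PARMS/PRIOR/MODEL statements; the module's point is that the estimation machinery, not the IRT model itself, differs across tools.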

  7. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  8. Item Response Theory Models for Performance Decline during Testing

    Science.gov (United States)

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  9. Item Response Theory Modeling of the Philadelphia Naming Test

    Science.gov (United States)

    Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.

    2015-01-01

    Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…

  10. An NCME Instructional Module on Polytomous Item Response Theory Models

    Science.gov (United States)

    Penfield, Randall David

    2014-01-01

    A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…

  11. Profiles in Leadership: Enhancing Learning through Model and Theory Building.

    Science.gov (United States)

    Mello, Jeffrey A.

    2003-01-01

    A class assignment was designed to present factors affecting leadership dynamics, allow practice in model and theory building, and examine leadership from multicultural perspectives. Students developed a profile of a fictional or real leader and analyzed qualities, motivations, context, and effectiveness in written and oral presentations.…

  12. Compositional models and conditional independence in evidence theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Vejnarová, Jiřina

    2011-01-01

    Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords: Evidence theory * Conditional independence * Multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf

  13. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m…
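
    One information theory-based metric of this kind can be sketched in a few lines of Python; the histogram-based density estimate, the bin count, and the synthetic flow series are illustrative assumptions:

        import numpy as np
        from scipy.stats import entropy

        def kl_divergence(observed, simulated, bins=20):
            """KL divergence between histogram densities of observed and simulated flow."""
            lo = min(observed.min(), simulated.min())
            hi = max(observed.max(), simulated.max())
            p, _ = np.histogram(observed, bins=bins, range=(lo, hi), density=True)
            q, _ = np.histogram(simulated, bins=bins, range=(lo, hi), density=True)
            # A small floor keeps the divergence finite where a bin is empty.
            p, q = p + 1e-12, q + 1e-12
            return entropy(p, q)

        rng = np.random.default_rng(0)
        obs = rng.gamma(2.0, 5.0, size=1000)   # synthetic "measured" streamflow
        sim = rng.gamma(2.2, 4.5, size=1000)   # synthetic model output
        print(f"KL(observed || simulated) = {kl_divergence(obs, sim):.4f}")

    Unlike squared-error scores, such a divergence penalizes a model whose simulated flow distribution has the wrong shape even when its average error is small, which is the complementary behavior the abstract alludes to.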

  14. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds on the probability of ruin for the classical risk process extended with a constant interest …

  15. Auto-associative Kernel Regression Model with Weighted Distance Metric for Instrument Drift Monitoring

    International Nuclear Information System (INIS)

    Shin, Ho Cheol; Park, Moon Ghu; You, Skin

    2006-01-01

    Recently, many on-line approaches to instrument channel surveillance (drift monitoring and fault detection) have been reported worldwide. On-line monitoring (OLM) methods evaluate instrument channel performance by assessing its consistency with other plant indications through parametric or non-parametric models. The heart of an OLM system is the model giving an estimate of the true process parameter value against individual measurements. This model gives a process parameter estimate calculated as a function of other plant measurements, which can be used to identify small sensor drifts that would require the sensor to be manually calibrated or replaced. This paper describes an improvement of auto-associative kernel regression (AAKR) obtained by introducing correlation coefficient weighting on kernel distances. The prediction performance of the developed method is compared with that of conventional auto-associative kernel regression.
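
    A minimal NumPy sketch of auto-associative kernel regression with a correlation-weighted distance, in the spirit of the modification described above; the specific weighting scheme and the synthetic signals are assumptions, not the authors' exact formulation:

        import numpy as np

        def aakr_predict(X_train, x_query, bandwidth=1.0, weights=None):
            """AAKR estimate of the true process state for one query vector.

            X_train : (n_samples, n_signals) historical fault-free measurements
            x_query : (n_signals,) current measurement vector
            weights : per-signal distance weights, e.g. correlation coefficients
            """
            if weights is None:
                weights = np.ones(x_query.shape[0])
            d2 = np.sum(weights * (X_train - x_query) ** 2, axis=1)  # weighted distances
            k = np.exp(-d2 / (2.0 * bandwidth ** 2))                 # Gaussian kernel
            return k @ X_train / k.sum()                             # kernel-weighted mean

        rng = np.random.default_rng(0)
        base = rng.normal(size=(500, 1))
        X = np.hstack([base + 0.1 * rng.normal(size=(500, 1)) for _ in range(3)])

        # Correlation-based weights: signals more correlated with the rest count more.
        corr = np.corrcoef(X, rowvar=False)
        w = (np.abs(corr).sum(axis=0) - 1.0) / (corr.shape[0] - 1)

        drifted = X[0] + np.array([0.5, 0.0, 0.0])  # sensor 0 drifts by +0.5
        estimate = aakr_predict(X, drifted, bandwidth=0.5, weights=w)
        print("residual (measured - estimate):", drifted - estimate)

    A persistent residual on one channel, while the estimate tracks the remaining channels, is the drift signature such an OLM model is designed to expose.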

  16. Conformal field theories, Coulomb gas picture and integrable models

    International Nuclear Information System (INIS)

    Zuber, J.B.

    1988-01-01

    The aim of this study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: not all the various restrictions have been identified, nor have the modular invariants been completely classified.

  17. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    The traffic behaviours of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers, but due to its lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, a route choice model based on Game Theory is proposed in this paper to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by considering the Nash Equilibrium of the route choice game. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
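
    For intuition about the equilibrium notion involved, here is a toy computation for two alternative routes with load-dependent travel times; the cost functions are invented for illustration and are unrelated to the paper's game:

        # Two routes; travel time grows with the fraction of commuters using each.
        # At a (Wardrop/Nash) equilibrium no commuter gains by switching routes,
        # so the two travel times are equal whenever both routes are used.

        def t_a(x):   # travel time on route A if fraction x uses it
            return 10.0 + 20.0 * x

        def t_b(x):   # travel time on route B if fraction (1 - x) uses it
            return 15.0 + 10.0 * (1.0 - x)

        lo, hi = 0.0, 1.0
        for _ in range(60):              # bisection on the equilibrium condition
            mid = 0.5 * (lo + hi)
            if t_a(mid) > t_b(mid):
                hi = mid                 # route A too slow -> fewer users
            else:
                lo = mid
        x_eq = 0.5 * (lo + hi)
        print(f"equilibrium share on route A: {x_eq:.3f}, time: {t_a(x_eq):.2f}")

    With these cost functions the equilibrium split is 0.5 and both routes take 20 time units; the paper's model enriches this picture with information precision and driver familiarity as game parameters.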

  18. The sound of oscillating air jets: Physics, modeling and simulation in flute-like instruments

    Science.gov (United States)

    de La Cuadra, Patricio

    Flute-like instruments share a common mechanism that consists of blowing across one open end of a resonator to produce an air jet that is directed towards a sharp edge. Analysis of its operation involves various research fields including fluid dynamics, aero-acoustics, and physics. An effort has been made in this study to extend this description from instruments with fixed geometry like recorders and organ pipes to flutes played by the lips. An analysis of the jet's response to a periodic excitation is the focus of this study, as are the parameters under the player's control in forming the jet. The jet is excited with a controlled excitation consisting of two loudspeakers in opposite phase. A Schlieren system is used to visualize the jet, and image detection algorithms are developed to extract quantitative information from the images. In order to study the behavior of jets observed in different flute-like instruments, several geometries of the excitation and jet shapes are studied. The obtained data is used to propose analytical models that correctly fit the observed measurements and can be used for simulations. The control exerted by the performer on the instrument is of crucial importance in the quality of the sound produced for a number of flute-like instruments. The case of the transverse flute is experimentally studied. An ensemble of control parameters are measured and visualized in order to describe some aspects of the subtle control attained by an experienced flautist. Contrasting data from a novice flautist are compared. As a result, typical values for several non-dimensional parameters that characterize the normal operation of the instrument have been measured, and data to feed simulations has been collected. The information obtained through experimentation is combined with research developed over the last decades to put together a time-domain simulation. The model proposed is one-dimensional and driven by a single physical input. All the variables in the …

  19. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119

  20. Constitutive relationships and models in continuum theories of multiphase flows

    International Nuclear Information System (INIS)

    Decker, R.

    1989-09-01

    In April 1989, a workshop on constitutive relationships and models in continuum theories of multiphase flows was held at NASA's Marshall Space Flight Center. Topics of constitutive relationships for the partial or per-phase stresses, including the concept of solid phase pressure, are discussed. Models used for the exchange of mass, momentum, and energy between the phases in a multiphase flow are also discussed. The program, abstracts, and texts of the presentations from the workshop are included.

  1. Perturbation theory around the Wess-Zumino-Witten model

    International Nuclear Information System (INIS)

    Hasseln, H. v.

    1991-05-01

    We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW model is shown to have the same β-function (at least to order g²) as the fermionic theory with a four-fermion interaction. (orig.)

  2. A Theory-Based Model for Understanding Faculty Intention to Use Students Ratings to Improve Teaching in a Health Sciences Institution in Puerto Rico

    Science.gov (United States)

    Collazo, Andrés A.

    2018-01-01

    A model derived from the theory of planned behavior was empirically assessed for understanding faculty intention to use student ratings for teaching improvement. A sample of 175 professors participated in the study. The model was statistically significant and had a very large explanatory power. Instrumental attitude, affective attitude, perceived…

  3. A general-model-space diagrammatic perturbation theory

    International Nuclear Information System (INIS)

    Hose, G.; Kaldor, U.

    1980-01-01

    A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He₂ excited Σ_g^+ states) is demonstrated. (Auth.)

  4. Theory-based Bayesian models of inductive learning and reasoning.

    Science.gov (United States)

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.

  5. Fluid analog model for boundary effects in field theory

    International Nuclear Information System (INIS)

    Ford, L. H.; Svaiter, N. F.

    2009-01-01

    Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.

  6. Chern-Simons Theory, Matrix Models, and Topological Strings

    International Nuclear Information System (INIS)

    Walcher, J

    2006-01-01

    This book is a find. Marino meets the challenge of filling in less than 200 pages the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may …

  7. Finite-size scaling theory and quantum hamiltonian Field theory: the transverse Ising model

    International Nuclear Information System (INIS)

    Hamer, C.J.; Barber, M.N.

    1979-01-01

    Exact results for the mass gap, specific heat and susceptibility of the one-dimensional transverse Ising model on a finite lattice are generated by constructing a finite matrix representation of the Hamiltonian using strong-coupling eigenstates. The critical behaviour of the limiting infinite chain is analysed using finite-size scaling theory. In this way, excellent estimates (to within 1/2% accuracy) are found for the critical coupling and the exponents α, ν and γ.
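
    To make the finite-size scaling idea concrete, a small exact-diagonalization sketch in Python: it builds the transverse-field Ising Hamiltonian H = -Σ σᶻᵢσᶻᵢ₊₁ - g Σ σˣᵢ on short open chains and compares the scaled mass gap N·ΔE(N) near the known critical coupling g = 1. The chain lengths, couplings, and open boundaries are chosen purely for illustration and differ from the paper's strong-coupling construction:

        import numpy as np

        sx = np.array([[0, 1], [1, 0]], dtype=float)
        sz = np.array([[1, 0], [0, -1]], dtype=float)

        def op(single, site, n):
            """Embed a single-site operator at `site` in an n-site chain."""
            out = np.array([[1.0]])
            for i in range(n):
                out = np.kron(out, single if i == site else np.eye(2))
            return out

        def gap(n, g):
            """Mass gap E1 - E0 of the open transverse-field Ising chain."""
            H = np.zeros((2**n, 2**n))
            for i in range(n - 1):
                H -= op(sz, i, n) @ op(sz, i + 1, n)
            for i in range(n):
                H -= g * op(sx, i, n)
            E = np.linalg.eigvalsh(H)
            return E[1] - E[0]

        for g in (0.9, 1.0, 1.1):
            scaled = [n * gap(n, g) for n in (6, 8, 10)]
            print(g, [f"{s:.3f}" for s in scaled])
        # Near g = 1 the scaled gaps N*ΔE for different N approximately coincide;
        # locating that crossing is how finite-size scaling estimates the
        # critical coupling from small lattices.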

  8. A General Framework for Portfolio Theory. Part I: theory and various models

    OpenAIRE

    Maier-Paape, Stanislaus; Zhu, Qiji Jim

    2017-01-01

    Utility and risk are two often competing measurements of investment success. We show that an efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model [W. F. Sharpe, Mutual Fund Performance, 1966], are spe…
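
    A compact sketch of the classical mean-variance special case the abstract refers to, tracing the efficient frontier for a toy three-asset problem; the expected returns and covariance matrix are invented for illustration:

        import numpy as np

        mu = np.array([0.05, 0.08, 0.12])        # hypothetical expected returns
        Sigma = np.array([[0.04, 0.01, 0.00],    # hypothetical covariance matrix
                          [0.01, 0.09, 0.02],
                          [0.00, 0.02, 0.16]])

        ones = np.ones_like(mu)
        Sinv_mu = np.linalg.solve(Sigma, mu)
        Sinv_1 = np.linalg.solve(Sigma, ones)

        # Unconstrained Markowitz: minimize w'Σw subject to w'μ = r and w'1 = 1.
        a, b, c = ones @ Sinv_1, ones @ Sinv_mu, mu @ Sinv_mu
        for r in np.linspace(0.05, 0.12, 5):
            lam = (c - b * r) / (a * c - b * b)  # Lagrange multipliers from the
            gam = (a * r - b) / (a * c - b * b)  # two linear constraints
            w = lam * Sinv_1 + gam * Sinv_mu
            risk = np.sqrt(w @ Sigma @ w)
            print(f"target return {r:.3f} -> risk {risk:.3f}, weights {np.round(w, 2)}")
        # Plotting risk against target return traces exactly the convex
        # efficient-frontier curve described in the abstract.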

  9. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    …deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.

  10. sigma model approach to the heterotic string theory

    International Nuclear Information System (INIS)

    Sen, A.

    1985-09-01

    The relation between the equations of motion for the massless fields in the heterotic string theory and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. The presence of an anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs.

  11. Integrable lambda models and Chern-Simons theories

    International Nuclear Information System (INIS)

    Schmidtt, David M.

    2017-01-01

    In this note we reveal a connection between the phase space of lambda models on S¹×ℝ and the phase space of double Chern-Simons theories on D×ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅×S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  12. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo…

  13. Integrable lambda models and Chern-Simons theories

    Energy Technology Data Exchange (ETDEWEB)

    Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)

    2017-05-03

    In this note we reveal a connection between the phase space of lambda models on S¹×ℝ and the phase space of double Chern-Simons theories on D×ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅×S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  14. Classical nucleation theory in the phase-field crystal model.

    Science.gov (United States)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation takes place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
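
    For readers who want to experiment, a minimal one-dimensional PFC relaxation in Python, using the standard conserved dynamics ∂ψ/∂t = ∇²[(r + (1 + ∇²)²)ψ + ψ³] with a semi-implicit spectral step; the grid, parameters, and 1D setting are illustrative and far simpler than the 2D/3D nucleation simulations in the paper:

        import numpy as np

        N, L = 256, 16 * np.pi      # grid points, domain length
        r, psi0 = -0.25, -0.2       # PFC parameters: undercooling, mean density
        dt, steps = 0.05, 4000

        k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)
        lin = -k**2 * (r + (1.0 - k**2) ** 2)   # linear operator in Fourier space

        rng = np.random.default_rng(0)
        psi = psi0 + 0.01 * rng.standard_normal(N)  # noisy uniform "liquid" state

        for _ in range(steps):
            # Treat the stiff linear term implicitly, the cubic term explicitly.
            nonlin = np.fft.fft(psi**3)
            psi_hat = (np.fft.fft(psi) - dt * k**2 * nonlin) / (1.0 - dt * lin)
            psi = np.real(np.fft.ifft(psi_hat))

        # For r inside the instability region the uniform state develops the
        # periodic ("crystal") profile characteristic of the PFC model.
        print("min/max of relaxed density field:", psi.min(), psi.max())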

  15. Matrix models and stochastic growth in Donaldson-Thomas theory

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)

    2012-10-15

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  16. Matrix models and stochastic growth in Donaldson-Thomas theory

    International Nuclear Information System (INIS)

    Szabo, Richard J.; Tierz, Miguel

    2012-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  17. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum probability rule, and local conditions are then used to propose management strategies that transform heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk from 2000 to 2009 against the actual and simulated data, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, and it has strong logical superiority and regional adaptability, providing a new way of warning of water pollution risk.
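
    The prior-to-posterior step at the heart of such a model can be illustrated with a minimal conjugate update in Python; treating "exceedance of a pollution threshold" as a Bernoulli event with a Beta prior is an illustrative simplification, not the paper's actual index system, and all numbers are hypothetical:

        from scipy import stats

        # Beta prior on the probability that a pollution index exceeds its
        # threshold, e.g. elicited from historical basin data (hypothetical).
        alpha_prior, beta_prior = 2.0, 8.0

        # Sample information: simulated/observed exceedances in the period.
        exceedances, n_obs = 7, 20

        # Conjugate Beta-Binomial update gives the posterior in closed form.
        alpha_post = alpha_prior + exceedances
        beta_post = beta_prior + (n_obs - exceedances)
        posterior = stats.beta(alpha_post, beta_post)

        # Maximum-probability rule over discretized warning levels.
        levels = {"light": (0.0, 0.2), "medium": (0.2, 0.4), "severe": (0.4, 1.0)}
        probs = {name: posterior.cdf(b) - posterior.cdf(a)
                 for name, (a, b) in levels.items()}
        print(max(probs, key=probs.get), probs)

    The warning level with the largest posterior probability mass is issued, which is exactly the maximum probability rule the abstract describes.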

  18. Soliton excitations in polyacetylene and relativistic field theory models

    International Nuclear Information System (INIS)

    Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM

    1982-01-01

    A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin 1/2 and charge ±e. In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological Φ⁴ and continuum electron-phonon models, and the structure of the fully quantum versus mean field theories. (orig.)

  19. Classical nucleation theory in the phase-field crystal model

    Science.gov (United States)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation takes place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.

  20. Functional capabilities of the breadboard model of SIDRA satellite-borne instrument

    International Nuclear Information System (INIS)

    Dudnik, O.V.; Kurbatov, E.V.; Titov, K.G.; Prieto, M.; Sanchez, S.; Sylwester, J.; Gburek, S.; Podgorski, P.

    2013-01-01

    This paper presents the structure, principles of operation and functional capabilities of the breadboard model of the SIDRA compact satellite-borne instrument. SIDRA is intended for monitoring fluxes of high-energy charged particles under outer-space conditions. We present the reasons to develop a particle spectrometer and we list the main objectives to be achieved with the help of this instrument. The paper describes the major specifications of the analog and digital signal processing units of the breadboard model. A specially designed and developed data processing module based on the Actel ProAsic3E A3PE3000 FPGA is presented and compared with the all-in-one digital signal processing board based on the Xilinx Spartan 3 XC3S1500 FPGA.

  1. Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks

    International Nuclear Information System (INIS)

    Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.

    2005-01-01

    Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)

  2. Hypersurface Homogeneous Cosmological Model in Modified Theory of Gravitation

    Science.gov (United States)

    Katore, S. D.; Hatkar, S. P.; Baxi, R. J.

    2016-12-01

    We study a hypersurface homogeneous space-time in the framework of the f(R, T) theory of gravitation in the presence of a perfect fluid. Exact solutions of field equations are obtained for exponential and power law volumetric expansions. We also solve the field equations by assuming the proportionality relation between the shear scalar (σ) and the expansion scalar (θ). It is observed that in the exponential model, the universe approaches isotropy at large time (late universe). The investigated model is notably accelerating and expanding. The physical and geometrical properties of the investigated model are also discussed.

  3. Categories of relations as models of quantum theory

    Directory of Open Access Journals (Sweden)

    Chris Heunen

    2015-11-01

    Categories of relations over a regular category form a family of models of quantum theory. Using regular logic, many properties of relations over sets lift to these models, including the correspondence between Frobenius structures and internal groupoids. Over compact Hausdorff spaces, this lifting gives continuous symmetric encryption. Over a regular Mal'cev category, this correspondence gives a characterization of categories of completely positive maps, enabling the formulation of quantum features. These models are closer to Hilbert spaces than relations over sets in several respects: Heisenberg uncertainty, impossibility of broadcasting, and behavedness of rank one morphisms.

  4. Massive mu pair production in a vector field theory model

    CERN Document Server

    Halliday, I G

    1976-01-01

    Massive electrodynamics is treated as a model for the production of massive mu pairs in high-energy hadronic collisions. The dominant diagrams in perturbation theory are identified and analyzed. These graphs have an eikonal structure which leads to enormous cancellations in the two-particle inclusive cross section but not in the n-particle production cross sections. Under the assumption that these cancellations are complete, a Drell-Yan structure appears in the inclusive cross section but the particles accompanying the mu pairs have a very different structure compared to the parton model. The pionization region is no longer empty of particles as in single parton models. (10 refs).

  5. Supersymmetric sigma models and composite Yang-Mills theory

    International Nuclear Information System (INIS)

    Lukierski, J.

    1980-04-01

    We describe two types of supersymmetric sigma models: with field values in a supercoset space, and with superfields. The notion of a Riemannian symmetric pair (H, G/H) is generalized to supergroups. Using the supercoset approach, the superconformal-invariant model of composite U(n) Yang-Mills fields is introduced. In the framework of the superfield approach we present in some detail two versions of the composite N=1 supersymmetric Yang-Mills theory in four dimensions with U(n) and U(m) x U(n) local invariance. We argue that the superfield sigma models in particular can be used for the description of pre-QCD supersymmetric dynamics. (author)

  6. Approximate models for broken clouds in stochastic radiative transfer theory

    International Nuclear Information System (INIS)

    Doicu, Adrian; Efremenko, Dmitry S.; Loyola, Diego; Trautmann, Thomas

    2014-01-01

    This paper presents approximate models in stochastic radiative transfer theory. The independent column approximation and its modified version with a solar source computed in a full three-dimensional atmosphere are formulated in a stochastic framework and for arbitrary cloud statistics. The nth-order stochastic models describing the independent column approximations are equivalent to the nth-order stochastic models for the original radiance fields in which the gradient vectors are neglected. Fast approximate models are further derived on the basis of zeroth-order stochastic models and the independent column approximation. The so-called “internal mixing” models assume a combination of the optical properties of the cloud and the clear sky, while the “external mixing” models assume a combination of the radiances corresponding to completely overcast and clear skies. A consistent treatment of internal and external mixing models is provided, and a new parameterization of the closure coefficient in the effective thickness approximation is given. An efficient computation of the closure coefficient for internal mixing models, using a previously derived vector stochastic model as a reference, is also presented. Equipped with appropriate look-up tables for the closure coefficient, these models can easily be integrated into operational trace gas retrieval systems that exploit absorption features in the near-IR solar spectrum. - Highlights: • Independent column approximation in a stochastic setting. • Fast internal and external mixing models for total and diffuse radiances. • Efficient optimization of internal mixing models to match reference models
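
    As a plain-language gloss (this editor's restatement, with all symbols assumed): in external mixing the radiances themselves are averaged over cloud fraction, while in internal mixing the optical thicknesses are averaged and a closure coefficient rescales the cloud contribution:

      \[
        I_{\mathrm{ext}} = \bar{c}\, I_{\mathrm{ovc}} + (1-\bar{c})\, I_{\mathrm{clr}},
        \qquad
        \tau_{\mathrm{int}} = \gamma\, \bar{c}\, \tau_{\mathrm{cld}} + \tau_{\mathrm{clr}},
      \]
      % \bar{c}: mean cloud fraction; \gamma: closure coefficient of the
      % effective thickness approximation.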

  7. Modeling 13.3nm Fe XXIII Flare Emissions Using the GOES-R EXIS Instrument

    Science.gov (United States)

    Rook, H.; Thiemann, E.

    2017-12-01

    The solar EUV spectrum is dominated by atomic transitions in ionized atoms in the solar atmosphere. As solar flares evolve, plasma temperatures and densities change, influencing abundances of various ions, changing intensities of different EUV wavelengths observed from the sun. Quantifying solar flare spectral irradiance is important for constraining models of Earth's atmosphere, improving communications quality, and controlling satellite navigation. However, high time cadence measurements of flare irradiance across the entire EUV spectrum were not available prior to the launch of SDO. The EVE MEGS-A instrument aboard SDO collected 0.1nm EUV spectrum data from 2010 until 2014, when the instrument failed. No current or future instrument is capable of similar high resolution and time cadence EUV observation. This necessitates a full EUV spectrum model to study EUV phenomena at Earth. It has been recently demonstrated that one hot flare EUV line, such as the 13.3nm Fe XXIII line, can be used to model cooler flare EUV line emissions, filling the role of MEGS-A. Since unblended measurements of Fe XXIII are typically unavailable, a proxy for the Fe XXIII line must be found. In this study, we construct two models of this line, first using the GOES 0.1-0.8nm soft x-ray (SXR) channel as the Fe XXIII proxy, and second using a physics-based model dependent on GOES emission measure and temperature data. We determine that the more sophisticated physics-based model shows better agreement with Fe XXIII measurements, although the simple proxy model also performs well. We also conclude that the high correlation between Fe XXIII emissions and the GOES 0.1-0.8nm band is because both emissions tend to peak near the GOES emission measure peak despite large differences in their contribution functions.
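
    As a sketch of the two approaches compared here (names and numbers below are this editor's illustrative assumptions, not values from the paper; g23 stands in for a tabulated Fe XXIII contribution function, e.g. from CHIANTI):

      import numpy as np

      def fe23_proxy(goes_xrs, k):
          """Simple proxy model: Fe XXIII irradiance scaled from GOES 0.1-0.8 nm SXR."""
          return k * goes_xrs  # k fit by regression against available Fe XXIII data

      def fe23_physics(em, temp, g23):
          """Physics-based model: irradiance from GOES emission measure and temperature.

          em   : flare emission measure (with an implicit geometry factor)
          temp : isothermal flare temperature [K]
          g23  : callable contribution function G(T) for the 13.3 nm line
          """
          return em * g23(temp)

      # toy usage with a made-up Gaussian-in-log-T contribution function
      g23 = lambda T: 1e-24 * np.exp(-0.5 * ((np.log10(T) - 7.1) / 0.15) ** 2)
      print(fe23_proxy(goes_xrs=1e-5, k=3.0))
      print(fe23_physics(em=1e30, temp=12e6, g23=g23))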

  8. A historical overview of nuclear structure studies in Strasbourg Laboratories: instrumentation, measurements and theory modelling—hand-in-hand

    Science.gov (United States)

    Beck, F. A.

    2018-05-01

    This article overviews a long period of important evolution in nuclear structure research in the Strasbourg laboratories, focussed on tracking ever weaker experimental signals carrying ever more important physics messages. In particular, we address the search for signatures of the collective behaviour of the nucleus, as suggested in the early works of Bohr, Mottelson and Rainwater, at high and very high angular momenta, as well as the competition between collective and non-collective excitation modes. These ambitious goals could be achieved only by constructing powerful spectrometers and developing the related detectors, electronics and data acquisition systems. The theoretical modelling developed in parallel provided essential guidance when choosing the right experiments and optimising their realisation. Theory calculations were equally helpful in interpreting the results of experiments, leading to a more complete understanding of the underlying physics. Moreover, thanks to the development of heavy-ion accelerators, the Strasbourg centre was a place where the paths of many experimenters from both Western and Central Europe crossed, and the site of the gradual development of ever more sophisticated European gamma-spectrometers, built in collaboration with a growing number of laboratories and countries, allowing frontier-level studies of nuclear behaviour at very high angular momenta.

  9. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We demonstrated that the model can avoid all the phenomenological constraints while avoiding excessive fine-tuning, and we studied its implications for dark matter physics and collider physics. I proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry-breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories, and a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons, and a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We also studied whether evidence of the multiverse can be found.

  10. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  11. Goal-directed behaviour and instrumental devaluation: a neural system-level computational model

    Directory of Open Access Journals (Sweden)

    Francesco Mannella

    2016-10-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviours guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on, and shows the computational soundness of, the hypothesis that the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or 'action-outcomes', e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex, acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) the three basal ganglia-cortical loops, selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and integrates the results of different devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behaviour.

  12. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Science.gov (United States)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives the inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have conducted for TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing and pointing block parameters calculated from observation geometry. This approach is considered an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software helps us to effectively handle the high density of planned orbits with an increasing volume of scientific data, and to successfully meet opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs with the flight dynamics products to generate the pointing requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.
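
    A minimal sketch of what such a dynamic observation type could look like, expanded into a timeline entry with its resource envelope (all names and figures are this editor's hypothetical assumptions, not the actual TGO planning tool):

      from dataclasses import dataclass

      @dataclass
      class ObservationType:
          """Hypothetical dynamic observation type (names illustrative only)."""
          name: str
          data_rate_kbps: float   # instrument data rate while observing
          power_w: float          # instrument power draw

          def timeline_entry(self, start_s: float, duration_s: float):
              """Expand the abstract observation into a timeline entry with resources."""
              return {
                  "obs": self.name,
                  "start_s": start_s,
                  "end_s": start_s + duration_s,
                  "data_volume_mbit": self.data_rate_kbps * duration_s / 1000.0,
                  "energy_wh": self.power_w * duration_s / 3600.0,
              }

      nadir = ObservationType("NADIR_SCIENCE", data_rate_kbps=120.0, power_w=45.0)
      print(nadir.timeline_entry(start_s=0.0, duration_s=1800.0))  # one half-orbit slot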

  13. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  14. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
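
    As a concrete illustration of the technique (a simplified sketch, not the authors' implementation: a single 2PL item, abilities treated as known, and the discrimination parameter profiled out numerically):

      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import chi2

      rng = np.random.default_rng(0)
      theta = rng.normal(size=500)          # known abilities (simplification)
      a_true, b_true = 1.2, 0.3
      p = 1 / (1 + np.exp(-a_true * (theta - b_true)))
      y = rng.binomial(1, p)

      def nll(a, b):                         # negative log-likelihood of the 2PL item
          q = 1 / (1 + np.exp(-a * (theta - b)))
          return -np.sum(y * np.log(q) + (1 - y) * np.log(1 - q))

      def profile(b):                        # profile out the nuisance parameter a
          return minimize_scalar(lambda a: nll(a, b), bounds=(0.05, 5), method="bounded").fun

      b_hat = minimize_scalar(profile, bounds=(-3, 3), method="bounded").x
      cut = profile(b_hat) + chi2.ppf(0.95, df=1) / 2   # likelihood-ratio cutoff

      # scan for the CI endpoints where the profile crosses the cutoff
      grid = np.linspace(b_hat - 1.5, b_hat + 1.5, 301)
      inside = np.array([profile(b) <= cut for b in grid])
      print("95% PL CI for b: [%.3f, %.3f]" % (grid[inside][0], grid[inside][-1]))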

  15. The QCD model of hadron cores of the meson theory

    International Nuclear Information System (INIS)

    Pokrovskii, Y.E.

    1985-01-01

    It was shown that in the previously proposed QCD model of hadron cores, the exchange and self-energy contributions of the virtual quark-antiquark-gluon cloud outside a bag whose radius coincides with the hadron core radius of the meson theory (∼0.4 fm) have been taken into account at the phenomenological level. Simulation of this cloud by the meson field results in realistic estimates of the nucleon's electroweak properties, the momentum fractions carried by gluons, quarks and antiquarks, and hadron-hadron interaction cross-sections within a wide range of energies. The authors note that the QCD hadron core model proposed earlier not only realistically reproduces the hadron masses, but also self-consistently reflects the main elements of the structure and interaction of hadrons, with the quark-gluon bag radius (R ∼ 0.4 fm) being close to the meson theory core radius.

  16. Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic

    DEFF Research Database (Denmark)

    Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe

    2008-01-01

    Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results...

  17. Design, Modelling and Teleoperation of a 2 mm Diameter Compliant Instrument for the da Vinci Platform.

    Science.gov (United States)

    Francis, P; Eastwood, K W; Bodani, V; Looi, T; Drake, J M

    2018-05-07

    This work explores the feasibility of creating and accurately controlling an instrument for robotic surgery with a 2 mm diameter and a three degree-of-freedom (DoF) wrist which is compatible with the da Vinci platform. The instrument's wrist is composed of a two DoF bending notched-nitinol tube pattern, for which a kinematic model has been developed. A base mechanism for controlling the wrist is designed for integration with the da Vinci Research Kit. A basic teleoperation task is successfully performed using two of the miniature instruments. The performance and accuracy of the instrument suggest that creating and accurately controlling a 2 mm diameter instrument is feasible and the design and modelling proposed in this work provide a basis for future miniature instrument development.
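
    The paper's own kinematic model is not reproduced in this record; as a generic placeholder, the planar constant-curvature arc, the usual first-order model for continuum and notched-tube wrists, maps curvature and arc length to tip position:

      import numpy as np

      def cc_tip(kappa, L):
          """Tip position of a planar constant-curvature arc: length L, curvature kappa."""
          if abs(kappa) < 1e-9:
              return np.array([0.0, L])          # straight segment
          return np.array([(1 - np.cos(kappa * L)) / kappa,
                           np.sin(kappa * L) / kappa])

      print(cc_tip(kappa=40.0, L=0.01))  # a 10 mm wrist bent to ~0.4 rad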

  18. Designing means and specifications for model FT-619 kidney function instrument

    International Nuclear Information System (INIS)

    Yu Yongding

    1988-04-01

    In this paper, it is pointed out that the model FT-619 Kidney Function Equipment is a new cost-effective nuclear medicine instrument which takes a leading position in China. The performance of the model FT-619, especially the lead-collimated scintillation detector, has reached the same level as advanced equipment on the world market. The article also describes in detail how the design of the lead collimator and the shielding, as well as the detection efficiency, has achieved an optimum level, and compares the instrument with foreign products.

  19. Thermal Modeling of the Mars Reconnaissance Orbiter's Solar Panel and Instruments during Aerobraking

    Science.gov (United States)

    Dec, John A.; Gasbarre, Joseph F.; Amundsen, Ruth M.

    2007-01-01

    The Mars Reconnaissance Orbiter (MRO) launched on August 12, 2005 and started aerobraking at Mars in March 2006. During the spacecraft's design phase, thermal models of the solar panels and instruments were developed to determine which components would be the most limiting thermally during aerobraking. Once the most limiting components were determined, thermal limits in terms of heat rate were established. Advanced thermal modeling techniques were developed utilizing Thermal Desktop and Patran Thermal. Heat transfer coefficients were calculated using a Direct Simulation Monte Carlo technique. Analysis established that the solar panels were the most limiting components during the aerobraking phase of the mission.
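
    For context, aerobraking heat-rate limits of this kind are conventionally expressed through the free-molecular relation below (a standard form with symbols assumed by this editor, not taken from the paper):

      \[
        q = \tfrac{1}{2}\, C_H\, \rho\, v^3 ,
      \]
      % \rho: atmospheric density; v: spacecraft velocity relative to the atmosphere;
      % C_H: heat-transfer (accommodation) coefficient, here obtained from DSMC.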

  20. SIMP model at NNLO in chiral perturbation theory

    DEFF Research Database (Denmark)

    Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.

    2015-01-01

    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles ... with phenomenological constraints challenging the viability of the simplest realisation of the strongly interacting massive particle (SIMP) paradigm.

  1. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  2. A model theory for tachyons in two dimensions

    International Nuclear Information System (INIS)

    Recami, E.; Rodrigues, W.A.

    1985-01-01

    The paper is divided into two parts, the first of which has nothing to do with tachyons. In fact, to prepare the ground, part one (sect. 2) shows that special relativity, even without tachyons, can be given a form such as to describe both particles and antiparticles. Part two confines itself to a model theory in two dimensions, for the reasons stated in sect. 3.

  3. A realistic model for quantum theory with a locality property

    International Nuclear Information System (INIS)

    Eberhard, P.H.

    1987-04-01

    A model reproducing the predictions of relativistic quantum theory to any desired degree of accuracy is described in this paper. It involves quantities that are independent of the observer's knowledge, and therefore can be called real, and which are defined at each point in space, and therefore can be called local in a rudimentary sense. It involves faster-than-light, but not instantaneous, action at a distance.

  4. Theory, Modeling and Simulation Annual Report 2000; FINAL

    International Nuclear Information System (INIS)

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-01-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems

  5. Properties of lattice gauge theory models at low temperatures

    International Nuclear Information System (INIS)

    Mack, G.

    1980-01-01

    The Z(N) theory of quark confinement is discussed, along with how fluctuations of Z(N) gauge fields may continue to be important in the continuum limit. The existence of a model in four dimensions is pointed out in which confinement of (scalar) quarks can be shown to persist in the continuum limit. This article is based on the author's Cargese lectures 1979. Some of its results are published here for the first time. (orig.)

  6. Field theory of large amplitude collective motion. A schematic model

    International Nuclear Information System (INIS)

    Reinhardt, H.

    1978-01-01

    By using path integral methods the equation for large amplitude collective motion for a schematic two-level model is derived. The original fermion theory is reformulated in terms of a collective (Bose) field. The classical equation of motion for the collective field coincides with the time-dependent Hartree-Fock equation. Its classical solution is quantized by means of the field-theoretical generalization of the WKB method. (author)

  7. Stability Analysis for Car Following Model Based on Control Theory

    International Nuclear Information System (INIS)

    Meng Xiang-Pei; Li Zhi-Peng; Ge Hong-Xia

    2014-01-01

    Stability analysis is one of the key issues in car-following theory. The stability analysis with a Lyapunov function for the two-velocity-difference car-following model (TVDM for short) is conducted, and a control method to suppress traffic congestion is introduced. Numerical simulations are given and the results are consistent with the theoretical analysis.

  8. Analytical theory of Doppler reflectometry in slab plasma model

    Energy Technology Data Exchange (ETDEWEB)

    Gusakov, E.Z.; Surkov, A.V. [Ioffe Institute, Politekhnicheskaya 26, St. Petersburg (Russian Federation)

    2004-07-01

    Doppler reflectometry is considered in a slab plasma model within the framework of analytical theory. The locality of the diagnostics is analyzed for both regimes: linear and nonlinear in turbulence amplitude. Toroidal antenna focusing of the probing beam to the cut-off is proposed and discussed as a method to increase the spatial resolution of the diagnostics. It is shown that even in the case of a nonlinear regime of multiple scattering, the diagnostics can be used for an estimation (with certain accuracy) of the plasma poloidal rotation profile. (authors)

  9. Spherically symmetric star model in the gravitational gauge theory

    Energy Technology Data Exchange (ETDEWEB)

    Tsou, C; Ch'en, S; Ho, T; Kuo, H [Peking Observatory, China]

    1976-12-01

    It is shown that a star model, which is black hole-free and singularity-free, can be obtained naturally in the gravitational gauge theory, provided the space-time is torsion-free and the matter is spinless. The conclusion in a sense shows that the discussions about the black hole and the singularity based on general relativity may not describe nature correctly.

  10. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier

    2016-12-01

    Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent example of this reimagining: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  11. Thermal Testing and Model Correlation for Advanced Topographic Laser Altimeter Instrument (ATLAS)

    Science.gov (United States)

    Patel, Deepak

    2016-01-01

    The Advanced Topographic Laser Altimeter System (ATLAS), part of the Ice, Cloud and Land Elevation Satellite-2 (ICESat-2), is an upcoming Earth science mission focusing on the effects of climate change. The flight instrument passed all environmental testing at GSFC (Goddard Space Flight Center) and is now ready to be shipped to the spacecraft vendor for integration and testing. This paper covers the analysis leading up to the test setup for ATLAS thermal testing, as well as model correlation to flight predictions. The test setup analysis section covers areas where ATLAS could not meet flight-like conditions and the associated limitations. The model correlation section walks through the changes that had to be made to the thermal model in order to match test results. The correlated model will then be integrated with the spacecraft model for on-orbit predictions.

  12. Incorporation of Markov reliability models for digital instrumentation and control systems into existing PRAs

    International Nuclear Information System (INIS)

    Bucci, P.; Mangan, L. A.; Kirschenbaum, J.; Mandelli, D.; Aldemir, T.; Arndt, S. A.

    2006-01-01

    Markov models have the ability to capture the statistical dependence between failure events that can arise in the presence of complex dynamic interactions between components of digital instrumentation and control systems. One obstacle to the use of such models in an existing probabilistic risk assessment (PRA) is that most of the currently available PRA software is based on the static event-tree/fault-tree methodology which often cannot represent such interactions. We present an approach to the integration of Markov reliability models into existing PRAs by describing the Markov model of a digital steam generator feedwater level control system, how dynamic event trees (DETs) can be generated from the model, and how the DETs can be incorporated into an existing PRA with the SAPHIRE software. (authors)
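
    A toy example of the kind of Markov reliability model meant here (states and rates are this editor's illustration, not the feedwater control system of the paper), solved by matrix exponentiation:

      import numpy as np
      from scipy.linalg import expm

      # 3-state Markov model of a digital controller:
      # 0 = OK, 1 = degraded (detected fault), 2 = failed (absorbing).
      lam, mu, gam = 1e-4, 1e-2, 5e-5     # fail, repair, common-cause rates [1/h]
      Q = np.array([[-(lam + gam), lam,         gam ],
                    [ mu,         -(mu + lam),  lam ],
                    [ 0.0,         0.0,         0.0 ]])   # generator (rows sum to 0)

      p0 = np.array([1.0, 0.0, 0.0])
      for t in (24.0, 720.0, 8760.0):                     # 1 day, 1 month, 1 year
          print(t, p0 @ expm(Q * t))                      # state probabilities at t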

  13. Using the Rasch measurement model to design a report writing assessment instrument.

    Science.gov (United States)

    Carlson, Wayne R

    2013-01-01

    This paper describes how the Rasch measurement model was used to develop an assessment instrument designed to measure student ability to write law enforcement incident and investigative reports. The ability to write reports is a requirement of all law enforcement recruits in the state of Michigan and is a part of the state's mandatory basic training curriculum, which is promulgated by the Michigan Commission on Law Enforcement Standards (MCOLES). Recently, MCOLES conducted research to modernize its training and testing in the area of report writing. A structured validation process was used, which included: a) an examination of the job tasks of a patrol officer, b) input from content experts, c) a review of the professional research, and d) the creation of an instrument to measure student competency. The Rasch model addressed several measurement principles that were central to construct validity, which were particularly useful for assessing student performances. Based on the results of the report writing validation project, the state established a legitimate connectivity between the report writing standard and the essential job functions of a patrol officer in Michigan. The project also produced an authentic instrument for measuring minimum levels of report writing competency, which generated results that are valid for inferences of student ability. Ultimately, the state of Michigan must ensure the safety of its citizens by licensing only those patrol officers who possess a minimum level of core competency. Maintaining the validity and reliability of both the training and testing processes can ensure that the system for producing such candidates functions as intended.
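
    The dichotomous Rasch model underlying such instruments is compact enough to state inline (standard form, not specific to the MCOLES instrument):

      import numpy as np

      def rasch_p(theta, b):
          """Probability that a student of ability theta succeeds on an item of difficulty b."""
          return 1 / (1 + np.exp(-(theta - b)))

      # e.g. a borderline candidate (theta = 0.0) on items of varying difficulty
      print([round(rasch_p(0.0, b), 2) for b in (-1.0, 0.0, 1.0)])  # [0.73, 0.5, 0.27]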

  14. Transcultural adaptation into Portuguese of an instrument for pain evaluation based on the biopsychosocial model

    Directory of Open Access Journals (Sweden)

    Monique Rocha Peixoto dos Santos

    Introduction: Pain is an individual experience influenced by multiple interacting factors. The “biopsychosocial” care model has gained popularity in response to growing research evidence indicating the influence of biological, psychological, and social factors on the pain experience. The implementation of this model is a challenge in the practice of the health professional. Objective: To perform the transcultural adaptation of the SCEBS method into Brazilian Portuguese. Methods: The instrument was translated and applied to 50 healthy subjects and 50 participants with non-specific chronic pain in the spine. The process of cross-cultural adaptation included the following steps: transcultural adaptation, content analysis of the scale, pre-test, revision, back-translation review, cross-cultural adaptation, revised text correction and final report. Results: The translated and adapted 51-item Portuguese version of the SCEBS method produced an instrument called SCEBS-BR. In the assessment by the target population, 50 adult users of the Brazilian Unified Health System answered the questionnaire and showed good understanding of the instrument on the verbal rating scale. Conclusion: The SCEBS-BR proved easy to understand, showing good semantic validation regardless of schooling level or age, and can be considered adequate for clinical use.

  15. Noncommutative gauge theory and symmetry breaking in matrix models

    International Nuclear Information System (INIS)

    Grosse, Harald; Steinacker, Harold; Lizzi, Fedele

    2010-01-01

    We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c x SU(2)_L x U(1)_Q [resp. SU(3)_c x U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.

  16. Development and design of a late-model fitness test instrument based on LabView

    Science.gov (United States)

    Xie, Ying; Wu, Feiqing

    2010-12-01

    Undergraduates are pioneers of China's modernization program and undertake the historic mission of rejuvenating our nation in the 21st century, so their physical fitness is vital. A smart fitness test system can help them understand their fitness and health conditions, so that they can choose more suitable approaches and make practical exercise plans according to their own situation. Following future trends, a late-model fitness test instrument based on LabView has been designed to remedy the defects of today's instruments. The system hardware consists of five types of sensors with their peripheral circuits, an NI USB-6251 acquisition card and a computer, while the system software, built on LabView, includes modules for user registration, data acquisition, data processing and display, and data storage. The system, featuring a modular and open structure, can be revised according to actual needs. Test results have verified the system's stability and reliability.

  17. Measuring and modeling salience with the theory of visual attention.

    Science.gov (United States)

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2017-08-01

    For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
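
    Schematically, the winning model is a power law relating feature contrast to salience (notation assumed by this editor, not the authors' own):

      \[
        s(\Delta) = a\, \Delta^{\,b},
      \]
      % \Delta: orientation or luminance contrast of the target; a, b: fitted
      % parameters; alternative growth functions are compared via Bayes factors.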

  18. Application of the evolution theory in modelling of innovation diffusion

    Directory of Open Access Journals (Sweden)

    Krstić Milan

    2016-01-01

    The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In that sense, the so-called 'memetic evolution' is today widely accepted. Memes represent a complex adaptive system, where one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information which can be identified and used to explain the evolution process. The field of innovation, among others, has proved to be a suitable area where the theory of evolution can be successfully applied. In this work the authors have started from the assumption that it is also possible to apply the theory of evolution to the modelling of the process of innovation diffusion. Based on the conducted theoretical research, the authors conclude that the process of innovation diffusion, in the interpretation of a 'meme', is actually the process of imitation of the 'meme' of innovation. Since during the process of their replication certain 'memes' show greater success than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their manifestations are of key importance in the sense of longevity, fruitfulness and faithful replication. The results of the conducted research confirm the assumption that the theory of evolution can be applied to innovation diffusion with the help of innovation 'memes', which opens up perspectives for new research on the subject.

  9. H_3^+ WZNW model from Liouville field theory

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Schomerus, Volker

    2007-01-01

    There exists an intriguing relation between genus zero correlation functions in the H_3^+ WZNW model and in Liouville field theory. We provide a path integral derivation of the correspondence and then use our new approach to generalize the relation to surfaces of arbitrary genus g. In particular we determine the correlation functions of N primary fields in the WZNW model explicitly through Liouville correlators with N+2g-2 additional insertions of certain degenerate fields. The paper concludes with a list of interesting further extensions and a few comments on the relation to the geometric Langlands program.

  20. A model for hot electron phenomena: Theory and general results

    International Nuclear Information System (INIS)

    Carrillo, J.L.; Rodriquez, M.A.

    1988-10-01

    We propose a model for the description of hot electron phenomena in semiconductors. Based on this model we are able to reproduce accurately the main characteristics observed in experiments on electric field transport, optical absorption, steady-state photoluminescence and relaxation processes. Our theory contains no free or adjustable parameters, is computationally very fast, and incorporates the main collision mechanisms, including screening and phonon heating effects. Our description is based on a set of nonlinear rate equations in which the interactions are represented by coupling coefficients or effective frequencies. We calculate these coefficients from the characteristic constants and the band structure of the material. (author). 22 refs, 5 figs, 1 tab

  1. A possibilistic uncertainty model in classical reliability theory

    International Nuclear Information System (INIS)

    De Cooman, G.; Capelle, B.

    1994-01-01

    The authors argue that a possibilistic uncertainty model can be used to represent linguistic uncertainty about the states of a system and of its components. Furthermore, the basic properties of the application of this model to classical reliability theory are studied. The notion of the possibilistic reliability of a system or a component is defined. Based on the concept of a binary structure function, the important notion of a possibilistic function is introduced. It allows one to calculate the possibilistic reliability of a system in terms of the possibilistic reliabilities of its components.
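
    To give the flavor of such a calculus, one common possibilistic convention (assumed here by this editor, not quoted from the paper) evaluates series systems by the minimum and parallel systems by the maximum of the component possibilistic reliabilities:

      def series(rels):
          """Possibilistic reliability of a series system (weakest component dominates)."""
          return min(rels)

      def parallel(rels):
          """Possibilistic reliability of a parallel (redundant) system."""
          return max(rels)

      comps = [0.9, 0.7, 0.95]               # linguistic grades mapped to [0, 1]
      print(series(comps), parallel(comps))  # 0.7 0.95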

  2. Theory and Circuit Model for Lossy Coaxial Transmission Line

    Energy Technology Data Exchange (ETDEWEB)

    Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert

    2017-04-01

    The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
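
    For orientation, the classical low-loss approximations that such a model refines can be sketched as follows (textbook formulas, not the paper's new results; a and b are the inner and outer conductor radii, sigma the wall conductivity):

      import numpy as np

      mu0, eps0 = 4e-7 * np.pi, 8.854e-12

      def coax_alpha(f, a, b, sigma, eps_r=1.0):
          """Conductor-loss attenuation [Np/m] of a low-loss coax at frequency f [Hz]."""
          delta = 1.0 / np.sqrt(np.pi * f * mu0 * sigma)              # skin depth
          Rp = (1.0 / (2 * np.pi * sigma * delta)) * (1 / a + 1 / b)  # R' per metre
          Z0 = np.sqrt(mu0 / (eps0 * eps_r)) * np.log(b / a) / (2 * np.pi)
          return Rp / (2 * Z0)

      # RG-58-like geometry with copper walls, evaluated at 1 GHz
      print(coax_alpha(1e9, a=0.45e-3, b=1.47e-3, sigma=5.8e7))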

  3. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economics, electronics, mechanical engineering, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society, highlighting powerful techniques being developed to meet the challenges that stem from the area of nonlinear...

  4. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    Science.gov (United States)

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

    Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, but specific instruments are rare and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, pre-testing and quantitative statistical procedures. 146 inpatients with CHD provided the data, measuring QOL three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and also G studies and D studies of Generalizability Theory analysis. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall scale and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G-coefficients and indexes of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity, reliability, and moderate responsiveness, and can be used as a quality of life instrument for patients with CHD. However, in order to obtain better reliability, the number of items for the social domain should be increased or the items' quality, not quantity, should be improved.
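
    The two statistics carrying most of this validation, Cronbach's alpha and the standardized response mean, are easy to state in code (a generic sketch with toy data, not the study's data):

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha from an (n_subjects, n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                                / items.sum(axis=1).var(ddof=1))

      def srm(pre, post):
          """Standardized response mean: mean change / SD of change."""
          d = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
          return d.mean() / d.std(ddof=1)

      rng = np.random.default_rng(1)
      scores = rng.integers(1, 6, size=(146, 10))      # toy 5-point Likert responses
      print(round(cronbach_alpha(scores), 2))
      print(round(srm(scores[:, 0], scores[:, 0] + rng.normal(0.3, 1, 146)), 2))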

  5. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
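
    Schematically (symbols assumed by this editor), the refinement replaces the classical pipe-theory proportionality between leaf area and sapwood area with one between leaf area and pipe conductivity:

      \[
        A_l = k\, A_s \;\;\longrightarrow\;\; A_l = k'\, K_s,
      \]
      % A_l: leaf area; A_s: sapwood area; K_s: pipe (sapwood) conductivity;
      % k is constant in classical pipe theory, while k' responds to age,
      % stand density and climate through allocation and pipe radius.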

  6. Modelling of XCO2 Surfaces Based on Flight Tests of TanSat Instruments

    Directory of Open Access Journals (Sweden)

    Li Li Zhang

    2016-11-01

    The TanSat carbon satellite is to be launched at the end of 2016. In order to verify the performance of its instruments, a flight test of TanSat instruments was conducted in Jilin Province in September 2015. The flight test area covered a total of about 11,000 km2, and the underlying surface cover included several lakes, forest land, grassland, wetland, farmland, a thermal power plant and numerous cities and villages. We modeled the column-averaged dry-air mole fraction of atmospheric carbon dioxide (XCO2) surface based on flight test data which measured the near- and short-wave infrared (NIR) reflected solar radiation in the absorption bands at around 760 and 1610 nm. However, it is difficult to directly analyze the spatial distribution of XCO2 in the flight area using the limited flight test data, and the approximate surface of XCO2 obtained by regression modeling is not very accurate either. We therefore used the high accuracy surface modeling (HASM) platform to fill the gaps where there is no information on XCO2 in the flight test area, taking the approximate surface of XCO2 as its driving field and the XCO2 observations retrieved from the flight test as its optimum control constraints. High accuracy surfaces of XCO2 were constructed with HASM based on the flight's observations. The results showed that the mean XCO2 in the flight test area is about 400 ppm and that XCO2 over urban areas is much higher than in other places. Compared with OCO-2's XCO2, the mean difference is 0.7 ppm and the standard deviation is 0.95 ppm. Therefore, the modelling of the XCO2 surface based on the flight test of the TanSat instruments fell within an expected and acceptable range.

  7. Ares I Scale Model Acoustic Test Instrumentation for Acoustic and Pressure Measurements

    Science.gov (United States)

    Vargas, Magda B.; Counter, Douglas

    2011-01-01

    Ares I Scale Model Acoustic Test (ASMAT) is a 5% scale model test of the Ares I vehicle, launch pad and support structures, conducted at MSFC to verify acoustic and ignition environments and evaluate water suppression systems. Test design had to account for the fact that 5% scale measurements must be scaled to full scale, requiring high-frequency measurements, and that users had different frequencies of interest: for acoustics, 200-2,000 Hz full scale equals 4,000-40,000 Hz model scale; for the ignition transient, 0-100 Hz full scale equals 0-2,000 Hz model scale. Environment exposure included weather (heat, humidity, thunderstorms, rain, cold and snow) and test environments (plume impingement heat and pressure, and water deluge impingement). Several types of sensors were used to measure the environments, and different instrument mounts were used according to the location and exposure to the environment. This presentation addresses the observed effects of the selected sensors and mount design on the acoustic and pressure measurements.
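
    The quoted frequency bands follow directly from the geometric scale factor s = 0.05 (a restatement of the arithmetic above, not additional test data):

      \[
        f_{\text{model}} = \frac{f_{\text{full}}}{s} = 20\, f_{\text{full}},
      \]
      % e.g. 2{,}000 Hz full scale corresponds to 40{,}000 Hz at model scale.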

  8. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  9. Two problems from the theory of semiotic control models. I. Representations of semiotic models

    Energy Technology Data Exchange (ETDEWEB)

    Osipov, G S

    1981-11-01

    Two problems from the theory of semiotic control models are stated, namely the representation of models and their semantic analysis. Algebraic representation of semiotic models, covering of representations, and their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.

  10. General topology meets model theory, on p and t.

    Science.gov (United States)

    Malliaris, Maryanthe; Shelah, Saharon

    2013-08-13

    Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258-262] that the continuum is uncountable, and Hilbert's first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220-224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143-1148], Hilbert's first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen's introduction of forcing. The oldest and perhaps most famous of these is whether " p = t," which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29-46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241-255]. In this paper we explain how our work on the structure of Keisler's order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory.

  11. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to an item response theory (IRT) model is a necessary condition that must be assessed for further use of the items and the models that best fit ...

  12. Development and validation of the nasopharyngeal cancer scale among the system of quality of life instruments for cancer patients (QLICP-NA V2.0): combined classical test theory and generalizability theory.

    Science.gov (United States)

    Wu, Jiayuan; Hu, Liren; Zhang, Gaohua; Liang, Qilian; Meng, Qiong; Wan, Chonghua

    2016-08-01

    This research was designed to develop a nasopharyngeal cancer (NPC) scale based on the system of quality of life (QOL) instruments for cancer patients (QLICP-NA). The scale was developed using a modular approach and was evaluated by classical test and generalizability theories. Programmed decision procedures and theories on instrument development were applied to create QLICP-NA V2.0. A total of 121 NPC inpatients were assessed using QLICP-NA V2.0 to measure their QOL data from hospital admission until discharge. Scale validity, reliability, and responsiveness were evaluated by correlation, factor, parallel, multi-trait scaling, and t-test analyses, as well as by generalizability (G) and decision (D) studies of the generalizability theory. Results of multi-trait scaling, correlation, factor, and parallel analyses indicated that QLICP-NA V2.0 exhibited good construct validity. The significant difference in QOL between the treated and untreated NPC patients indicated a good clinical validity of the questionnaire. The internal consistency (α) and test-retest reliability coefficients (intra-class correlations) of each domain, as well as the overall scale, were all >0.70. Ceiling effects were not found in all domains and most facets, except for common side effects (24.8 %) in the domain of common symptoms and side effects, and tumor early symptoms (27.3 %) and therapeutic side effects (23.2 %) in the specific domain, while floor effects were absent in all domains and facets. The overall changes in the physical and social domains were significantly different between pre- and post-treatment, with a moderate effect size (standardized response mean) ranging from 0.21 to 0.27. QLICP-NA V2.0 exhibited reasonable degrees of validity, reliability, and responsiveness. However, this scale must be further improved before it can be used as a practical instrument to evaluate the QOL of NPC patients in China.

  13. Recognition of risk situations based on endoscopic instrument tracking and knowledge based situation modeling

    Science.gov (United States)

    Speidel, Stefanie; Sudra, Gunther; Senemaud, Julien; Drentschew, Maximilian; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2008-03-01

    Minimally invasive surgery has gained significantly in importance over the last decade due to its numerous advantages for the patient. The surgeon has to adopt special operating techniques and deal with difficulties like complex hand-eye coordination, a limited field of view and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing context-aware assistance using augmented reality (AR) techniques. In order to generate context-aware assistance it is necessary to recognize the current state of the intervention using intraoperatively gained sensor data and a model of the surgical intervention. In this paper we present the recognition of risk situations, in which the system warns the surgeon if an instrument gets too close to a risk structure. The context-aware assistance system starts with an image-based analysis to retrieve information from the endoscopic images. This information is classified and a semantic description is generated. The description is used to recognize the current state and launch an appropriate AR visualization. In detail, we present an automatic vision-based instrument tracking to obtain the positions of the instruments. Situation recognition is performed using a knowledge representation based on a description logic system. Two augmented reality visualization programs are realized to warn the surgeon if a risk situation occurs.
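
    The distance rule at the core of such a warning can be sketched in a few lines (illustrative threshold and toy geometry only; the paper's actual reasoning runs through a description-logic knowledge base rather than a bare distance check):

      import numpy as np

      def warn_if_close(tip_xyz, risk_pts, threshold_mm=5.0):
          """Flag a risk situation when the tracked instrument tip nears a risk structure.

          tip_xyz  : (3,) tracked tip position [mm], from the vision-based tracking
          risk_pts : (N, 3) surface points of the segmented risk structure [mm]
          """
          d = np.linalg.norm(risk_pts - tip_xyz, axis=1).min()
          return d < threshold_mm, d

      risk = np.random.default_rng(2).uniform(0, 50, size=(1000, 3))   # toy structure
      print(warn_if_close(np.array([25.0, 25.0, 25.0]), risk))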

  14. The predictive power of SIMION/SDS simulation software for modeling ion mobility spectrometry instruments

    Science.gov (United States)

    Lai, Hanh; McJunkin, Timothy R.; Miller, Carla J.; Scott, Jill R.; Almirall, José R.

    2008-09-01

    The combined use of SIMION 7.0 and the statistical diffusion simulation (SDS) user program, in conjunction with SolidWorks® and COSMOSFloWorks® fluid dynamics software, to model a complete commercial ion mobility spectrometer (IMS) was demonstrated for the first time and compared to experimental results for tests using compounds of immediate interest in the security industry (e.g., 2,4,6-trinitrotoluene, 2,7-dinitrofluorene, and cocaine). The aim of this research was to evaluate the predictive power of SIMION/SDS for application to IMS instruments. The simulation was evaluated against experimental results in three studies: (1) a drift:carrier gas flow rate study assesses the ability of SIMION/SDS to correctly predict the ion drift times; (2) a drift gas composition study evaluates the accuracy in predicting the resolution; (3) a gate width study compares the simulated peak shape and peak intensity with the experimental values. SIMION/SDS successfully predicted the correct drift time, intensity, and resolution trends for the operating parameters studied. Despite the need for estimations and assumptions in the construction of the simulated instrument, SIMION/SDS was able to predict the resolution between two ion species in air to within 3% accuracy. The preliminary success of IMS simulations using SIMION/SDS software holds great promise for the design of future instruments with enhanced performance.

  15. A model-theory for Tachyons in two dimensions

    International Nuclear Information System (INIS)

    Recami, E.; Rodriques, W.A. Jr.

    1986-01-01

    The subject of Tachyons, even if still speculative, may deserve some attention for reasons that can be divided into a few categories, two of which are as follows. First, the larger scheme that must be built up in order to incorporate space-like objects into relativistic theories allows a better understanding of many aspects of ordinary relativistic physics, even if Tachyons do not exist in our cosmos as ''asymptotically free'' objects. Second, superluminal classical objects can play a role in elementary particle interactions (perhaps even in astrophysics), and one may verify whether quantum-like behaviour is reproduced at a classical level when the possible existence of faster-than-light classical particles is taken into account. This paper shows that Special Relativity - even without tachyons - can be given a form which describes both particles and anti-particles. The paper is also confined to a ''model theory'' of Tachyons in two dimensions
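    For context (not stated in the abstract itself, but a form commonly quoted in this two-dimensional tachyon literature), the superluminal extension of the Lorentz transformation reads

    \[
    x' = \pm\,\frac{x - vt}{\sqrt{v^2/c^2 - 1}}\,, \qquad
    t' = \pm\,\frac{t - vx/c^2}{\sqrt{v^2/c^2 - 1}}\,, \qquad |v| > c ,
    \]

    which interchanges timelike and spacelike intervals, ds'^2 = -ds^2 — the formal reason such a ''model theory'' closes neatly only in two dimensions.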

  16. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED renormalization group evolution below the electroweak scale. (orig.)
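    The leading-log structure behind such a running module is compact: the Wilson coefficients obey 16π² dC/d ln μ = γᵀC, solved by a matrix exponential. A toy sketch — the 2×2 anomalous-dimension matrix below is invented for illustration and is not the Warsaw-basis ADM that DsixTools implements (transpose/sign conventions also vary between references):

```python
# Leading-log one-loop running of Wilson coefficients,
# dC/dlnmu = gamma^T C / (16 pi^2), via a matrix exponential.
# 'gamma' is a made-up 2x2 illustration, NOT the SMEFT ADM.
import numpy as np
from scipy.linalg import expm

gamma = np.array([[4.0, 0.5],
                  [0.2, -2.0]])          # hypothetical ADM entries
C_high = np.array([1.0, 0.0])            # Wilson coefficients at mu0

def run(C0, mu0, mu):
    t = np.log(mu / mu0)
    return expm(gamma.T * t / (16 * np.pi**2)) @ C0

print(run(C_high, mu0=1e3, mu=91.19))    # evolve from 1 TeV to the EW scale
```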

  17. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  18. Effective-field theory on the kinetic Ising model

    International Nuclear Information System (INIS)

    Shi Xiaoling; Wei Guozhu; Li Lin

    2008-01-01

    As an analytical method, the effective-field theory (EFT) is used to study the dynamical response of the kinetic Ising model in the presence of a sinusoidally oscillating field. The effective-field equations of motion of the average magnetization are given for the square lattice (Z=4) and the simple cubic lattice (Z=6), respectively. The dynamic order parameter, the hysteresis loop area and the dynamic correlation are calculated. In the field amplitude h0/ZJ versus temperature T/ZJ plane, the phase boundary separating the dynamically ordered and disordered phases has been drawn, and the dynamical tricritical point has been observed. We also compare the results of the EFT with those given by mean-field theory (MFT).
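    The MFT baseline against which the EFT is compared can be written down in a few lines: a Glauber-type relaxation of the average magnetization, dm/dt = -m + tanh[(ZJm + h0 sin ωt)/T], with the dynamic order parameter Q given by the period-averaged magnetization. A minimal sketch, with assumed parameter values:

```python
# Mean-field kinetic Ising dynamics under a sinusoidal field (illustrative
# MFT baseline only; the paper's EFT corrections are not implemented).
import numpy as np

def dynamic_order_parameter(T, h0, Z=4, J=1.0, omega=2*np.pi/100,
                            cycles=40, dt=0.05):
    """Period-averaged magnetization Q, measured after transients decay."""
    steps_per_cycle = int(2*np.pi/omega/dt)
    m, t, Q, n_meas = 1.0, 0.0, 0.0, 0
    for cycle in range(cycles):
        for _ in range(steps_per_cycle):
            m += dt * (-m + np.tanh((Z*J*m + h0*np.sin(omega*t)) / T))
            t += dt
            if cycle >= cycles // 2:          # second half: measurement
                Q += m * dt
                n_meas += 1
    return Q / (n_meas * dt)

print(dynamic_order_parameter(T=2.0, h0=0.5))   # |Q| > 0: dynamically ordered
print(dynamic_order_parameter(T=5.0, h0=0.5))   # Q ~ 0: dynamically disordered
```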

  19. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyzes the shortcomings of the classic capital market theories based on EMH and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptive dynamic system, and taking complexity science as the method for researching the operating laws of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal points and interrelationships of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper anticipates the future direction of complexity science in capital markets.

  20. Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model

    International Nuclear Information System (INIS)

    Szabo, Richard J; Tierz, Miguel

    2010-01-01

    We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S^2. We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S^3, and hence to q-deformed Yang-Mills theory on S^2. In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.

  1. Plane answers to complex questions the theory of linear models

    CERN Document Server

    Christensen, Ronald

    1987-01-01

    This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...

  2. Theory of thermoluminescence gamma dose response: The unified interaction model

    International Nuclear Information System (INIS)

    Horowitz, Y.S.

    2001-01-01

    We describe the development of a comprehensive theory of thermoluminescence (TL) dose response, the unified interaction model (UNIM). The UNIM is based on both radiation-absorption-stage and recombination-stage mechanisms and can describe dose response for heavy charged particles (in the framework of the extended track interaction model - ETIM) as well as for isotropically ionising gamma rays and electrons (in the framework of the TC/LC geminate recombination model) in a unified and self-consistent conceptual and mathematical formalism. A theory of optical absorption dose response is also incorporated in the UNIM to describe the radiation absorption stage. The UNIM is applied to the dose response supralinearity characteristics of LiF:Mg,Ti and is especially and uniquely successful in explaining the ionisation density dependence of the supralinearity of composite peak 5 in TLD-100. The UNIM is demonstrated to be capable of explaining either qualitatively or quantitatively all of the major features of TL dose response, with many of the variable parameters of the model strongly constrained by ancillary optical absorption and sensitisation measurements.

  3. Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model

    Science.gov (United States)

    Aktaruzzaman, Md; Plunkett, Margaret

    2016-01-01

    Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…

  4. Modelling non-ignorable missing data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cornelis A.W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  5. Modelling non-ignorable missing-data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cees A. W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  6. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    Science.gov (United States)

    2010-05-01

    … a modeling, simulation, and instrumentation (MS&I) environment. This methodology uses the DoDAF product set to document operational and systems … engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views used to …

  7. The pipe model theory half a century on: a review.

    Science.gov (United States)

    Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick

    2018-01-23

    More than half a century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and the corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the time of its publication: (1) What determines tree form, and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass be estimated in an individual plant, in a stand, or at even larger scales? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence, in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount, rather than the whole of it. The facts that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might cause the commonly
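    The quantitative core of the PMT, in the hydraulically refined reading stressed above, is a proportionality between supported foliage and the conductive (sapwood) cross-section rather than the whole stem section. A minimal sketch, with a hypothetical allometric constant k — real values vary with species, environment, and ontogeny, which is precisely the review's caveat:

```python
# Pipe-model-style foliage estimate: foliage scales with sapwood area,
# A_sap = A_total - A_heartwood. The constant k is a made-up placeholder.
import math

def foliage_mass_kg(stem_diam_cm: float, heart_diam_cm: float,
                    k_kg_per_cm2: float = 0.05) -> float:
    a_sap = math.pi / 4 * (stem_diam_cm**2 - heart_diam_cm**2)  # sapwood area
    return k_kg_per_cm2 * a_sap

print(foliage_mass_kg(stem_diam_cm=20, heart_diam_cm=12))  # ~10 kg of foliage
```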

  8. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks, including TFTR, and (b) the interpretation of existing and future RF probe data from TFTR. Addressing each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  9. Developing a workplace resilience instrument.

    Science.gov (United States)

    Mallak, Larry A; Yildiz, Mustafa

    2016-05-27

    Resilience benefits from the use of protective factors, as opposed to risk factors, which are associated with vulnerability. Considerable research and instrument development has been conducted in clinical settings for patients. The need existed for an instrument developed in a workplace setting to measure the resilience of employees. This study developed and tested a resilience instrument for employees in the workplace. The research instrument was distributed to executives and nurses working in hospital settings in the United States. A total of 540 completed and usable responses were obtained. The instrument contained an inventory of workplace resilience, a job stress questionnaire, and relevant demographics. The resilience items were written based on previous work by the lead author and inspired by Weick's [1] sense-making theory. A four-factor model yielded an instrument whose psychometric properties showed good model fit. Twenty items were retained for the resulting Workplace Resilience Instrument (WRI). Parallel analysis was conducted with successive iterations of exploratory and confirmatory factor analyses. Respondents were classified based on their employment with either a rural or an urban hospital. Executives had significantly higher WRI scores than nurses, controlling for gender. WRI scores were positively and significantly correlated with years of experience and with the Brief Job Stress Questionnaire. An instrument to measure individual resilience in the workplace (WRI) was developed. The WRI's four factors identify dimensions of workplace resilience for use in subsequent investigations: Active Problem-Solving, Team Efficacy, Confident Sense-Making, and Bricolage.
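    The factor-retention step mentioned here, Horn's parallel analysis, is easy to sketch: keep factors whose observed eigenvalues exceed the average eigenvalues obtained from random data of the same shape. The data below are simulated with a planted four-factor structure (echoing the WRI's four factors); the real survey data are of course not reproduced here:

```python
# Horn's parallel analysis on simulated 540 x 20 "survey" data.
import numpy as np

rng = np.random.default_rng(0)

def parallel_analysis(X: np.ndarray, n_sims: int = 100) -> int:
    """Number of factors whose eigenvalues beat the random-data average."""
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_sims):
        R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
        rand += np.sort(np.linalg.eigvalsh(R))[::-1]
    return int(np.sum(obs > rand / n_sims))

# plant a clean four-factor structure: 20 items, 5 loading on each factor
F = rng.standard_normal((540, 4))
L = np.kron(np.eye(4), np.ones((5, 1))) * 0.8
X = F @ L.T + np.sqrt(1 - 0.64) * rng.standard_normal((540, 20))
print(parallel_analysis(X), "factors retained")   # expect 4
```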

  10. Diffusion theory model for optimization calculations of cold neutron sources

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1987-01-01

    Cold neutron sources are becoming increasingly important and common experimental facilities at many research reactors around the world, due to the high utility of cold neutrons in scattering experiments. The authors describe a simple two-group diffusion model of an infinite-slab LD2 (liquid deuterium) cold source. The simplicity of the model permits an analytical solution, from which one can deduce the reason for the optimum thickness based solely on diffusion-type phenomena. A second, more sophisticated model is also described, and the results are compared to a deterministic transport calculation. The good (particularly qualitative) agreement between the results suggests that diffusion theory methods can be used in parametric and optimization studies, avoiding the generally more expensive transport calculations.
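    The existence of an optimum thickness can be seen in an even cruder zero-dimensional caricature of the diffusion picture: cold-neutron production saturates with moderator thickness while absorption attenuates the emerging flux, and their product peaks at a finite thickness. The cross-sections below are invented for illustration, not evaluated LD2 data:

```python
# Toy optimum-thickness argument: yield ~ production saturation times
# attenuation. Macroscopic cross-sections are hypothetical placeholders.
import numpy as np

sigma_mod, sigma_abs = 0.35, 0.06        # 1/cm, assumed values

x = np.linspace(0.1, 60, 600)            # slab thickness, cm
cold_yield = (1 - np.exp(-sigma_mod * x)) * np.exp(-sigma_abs * x)
print(f"optimum thickness ~ {x[np.argmax(cold_yield)]:.1f} cm")   # ~5.5 cm here
```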

  11. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) …

  12. A queueing theory based model for business continuity in hospitals.

    Science.gov (United States)

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time which includes the working duration and possible delays. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
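    A queueing-theory sizing step of this kind is straightforward to sketch. Treating a pool of interchangeable devices as an M/M/c queue, one can search for the smallest c that keeps the Erlang-C probability of waiting below a target; the arrival and service rates below are assumptions for illustration, not figures from the case study:

```python
# Smallest device count c such that P(wait) stays below a target,
# for Poisson arrivals (lam) and exponential service (mu).
import math

def erlang_c(c: int, a: float) -> float:
    """Erlang-C probability of waiting for an M/M/c queue, a = lam/mu < c."""
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * c / (c - a)
    return top / (s + top)

def min_devices(lam: float, mu: float, p_wait_max: float = 0.05) -> int:
    a = lam / mu
    c = math.ceil(a) + 1          # start above the offered load for stability
    while erlang_c(c, a) > p_wait_max:
        c += 1
    return c

print(min_devices(lam=12.0, mu=3.0))   # devices needed at these assumed rates
```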

  13. Density Functional Theory and Materials Modeling at Atomistic Length Scales

    Directory of Open Access Journals (Sweden)

    Swapan K. Ghosh

    2002-04-01

    Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT) as applied to materials modeling at the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT the central equation is a one-particle Schrödinger-like Kohn-Sham equation, classical DFT consists of Boltzmann-type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to the microscopic modeling of molecular interaction and of classical DFT to the mesoscopic modeling of soft condensed matter systems are highlighted.
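    The common structure of both branches — noninteracting particles in a density-dependent effective potential, solved to self-consistency — fits in a few lines of code. The sketch below is a deliberately crude 1D Kohn-Sham-style loop in which a mock local term g·n(x) stands in for the unknown exchange-correlation potential discussed above; it is illustrative only, not a real functional:

```python
# Bare-bones 1D self-consistency loop: harmonic external well plus a mock
# density-dependent term G*n(x) standing in for the unknown functional.
import numpy as np

N_ELEC, G = 2, 1.0                            # particles, mock coupling
x = np.linspace(-6, 6, 301); h = x[1] - x[0]
v_ext = 0.5 * x**2                            # external potential

# kinetic operator via central finite differences (hbar = m = 1)
T = (-np.diag(np.ones(len(x)-1), 1) - np.diag(np.ones(len(x)-1), -1)
     + 2*np.eye(len(x))) / (2*h**2)

n = np.exp(-x**2); n *= N_ELEC / (n.sum()*h)  # initial density guess
for it in range(100):
    H = T + np.diag(v_ext + G*n)              # effective one-body Hamiltonian
    eps, phi = np.linalg.eigh(H)
    phi /= np.sqrt(h)                         # grid normalization
    n_new = (phi[:, :N_ELEC]**2).sum(axis=1)  # fill the lowest orbitals
    if np.abs(n_new - n).max() < 1e-8:
        break
    n = 0.5*n + 0.5*n_new                     # linear mixing for stability
print(f"converged in {it} iterations, lowest eigenvalue {eps[0]:.4f}")
```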

  14. Criticism of the Classical Theory of Macroeconomic Modeling

    Directory of Open Access Journals (Sweden)

    Konstantin K. Kumehov

    2015-01-01

    Full Text Available Abstract: Current approaches to and methods of modeling macroeconomic systems do not generate research ideas that can be used in applications. This is largely because the dominant economic schools and research directions build their theories on misconceptions about the economic system as an object of modeling, and share no common methodological approach to the design of macroeconomic models. All of them focus on building models aimed at establishing equilibrium parameters of supply and demand, production and consumption, while the resource potential and the needs of society for material and other goods are not considered as underlying factors. In addition, there is no unity in the choice of elements and of the mechanisms of interaction between them: no criteria are established for determining the elements of a model, whether they be institutions, industries, the population, banks, classes, etc. From a methodological point of view, all the best-known authors extrapolate past states or past events into their new models. As a result, by the time a model is ready the situation has changed, and the past parameters underlying the model have lost relevance, so that at best the researcher can only interpret events and parameters that will not recur in the future. In this paper, based on an analysis of the works of famous authors belonging to different schools and directions, the weaknesses of their proposed macroeconomic models are revealed, which prevent their use for solving applied problems of economic development. Fundamentally new approaches and methods are proposed that make it possible to construct macroeconomic models taking into account both the theoretical and the applied aspects of modeling, and the basic methodological requirements for such models are formulated.

  15. Modeling safety instrumented systems with MooN voting architectures addressing system reconfiguration for testing

    International Nuclear Information System (INIS)

    Torres-Echeverria, A.C.; Martorell, S.; Thompson, H.A.

    2011-01-01

    This paper addresses the modeling of the probability of dangerous failure on demand (PFD) and the spurious trip rate (STR) of safety instrumented systems that include MooN voting redundancies in their architecture. MooN systems are a special case of k-out-of-n systems. The first part of the article is devoted to the development of a time-dependent PFD model capable of handling MooN systems. The model explicitly represents common cause failure and diagnostic coverage, as well as different test frequencies and strategies. It quantifies both detected and undetected failures, and puts emphasis on quantifying the contribution of common cause failure to the system PFD as an additional component. In order to accommodate changes in testing strategies, special treatment is devoted to the analysis of system reconfiguration (including common cause failure) during the test of one of its components, which is then included in the model. Another model, for the spurious trip rate, is also analyzed and extended under the same methodology in order to give it similar capabilities. These two models are powerful yet simple enough to be suitable for handling dependability measures in the multi-objective optimization of both system design and test strategies for safety instrumented systems. The level of modeling detail permits compliance with the requirements of the standard IEC 61508. The two models are applied to brief case studies to demonstrate their effectiveness. The results demonstrated that the first model is adequate for quantifying the time-dependent PFD of MooN systems during different system states (i.e. full operation, test and repair) and different MooN configurations, whose values are averaged to obtain the PFDavg. It was also demonstrated that the second model is adequate for quantifying the STR, including spurious trips induced by internal component failure and
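    For orientation, the steady-state low-demand approximation that such time-dependent models refine can be sketched in a few lines: for a MooN group, N−M+1 undetected dangerous channel failures defeat the vote, and a beta-factor term adds the common cause contribution. The rates, proof-test interval, and beta below are assumptions, and this textbook formula is far coarser than the paper's model:

```python
# Simplified PFD_avg for a MooN voted group with identical channels:
# independent term from the binomial combination of channel failures,
# plus a beta-factor common cause term (meaningful only for N > 1).
from math import comb

def pfd_avg_moon(M: int, N: int, l_du: float, tau: float, beta: float = 0.02):
    k = N - M + 1                      # channel failures that defeat the vote
    independent = comb(N, k) * (l_du * tau)**k / (k + 1)
    common_cause = beta * l_du * tau / 2 if N > 1 else 0.0
    return independent + common_cause

for M, N in [(1, 1), (1, 2), (2, 3)]:
    print(f"{M}oo{N}: PFD_avg ~ {pfd_avg_moon(M, N, l_du=2e-6, tau=8760):.2e}")
```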

  16. The Gaussian streaming model and convolution Lagrangian effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Vlah, Zvonimir [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA 94306 (United States); Castorina, Emanuele; White, Martin, E-mail: zvlah@stanford.edu, E-mail: ecastorina@berkeley.edu, E-mail: mwhite@berkeley.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States)

    2016-12-01

    We update the ingredients of the Gaussian streaming model (GSM) for the redshift-space clustering of biased tracers using the techniques of Lagrangian perturbation theory, effective field theory (EFT) and a generalized Lagrangian bias expansion. After relating the GSM to the cumulant expansion, we present new results for the real-space correlation function, mean pairwise velocity and pairwise velocity dispersion, including counterterms from EFT and bias terms through third order in the linear density, its leading derivatives and its shear up to second order. We discuss the connection to the Gaussian peaks formalism. We compare the ingredients of the GSM to a suite of large N-body simulations, and show the performance of the theory on the low-order multipoles of the redshift-space correlation function and power spectrum. We highlight the importance of a general biasing scheme, which we find to be as important as higher-order corrections due to non-linear evolution for the halos we consider on the scales of interest to us.
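    For reference, the GSM referred to here maps the three real-space ingredients just listed into the redshift-space correlation function through a single Gaussian convolution along the line of sight (in the form standard since Reid & White 2011):

    \[
    1 + \xi_s(s_\perp, s_\parallel) = \int \frac{\mathrm{d}y}{\sqrt{2\pi\,\sigma_{12}^2(r,\mu)}}\,
    \bigl[1 + \xi(r)\bigr]\,
    \exp\!\left\{-\frac{\bigl[s_\parallel - y - \mu\,v_{12}(r)\bigr]^2}{2\,\sigma_{12}^2(r,\mu)}\right\},
    \]

    with r^2 = s_\perp^2 + y^2 and \mu = y/r, where \xi(r), v_{12}(r) and \sigma_{12}^2(r,\mu) are the real-space correlation function, mean pairwise infall velocity, and pairwise velocity dispersion whose perturbative predictions the paper supplies.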

  17. Non local theory of excitations applied to the Hubbard model

    International Nuclear Information System (INIS)

    Kakehashi, Y; Nakamura, T; Fulde, P

    2010-01-01

    We propose a nonlocal theory of single-particle excitations. It is based on an off-diagonal effective medium and the projection operator method for treating the retarded Green function. The theory determines the nonlocal effective-medium matrix elements by requiring that they be consistent with those of the self-energy of the Green function. This allows for a description of long-range intersite correlations with high resolution in momentum space. A numerical study of the half-filled Hubbard model on the simple cubic lattice demonstrates that the theory is applicable to the strong-correlation regime as well as the intermediate regime of Coulomb interaction strength. Furthermore, the results show that nonlocal excitations cause sub-bands in the strong Coulomb interaction regime due to strong antiferromagnetic correlations, decrease the quasi-particle peak at the Fermi level with increasing Coulomb interaction, and shift the critical Coulomb interaction U_C2 for the divergence of the effective mass towards higher energies, at least by a factor of two as compared with the single-site approximation.

  18. Multiagent model and mean field theory of complex auction dynamics

    Science.gov (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
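    The auction mechanics are simple to simulate, which helps make the empirical quantity concrete. The toy round below implements only the lowest-unique-bid rule with an arbitrary bid distribution; reproducing the inverted-J shape requires the paper's attractiveness-field bidding rule and mean-field treatment, neither of which is attempted here:

```python
# One toy LUBA round: agents draw integer bids skewed toward low values;
# the winner holds the lowest bid chosen by exactly one agent.
# All parameters are illustrative placeholders.
import numpy as np
rng = np.random.default_rng(1)

def luba_round(n_agents=200, max_bid=500, decay=0.01):
    p = np.exp(-decay * np.arange(1, max_bid + 1))
    bids = rng.choice(np.arange(1, max_bid + 1), size=n_agents, p=p/p.sum())
    counts = np.bincount(bids, minlength=max_bid + 1)
    unique = np.flatnonzero(counts == 1)
    return unique[0] if unique.size else None   # lowest unique bid, if any

winners = [w for w in (luba_round() for _ in range(1000)) if w is not None]
print("mean winning bid:", np.mean(winners))
```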

  19. Multiagent model and mean field theory of complex auction dynamics

    International Nuclear Information System (INIS)

    Chen, Qinghua; Wang, Yougui; Huang, Zi-Gang; Lai, Ying-Cheng

    2015-01-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena. (paper)

  20. Irreducible gauge theory of a consolidated Salam-Weinberg model

    International Nuclear Information System (INIS)

    Ne'eman, Y.

    1979-01-01

    The Salam-Weinberg model is derived by gauging an internal simple supergroup SU(2/1). The theory uniquely assigns the correct SU(2)_L × U(1) eigenvalues for all leptons, fixes θ_W = 30°, generates the W^±_σ, Z^0_σ and A_σ together with the Higgs-Goldstone I_L = 1/2 scalar multiplets as gauge fields, and imposes the standard spontaneous breakdown of SU(2)_L × U(1). The masses of intermediate bosons and fermions are directly generated by SU(2/1) universality, which also fixes the Higgs field coupling. (Auth.)

  1. Ferromagnetism in the Hubbard model: a modified perturbation theory

    International Nuclear Information System (INIS)

    Gangadhar Reddy, G.; Ramakanth, A.; Nolting, W.

    2005-01-01

    We study the possibility of ferromagnetism in the Hubbard model using the modified perturbation theory. In this approach an Ansatz is made for the self-energy of the electron which contains the second-order contribution developed around the Hartree-Fock solution and two parameters. The parameters are fixed by using a moment method. This self-energy satisfies several known exact limiting cases. Using this self-energy, the Curie temperature T_c as a function of band filling n is investigated. It is found that T_c falls off abruptly as n approaches half filling. The results are in qualitative agreement with earlier calculations using other approximation schemes. (author)

  2. Mean-field theory and self-consistent dynamo modeling

    International Nuclear Information System (INIS)

    Yoshizawa, Akira; Yokoi, Nobumitsu

    2001-12-01

    Mean-field theory of the dynamo is discussed, with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are given of the generation of planetary magnetic fields, such as the geomagnetic field and sunspots, and of the driving of flow by magnetic fields in planetary and fusion phenomena. (author)
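    In this line of work the two effects named above enter through the turbulent electromotive force; a form commonly written in the Yoshizawa-Yokoi formulation (quoted here for orientation, since the abstract itself gives no equations) is

    \[
    \langle \mathbf{u}' \times \mathbf{b}' \rangle
    = \alpha \,\langle \mathbf{B} \rangle
    - \beta \,\nabla \times \langle \mathbf{B} \rangle
    + \gamma \,\nabla \times \langle \mathbf{U} \rangle ,
    \]

    where \alpha is tied to the turbulent residual helicity, \beta to the turbulent energy (acting as an effective diffusivity), and \gamma to the turbulent cross-helicity \langle \mathbf{u}' \cdot \mathbf{b}' \rangle — the last term being the distinctive cross-helicity dynamo effect the abstract invokes.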

  3. Morphing the Shell Model into an Effective Theory

    International Nuclear Information System (INIS)

    Haxton, W. C.; Song, C.-L.

    2000-01-01

    We describe a strategy for attacking the canonical nuclear structure problem--bound-state properties of a system of point nucleons interacting via a two-body potential--which involves an expansion in the number of particles scattering at high momenta, but is otherwise exact. The required self-consistent solutions of the Bloch-Horowitz equation for effective interactions and operators are obtained by an efficient Green's function method based on the Lanczos algorithm. We carry out this program for the simplest nuclei, d and ^3He, in order to explore the consequences of reformulating the shell model as a controlled effective theory. (c) 2000 The American Physical Society

  4. Lagrangian model of conformal invariant interacting quantum field theory

    International Nuclear Information System (INIS)

    Lukierski, J.

    1976-01-01

    A Lagrangian model of conformal invariant interacting quantum field theory is presented. The interacting Lagrangian and the free Lagrangian are derived by replacing the canonical field φ by the field operator Φ_d^c and introducing the conformal-invariant interaction Lagrangian. It is suggested that in the conformal-invariant QFT with the dimensionality α_B obtained from the bootstrap equation, the normalization constant c of the propagator and the coupling parameter y do not necessarily need to satisfy the relation x_B = φ²c³

  5. New Trends in Model Coupling Theory, Numerics and Applications

    International Nuclear Information System (INIS)

    Coquel, F.; Godlewski, E.; Herard, J. M.; Segre, J.

    2010-01-01

    This special issue comprises selected papers from the workshop New Trends in Model Coupling, Theory, Numerics and Applications (NTMC'09), which took place in Paris, September 2-4, 2009. The search for optimal technological solutions in a large number of industrial systems requires numerical simulations of complex phenomena which are often characterized by the coupling of models related to various space and/or time scales. Thus, so-called multi-scale modelling has been a thriving scientific activity which connects applied mathematics and other disciplines such as physics, chemistry, biology or even social sciences. To illustrate the variety of fields in which model coupling occurs naturally, we may quote: meteorology, where it is required to take into account several turbulence scales or the interaction between oceans and atmosphere, but also regional models within a global description; solid mechanics, where a thorough understanding of complex phenomena such as the propagation of cracks requires coupling various models from the atomistic level to the macroscopic level; plasma physics for fusion energy, for instance where dense plasmas and collisionless plasmas coexist; multiphase fluid dynamics, when several types of flow corresponding to several types of models are present simultaneously in complex circuits; and social behaviour analysis, with interaction between individual actions and collective behaviour. (authors)

  6. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
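    The sequential-sampling idea being tested is easy to make concrete. The sketch below is a stripped-down MDFT-flavoured accumulator — attention switches stochastically between attributes, contrasted valences feed leaky, laterally inhibiting preference states, and the first option to cross a threshold wins. All matrices and parameters are invented for illustration, not fitted to the studies' choice sets:

```python
# Stripped-down multialternative sequential-sampling simulation.
import numpy as np
rng = np.random.default_rng(2)

M = np.array([[0.9, 0.3],     # option attribute values (3 options x 2 attrs)
              [0.5, 0.7],
              [0.3, 0.9]])
w = np.array([0.55, 0.45])    # attention probabilities over attributes
S = 0.95 * np.eye(3) - 0.02 * (np.ones((3, 3)) - np.eye(3))  # decay+inhibition
C = np.eye(3) - np.ones((3, 3)) / 3                          # contrast matrix

def simulate_choice(threshold=1.0):
    P = np.zeros(3)
    while np.max(P) < threshold:
        attr = rng.choice(2, p=w)                 # momentary attention
        V = C @ M[:, attr] + 0.05 * rng.standard_normal(3)
        P = S @ P + V                             # leaky, inhibiting update
    return int(np.argmax(P))

choices = [simulate_choice() for _ in range(2000)]
print([choices.count(i) / 2000 for i in range(3)])   # simulated choice shares
```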

  7. Corvid re-caching without 'theory of mind': a model.

    Science.gov (United States)

    van der Vaart, Elske; Verbrugge, Rineke; Hemelrijk, Charlotte K

    2012-01-01

    Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds, using a kind of 'virtual bird' whose behavior depends on a set of basic assumptions about corvid cognition and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  8. Corvid re-caching without 'theory of mind': a model.

    Directory of Open Access Journals (Sweden)

    Elske van der Vaart

    Full Text Available Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds, using a kind of 'virtual bird' whose behavior depends on a set of basic assumptions about corvid cognition and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  9. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Directory of Open Access Journals (Sweden)

    Ryo Oizumi

    Full Text Available Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose several equations describing adaptive life strategies under r/K selection, with and without density effects. The equations describe not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies under both selection regimes maximize an identical function, providing both the population growth rate and the carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  10. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Science.gov (United States)

    Oizumi, Ryo; Kuniya, Toshikazu; Enatsu, Yoichi

    2016-01-01

    Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose several equations describing adaptive life strategies under r/K selection, with and without density effects. The equations describe not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies under both selection regimes maximize an identical function, providing both the population growth rate and the carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  11. A model of the demand for Islamic banks debt-based financing instrument

    Science.gov (United States)

    Jusoh, Mansor; Khalid, Norlin

    2013-04-01

    This paper presents a theoretical analysis of the demand for the debt-based financing instruments of Islamic banks. Debt-based financing, such as through bai bithaman ajil and al-murabahah, is by far the most prominent form of Islamic bank financing, and yet it has been largely ignored in the Islamic economics literature. Most studies have instead focused on the equity-based financing of al-mudharabah and al-musyarakah. Islamic banks offer debt-based financing through various instruments derived under the principle of exchange (ukud al-mu'awadhat), or more specifically the contract of deferred sale. Under such an arrangement, Islamic debt is created when goods are purchased and the payments are deferred. Thus, unlike the debt of a conventional bank, which is a form of financial loan contract to facilitate demand for liquid assets, this Islamic debt is created in response to the demand to purchase goods by deferred payment. In this paper we set up an analytical framework based on an infinitely lived representative agent model (ILRA model) to analyze the demand for goods to be purchased by deferred payment. The resulting demand is then used to derive the demand for Islamic debt. We also investigate theoretically the factors that may have an impact on the demand for Islamic debt.

  12. Instrumentation development

    International Nuclear Information System (INIS)

    Ubbes, W.F.; Yow, J.L. Jr.

    1988-01-01

    Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

  13. Modeling the Pineapple Express phenomenon via Multivariate Extreme Value Theory

    Science.gov (United States)

    Weller, G.; Cooley, D. S.

    2011-12-01

    The pineapple express (PE) phenomenon is responsible for producing extreme winter precipitation events in the coastal and mountainous regions of the western United States. Because the PE phenomenon is also associated with warm temperatures, the heavy precipitation and associated snowmelt can cause destructive flooding. In order to study impacts, it is important that regional climate models from NARCCAP are able to reproduce extreme precipitation events produced by PE. We define a daily precipitation quantity which captures the spatial extent and intensity of precipitation events produced by the PE phenomenon. We then use statistical extreme value theory to model the tail dependence of this quantity as seen in an observational data set and each of the six NARCCAP regional models driven by NCEP reanalysis. We find that most NCEP-driven NARCCAP models do exhibit tail dependence between daily model output and observations. Furthermore, we find that not all extreme precipitation events are pineapple express events, as identified by Dettinger et al. (2011). The synoptic-scale atmospheric processes that drive extreme precipitation events produced by PE have only recently begun to be examined. Much of the current work has focused on pattern recognition, rather than quantitative analysis. We use daily mean sea-level pressure (MSLP) fields from NCEP to develop a "pineapple express index" for extreme precipitation, which exhibits tail dependence with our observed precipitation quantity for pineapple express events. We build a statistical model that connects daily precipitation output from the WRFG model, daily MSLP fields from NCEP, and daily observed precipitation in the western US. Finally, we use this model to simulate future observed precipitation based on WRFG output driven by the CCSM model, and our pineapple express index derived from future CCSM output. Our aim is to use this model to develop a better understanding of the frequency and intensity of extreme
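    The tail-dependence check underlying the model-versus-observation comparison can be sketched empirically: rank-transform both series to uniform margins and estimate chi(u) = P(Y > u | X > u) at a high threshold, the quantity whose limit as u → 1 distinguishes tail-dependent pairs. The data below are synthetic stand-ins for the matched model/observation precipitation series:

```python
# Empirical upper-tail dependence estimate on rank-transformed data.
import numpy as np
rng = np.random.default_rng(3)

def chi_hat(x, y, u=0.95):
    rx = (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)   # ranks -> (0, 1)
    ry = (np.argsort(np.argsort(y)) + 1) / (len(y) + 1)
    return np.mean(ry[rx > u] > u)

z = rng.standard_normal(5000)
x = z + 0.3 * rng.standard_normal(5000)    # two noisy copies of a common
y = z + 0.3 * rng.standard_normal(5000)    # signal: strongly dependent pair
print(chi_hat(x, y))                        # well above the threshold level
print(chi_hat(x, rng.standard_normal(5000)))   # independent pair: ~0.05
```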

  14. Perception and Modeling of Affective Qualities of Musical Instrument Sounds across Pitch Registers.

    Science.gov (United States)

    McAdams, Stephen; Douglas, Chelsea; Vempala, Naresh N

    2017-01-01

    Composers often pick specific instruments to convey a given emotional tone in their music, partly due to their expressive possibilities, but also due to their timbres in specific registers and at given dynamic markings. Of interest to both music psychology and music informatics, from a computational point of view, is the relation between the acoustic properties that give rise to the timbre at a given pitch and the perceived emotional quality of the tone. Musician and nonmusician listeners were presented with 137 tones at pitch class D#, produced at a fixed dynamic marking (forte) across each instrument's entire pitch range and with different playing techniques, for standard orchestral instruments drawn from the brass, woodwind, string, and pitched percussion families. They rated each tone on six analogical-categorical scales in terms of emotional valence (positive/negative and pleasant/unpleasant), energy arousal (awake/tired), tension arousal (excited/calm), preference (like/dislike), and familiarity. Linear mixed models revealed interactive effects of musical training, instrument family, and pitch register, with non-linear relations between pitch register and several dependent variables. Twenty-three audio descriptors from the Timbre Toolbox were computed for each sound and analyzed in two ways: linear partial least squares regression (PLSR) and nonlinear artificial neural net modeling. These two analyses converged in terms of the importance of various spectral, temporal, and spectrotemporal audio descriptors in explaining the emotion ratings, but some differences also emerged. Different combinations of audio descriptors make major contributions to the three emotion dimensions, suggesting that they are carried by distinct acoustic properties. Valence is more positive with lower spectral slopes, a greater emergence of strong partials, and an amplitude envelope with a sharper attack and earlier decay. Higher tension arousal is carried by brighter sounds
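    The first of the two analysis paths, PLSR from audio descriptors to mean emotion ratings, can be sketched with standard tooling. The arrays below are random placeholders with the study's dimensions (137 tones, 23 descriptors), and scikit-learn's PLSRegression merely stands in for whatever implementation the authors used:

```python
# PLS regression from audio descriptors to an emotion rating (placeholder data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.standard_normal((137, 23))          # descriptors per tone
w_true = rng.standard_normal(23)            # fake descriptor weights
y = X @ w_true + 2.0 * rng.standard_normal(137)   # e.g. mean valence rating

pls = PLSRegression(n_components=3)         # low-dimensional latent projection
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
```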

  15. Model of the lines of sight for an off-axis optical instrument Pleiades

    Science.gov (United States)

    Sauvage, Dominique; Gaudin-Delrieu, Catherine; Tournier, Thierry

    2017-11-01

    Future Earth observation missions aim at delivering images with high resolution and a large field of view. These images have to be processed to achieve very accurate localisation. To that end, the individual lines of sight of each photosensitive element must be evaluated according to the location of the pixels in the focal plane. With an off-axis Korsch telescope (like PLEIADES), however, the classical model has to be adapted. This is possible by using optical ground measurements made after the integration of the instrument. The processing of these measurements yields several parameters, which are functions of the offsets of the focal plane and the real focal length. This study, which was proposed for the PLEIADES mission, leads to a more elaborate model that provides the relation between the lines of sight and the location of the pixels with very good accuracy, close to the pixel size.

  16. MODELLING THE FUTURE MUSIC TEACHERS’ READINESS TO PERFORMING AND INTERPRETIVE ACTIVITY DURING INSTRUMENTAL TRAINING

    Directory of Open Access Journals (Sweden)

    Chenj Bo

    2016-11-01

    Full Text Available One of the main fields in training future music teachers in the Ukrainian system of higher education is instrumental music training, which develops the skills of performing and interpretive activity. The aim of the article is to design a model of future music teachers' readiness for performing and interpretive activity during instrumental training. The modelling process is based on several interrelated scientific approaches, including the systemic, personality-centered, reflective, competence-based, and activity-creative approaches. In designing the model, its philosophical, informative, interactive, hedonistic, and creative functions are taken into account. Important theoretical and methodological factors are the principles of music pedagogical education: cultural correspondence and reflection; the unity of the emotional and the conscious, and of the artistic and the technical, in music education; purposeful interrelation and artistic-pedagogical communication between teachers and students; and the intensification of musical creative activity. The phenomenon under study comprises four components: motivation-oriented, cognitive-evaluating, performance-independent, and creative-productive. For each component, relevant criteria and indicators are identified. The stages of developing future music teachers' readiness for performing and interpretive activity are highlighted: an information-searching stage, which relies on complex diagnostic methods (surveys, questionnaires, testing); a regulative-performing stage, characterized by the students' immersion in music performing and interpretive activities; an operational-reflective stage, which activates mechanisms of self-knowledge and self-realization and forms the skills of independent, artistically expressive interpretation of music of various genres and styles; and a projective and

  17. Tissue Acoustoelectric Effect Modeling From Solid Mechanics Theory.

    Science.gov (United States)

    Song, Xizi; Qin, Yexian; Xu, Yanbin; Ingram, Pier; Witte, Russell S; Dong, Feng

    2017-10-01

    The acoustoelectric (AE) effect is a basic physical phenomenon underlying the changes produced in the conductivity of a medium by the application of focused ultrasound. Recently, several biomedical imaging techniques based on the AE effect have been widely studied, such as ultrasound-modulated electrical impedance tomography and ultrasound current source density imaging. To further investigate the mechanism of the AE effect in tissue and to provide guidance for such techniques, we have modeled the tissue AE effect using the theory of solid mechanics. Both bulk compression and thermal expansion of tissue are considered and discussed. Computational simulation shows that the AE effect in muscle yields a conductivity change rate of 3.26×10⁻³ at 4.3 MPa peak pressure, consistent with the theoretical value. Bulk compression plays the main role in the muscle AE effect, while thermal expansion contributes almost nothing to it. In addition, the AE signals of porcine muscle were measured at different focal positions. Having the same order of magnitude and the same trend, the experimental results confirm that the simulation results are valid. Both simulation and experimental results validate that modeling the tissue AE effect using solid mechanics theory is feasible, which is of significance for the further development of related biomedical imaging techniques.
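    To first order, the interaction quoted here is commonly written as a linear pressure modulation of conductivity,

    \[
    \frac{\Delta\sigma}{\sigma_0} = -K_I\,\Delta P ,
    \]

    where K_I is the acoustoelectric interaction constant. Taking the paper's figures at face value, a change rate of 3.26×10⁻³ at 4.3 MPa corresponds to |K_I| ≈ 7.6×10⁻¹⁰ Pa⁻¹, the same order as the ~10⁻⁹ Pa⁻¹ typically reported for physiological saline — a consistency check inferred here, not a result stated in the abstract.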

  18. An Evolutionary Game Theory Model of Spontaneous Brain Functioning.

    Science.gov (United States)

    Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano

    2017-11-22

    Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
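
    The core mechanism described here, regions updating their level of activation by weighing neighbors' payoffs, can be illustrated with a networked two-strategy replicator equation. This is a generic sketch of that family of dynamics, not the paper's EGN equations; the adjacency matrix, payoff matrix, and initial activations are assumptions:

        import numpy as np

        # Networked replicator dynamics: x[v] is the share of strategy 0 at
        # node v; each node responds to the mean strategy of its neighbors.
        A = np.array([[0., 1., 1.],
                      [1., 0., 1.],
                      [1., 1., 0.]])        # 3 coupled "regions" (assumed)
        G = np.array([[1.0, 0.2],
                      [0.8, 0.5]])          # payoff matrix (assumed)
        x = np.array([0.6, 0.3, 0.5])       # initial mixed strategies
        dt = 0.01
        for _ in range(2000):
            nbr = A @ x / A.sum(axis=1)     # neighbors' mean strategy
            pay0 = G[0, 0] * nbr + G[0, 1] * (1 - nbr)   # payoff, strategy 0
            pay1 = G[1, 0] * nbr + G[1, 1] * (1 - nbr)   # payoff, strategy 1
            x += dt * x * (1 - x) * (pay0 - pay1)        # replicator update
        print(x)   # slow drift toward an equilibrium activation profile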

  19. Advancing Models and Theories for Digital Behavior Change Interventions.

    Science.gov (United States)

    Hekler, Eric B; Michie, Susan; Pavel, Misha; Rivera, Daniel E; Collins, Linda M; Jimison, Holly B; Garnett, Claire; Parral, Skye; Spruijt-Metz, Donna

    2016-11-01

    To be suitable for informing digital behavior change interventions, theories and models of behavior change need to capture individual variation and changes over time. The aim of this paper is to provide recommendations for development of models and theories that are informed by, and can inform, digital behavior change interventions based on discussions by international experts, including behavioral, computer, and health scientists and engineers. The proposed framework stipulates the use of a state-space representation to define when, where, for whom, and in what state for that person, an intervention will produce a targeted effect. The "state" is that of the individual based on multiple variables that define the "space" when a mechanism of action may produce the effect. A state-space representation can be used to help guide theorizing and identify crossdisciplinary methodologic strategies for improving measurement, experimental design, and analysis that can feasibly match the complexity of real-world behavior change via digital behavior change interventions. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
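
    As a concrete (and deliberately simplified) reading of the state-space recommendation, one can write the person as a linear dynamical system and gate the intervention on the current state. Everything below, the matrices, the state labels, and the receptivity threshold, is an illustrative assumption, not a validated behavioral model:

        import numpy as np

        # State-space sketch: state x evolves in time; a just-in-time prompt
        # u is delivered only when the state indicates it can have an effect.
        A = np.array([[0.9, 0.1],
                      [0.0, 0.8]])          # intrinsic state dynamics (assumed)
        B = np.array([[0.0],
                      [0.5]])               # effect of a prompt on activity
        x = np.array([[0.2],
                      [0.1]])               # state = [motivation, activity]
        for t in range(50):
            u = 1.0 if x[0, 0] < 0.5 else 0.0   # prompt only when receptive
            x = A @ x + B * u
        print(x.ravel())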

  20. Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.

    Science.gov (United States)

    Merrick, Jason R W; Leclerc, Philip

    2016-04-01

    Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make unaided decisions by maximizing expected utility, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm for seeking the rational decision, but we solve for the attacker's decision with prospect theory, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision: whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
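
    The two deviations named above map onto the standard Tversky-Kahneman functional forms: a loss-averse value function and an inverse-S probability weighting function that is insensitive in the midrange. A sketch with textbook parameter values (α = β = 0.88, λ = 2.25, γ = 0.61); the attack options and payoffs are invented for illustration, not taken from the paper:

        # Prospect-theory evaluation of attacker options (illustrative only).
        def value(x, alpha=0.88, beta=0.88, lam=2.25):
            # concave for gains, convex and loss-averse for losses
            return x**alpha if x >= 0 else -lam * (-x)**beta

        def weight(p, gamma=0.61):
            # inverse-S probability weighting: likelihood insensitivity
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        # option: (probability of success, gain if success, loss if caught)
        options = {"smuggle": (0.3, 100.0, -50.0), "wait": (1.0, 0.0, 0.0)}
        for name, (p, gain, loss) in options.items():
            pv = weight(p) * value(gain) + weight(1 - p) * value(loss)
            print(name, round(pv, 2))   # attacker picks the highest pv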

  1. SIMP model at NNLO in chiral perturbation theory

    Science.gov (United States)

    Hansen, Martin; Langæble, Kasper; Sannino, Francesco

    2015-10-01

    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 → 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles. By performing a consistent next-to-leading- and next-to-next-to-leading-order chiral perturbative investigation we demonstrate that the leading-order analysis cannot be used to draw conclusions about the viability of the model. We further show that higher-order corrections substantially increase the tension with phenomenological constraints, challenging the viability of the simplest realization of the strongly interacting massive particle paradigm.

  2. Signal classification using global dynamical models, Part I: Theory

    International Nuclear Information System (INIS)

    Kadtke, J.; Kremliovsky, M.

    1996-01-01

    Detection and classification of signals is one of the principal areas of signal processing, and the utilization of nonlinear information has long been considered as a way of improving performance beyond standard linear (e.g. spectral) techniques. Here, we develop a method for using global models of chaotic dynamical systems theory to define a signal classification processing chain, which is sensitive to nonlinear correlations in the data. We use it to demonstrate classification in high noise regimes (negative SNR), and argue that classification probabilities can be directly computed from ensemble statistics in the model coefficient space. We also develop a modification for non-stationary signals (i.e. transients) using non-autonomous ODEs. In Part II of this paper, we demonstrate the analysis on actual open ocean acoustic data from marine biologics. copyright 1996 American Institute of Physics
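
    The processing chain sketched here, fit a global dynamical model to the signal and classify in the space of model coefficients, can be illustrated in a few lines. This toy version fits a scalar polynomial vector field by least squares on finite-difference derivatives; the real method works in an embedding of the measured dynamics, and the test signal and model order below are assumptions:

        import numpy as np

        # Fit dx/dt = c0 + c1*x + c2*x^2 + c3*x^3 to a noisy signal, then use
        # the coefficient vector c as a classification feature.
        t = np.linspace(0, 10, 1000)
        x = np.sin(t) + 0.05 * np.random.randn(t.size)   # noisy test signal
        dxdt = np.gradient(x, t)                          # numerical derivative
        basis = np.vstack([np.ones_like(x), x, x**2, x**3]).T
        coeffs, *_ = np.linalg.lstsq(basis, dxdt, rcond=None)
        print(coeffs)   # feature vector living in model-coefficient space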

  3. Local models of Gauge Mediated Supersymmetry Breaking in String Theory

    CERN Document Server

    Garcia-Etxebarria, Inaki; Saad, Fouad; Uranga, Angel M.

    2006-01-01

    We describe local Calabi-Yau geometries with two isolated singularities at which systems of D3- and D7-branes are located, leading to chiral sectors corresponding to a semi-realistic visible sector and a hidden sector with dynamical supersymmetry breaking. We provide explicit models with a 3-family MSSM-like visible sector, and a hidden sector breaking supersymmetry at a meta-stable minimum. For singularities separated by a distance smaller than the string scale, this construction leads to a simple realization of gauge mediated supersymmetry breaking in string theory. The models are simple enough to allow the explicit computation of the massive messenger sector, using dimer techniques for branes at singularities. The local character of the configurations makes manifest the UV insensitivity of the supersymmetry breaking mediation.

  4. H₃⁺ WZNW model from Liouville field theory

    International Nuclear Information System (INIS)

    Hikida, Y.; Schomerus, V.

    2007-06-01

    There exists an intriguing relation between genus zero correlation functions in the H₃⁺ WZNW model and in Liouville field theory. This was found by Ribault and Teschner based in part on earlier ideas by Stoyanovsky. We provide a path integral derivation of the correspondence and then use our new approach to generalize the relation to surfaces of arbitrary genus g. In particular we determine the correlation functions of N primary fields in the WZNW model explicitly through Liouville correlators with N+2g-2 additional insertions of certain degenerate fields. The paper concludes with a list of interesting further extensions and a few comments on the relation to the geometric Langlands program. (orig.)

  5. Describing a Strongly Correlated Model System with Density Functional Theory.

    Science.gov (United States)

    Kong, Jing; Proynov, Emil; Yu, Jianguo; Pachter, Ruth

    2017-07-06

    The linear chain of hydrogen atoms, a basic prototype for the transition from a metal to Mott insulator, is studied with a recent density functional theory model functional for nondynamic and strong correlation. The computed cohesive energy curve for the transition agrees well with accurate literature results. The variation of the electronic structure in this transition is characterized with a density functional descriptor that yields the atomic population of effectively localized electrons. These new methods are also applied to the study of the Peierls dimerization of the stretched even-spaced Mott insulator to a chain of H₂ molecules, a different insulator. The transitions among the two insulating states and the metallic state of the hydrogen chain system are depicted in a semiquantitative phase diagram. Overall, we demonstrate the capability of studying strongly correlated materials with a mean-field model at the fundamental level, in contrast to the general pessimistic view on such a feasibility.

  6. Lorentz-violating theories in the standard model extension

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira Junior, Manoel Messias [Universidade Federal do Maranhao (UFMA), Sao Luis, MA (Brazil)

    2012-07-01

    Full text: Lorentz-violating theories have been an issue of sustained interest in recent years. Many of these investigations are developed within the theoretical framework of the Standard Model Extension (SME), a broad extension of the minimal Standard Model embracing Lorentz-violating (LV) terms, generated as vacuum expectation values of tensor quantities, in all sectors of interaction. In this talk, we comment on some general properties of the SME, concerning mainly the gauge and fermion sectors, focusing on new phenomena induced by Lorentz violation. The LV terms are usually separated in accordance with their behavior under discrete symmetries, being classified as CPT-odd or CPT-even, parity-even or parity-odd. We follow this classification scheme, discussing some features and new properties of the CPT-even and CPT-odd parts of the gauge and fermion sectors. We conclude by presenting some upper bounds imposed on the corresponding LV coefficients. (author)

  7. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    May R. D.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to a model of the Sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.
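
    The polytrope machinery referred to here starts from the Lane-Emden equation θ'' + (2/ξ)θ' + θⁿ = 0 with θ(0) = 1, θ'(0) = 0. A minimal standard-gravity integrator as a sketch (the paper's dynamical 3-space corrections are not included; the index n = 3 and step size are illustrative choices):

        import numpy as np

        # Integrate the Lane-Emden equation with RK4 out to the first zero of
        # theta, which sets the dimensionless stellar radius xi_1.
        def lane_emden(n=3.0, h=1e-3):
            xi = h
            theta, dtheta = 1.0 - h**2 / 6.0, -h / 3.0   # series expansion start
            def f(xi, y):
                th, dth = y
                return np.array([dth, -max(th, 0.0)**n - 2.0 * dth / xi])
            y = np.array([theta, dtheta])
            while y[0] > 0:
                k1 = f(xi, y)
                k2 = f(xi + h / 2, y + h / 2 * k1)
                k3 = f(xi + h / 2, y + h / 2 * k2)
                k4 = f(xi + h, y + h * k3)
                y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
                xi += h
            return xi

        print(lane_emden(3.0))   # ~6.897 for the n = 3 polytrope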

  8. HRM Model in Tourism, Based on Dialectical Systems Theory

    Directory of Open Access Journals (Sweden)

    Simona Šarotar Žižek

    2015-12-01

    Full Text Available A human resources management (HRM) model integrating trends in HRM with trends in tourism into a dialectical system by means of the Dialectical Systems Theory (DST). The HRM strategy, integrated within the tourism organization's (TO's) strategy, is implemented through functional strategies helping their users to achieve a requisitely holistic (RH) HRM strategy replacing the prevailing one-sided ones. The TO's strategy covers employees' (1) planning, (2) acquisition and selection, (3) development and training, (4) diversity management, (5) teamwork and creativity, (6) motivation and rewarding, (7) stress reduction and health, (8) relationships, (9) personal holism, (10) well-being, and (11) work and results assessment. Everyone matters; their synergy is crucial. The result is an innovated HRM model for TOs, which applies employees' and organizations' requisite holism and integrates new knowledge about HRM. HRM belongs among central managers' tools; their HRM must be adapted for TOs, where employees are crucial.

  9. Algebraic structure of cohomological field theory models and equivariant cohomology

    International Nuclear Information System (INIS)

    Stora, R.; Thuillier, F.; Wallet, J.Ch.

    1994-01-01

    The definition of observables within conventional gauge theories is settled by general consensus. Within cohomological theories considered as gauge theories of an exotic type, that question has a much less obvious answer. It is shown here that in most cases these theories are best defined in terms of equivariant cohomologies both at the field level and at the level of observables. (author). 21 refs

  10. An Education for Peace Model That Centres on Belief Systems: The Theory behind The Model

    Science.gov (United States)

    Willis, Alison

    2017-01-01

    The education for peace model (EFPM) presented in this paper was developed within a theoretical framework of complexity science and critical theory and was derived from a review of an empirical research project conducted in a conflict-affected environment. The model positions belief systems at the centre and is socioecologically systemic in design…

  11. How to use the Standard Model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Brian; Lu, Xiaochuan [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Murayama, Hitoshi [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Kavli Institute for the Physics and Mathematics of the Universe (WPI),Todai Institutes for Advanced Study, University of Tokyo,Kashiwa 277-8583 (Japan)

    2016-01-05

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.

  12. Theory and modeling of cylindrical thermo-acoustic transduction

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Lihong, E-mail: lhtong@ecjtu.edu.cn [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China); Lim, C.W. [Department of Architecture and Civil Engineering, City University of Hong Kong, Kowloon, Hong Kong SAR (China); Zhao, Xiushao; Geng, Daxing [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China)

    2016-06-03

    Models for both solid and thin-film-on-solid cylindrical thermo-acoustic transduction are proposed and the corresponding acoustic pressure solutions are obtained. The acoustic pressure for an individual carbon nanotube (CNT) as a function of input power is investigated analytically and verified by comparison with published experimental data. Further numerical analysis of the acoustic pressure response and its characteristics for varying input frequency and distance is carried out for both solid and thin-film-on-solid cylindrical transduction. Through detailed theoretical and numerical studies of the acoustic pressure solution for thin-film-on-solid cylindrical transduction, it is concluded that a solid with smaller thermal conductivity favors improved acoustic performance. In general, the proposed models are applicable to a variety of cylindrical thermo-acoustic devices operating in different gaseous media. - Highlights: • Theory and modeling for both solid and thin-film-on-solid cylindrical thermo-acoustic transduction are proposed. • The modeling is verified by comparison with published experimental data. • Acoustic response characteristics of cylindrical thermo-acoustic transduction are predicted by the proposed model.

  13. A Systems Model of Parkinson's Disease Using Biochemical Systems Theory.

    Science.gov (United States)

    Sasidharakurup, Hemalatha; Melethadathil, Nidheesh; Nair, Bipin; Diwakar, Shyam

    2017-08-01

    Parkinson's disease (PD), a neurodegenerative disorder, affects millions of people and has gained attention because of its clinical roles in behaviors related to motor and nonmotor symptoms. Although studies on PD from various aspects are becoming popular, few rely on predictive systems modeling approaches. Using Biochemical Systems Theory (BST), this article attempts to model and characterize dopaminergic cell death and to understand the pathophysiology of PD progression. PD pathways were modeled using stochastic differential equations incorporating the law of mass action, and initial concentrations for the modeled proteins were obtained from the literature. Simulations suggest that dopamine levels are reduced significantly during PD progression due to an increase in dopaminergic quinones and 3,4-dihydroxyphenylacetaldehyde (DOPAL) relative to control. Consistent with clinically observed PD-related cell death, simulations show abnormal parkin and reactive oxygen species levels with an increase in neurofibrillary tangles. By relating molecular mechanistic roles, BST modeling helps predict the dopaminergic cell death processes involved in the progression of PD and provides a predictive understanding of neuronal dysfunction for translational neuroscience.
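
    The modeling style named here, rate laws from mass action integrated forward in time, is easy to make concrete. A toy two-species sketch of one arm of the pathology (dopamine oxidizing to reactive quinones); the species set, rate constants, and initial levels are hypothetical placeholders, not the paper's values:

        # Mass-action sketch: dopamine (DA) is synthesized, oxidizes to
        # quinones (DAQ), and DAQ is cleared; simple forward-Euler integration.
        k_syn, k_ox, k_clear = 1.0, 0.3, 0.1   # synthesis, oxidation, clearance
        DA, DAQ = 5.0, 0.0
        dt = 0.01
        for _ in range(5000):
            dDA = k_syn - k_ox * DA            # production minus oxidation
            dDAQ = k_ox * DA - k_clear * DAQ   # quinone build-up and clearance
            DA, DAQ = DA + dt * dDA, DAQ + dt * dDAQ
        print(DA, DAQ)   # steady state: DA -> k_syn/k_ox, DAQ -> k_syn/k_clear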

  14. Multiscale Multiphysics and Multidomain Models I: Basic Theory.

    Science.gov (United States)

    Wei, Guo-Wei

    2013-12-01

    This work extends our earlier two-domain formulation of a differential geometry based multiscale paradigm into a multidomain theory, which endows us with the ability to simultaneously accommodate multiphysical descriptions of aqueous chemical, physical and biological systems, such as fuel cells, solar cells, nanofluidics, ion channels, viruses, RNA polymerases, molecular motors and large macromolecular complexes. The essential idea is to make use of the differential geometry theory of surfaces as a natural means to geometrically separate the macroscopic domain of solvent from the microscopic domain of solute, and to dynamically couple continuum and discrete descriptions. Our main strategy is to construct energy functionals that put multiple physical descriptions on an equal footing, including polar (i.e., electrostatic) solvation, nonpolar solvation, chemical potential, quantum mechanics, fluid mechanics, molecular mechanics, coarse-grained dynamics and elastic dynamics. The variational principle is applied to the energy functionals to derive the desired governing equations, such as multidomain Laplace-Beltrami (LB) equations for macromolecular morphologies, the multidomain Poisson-Boltzmann (PB) equation or Poisson equation for the electrostatic potential, generalized Nernst-Planck (NP) equations for the dynamics of charged solvent species, the generalized Navier-Stokes (NS) equation for fluid dynamics, generalized Newton's equations for molecular dynamics (MD) or coarse-grained dynamics, and an equation of motion for elastic dynamics. Unlike the classical PB equation, our PB equation is an integral-differential equation due to solvent-solute interactions. To illustrate the proposed formalism, we have explicitly constructed three models: a multidomain solvation model, a multidomain charge transport model and a multidomain chemo-electro-fluid-MD-elastic model. Each solute domain is equipped with distinct surface tension, pressure, dielectric function, and charge density distribution. In addition to long …
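
    Of the governing equations listed, the electrostatic piece is the simplest to show in isolation. A sketch of the classical linearized PB equation φ'' = κ²φ in 1D between two charged walls, solved by finite differences (geometry, κ, and boundary values are illustrative; the paper's PB equation is the full multidomain, integral-differential version):

        import numpy as np

        # Linearized 1D Poisson-Boltzmann: phi'' = kappa^2 * phi on [0, L]
        # with phi = 1 at both walls, assembled as a tridiagonal system.
        N, L, kappa = 200, 10.0, 1.0
        h = L / (N - 1)
        main = np.full(N, -2.0 / h**2 - kappa**2)
        off = np.full(N - 1, 1.0 / h**2)
        Amat = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
        b = np.zeros(N)
        Amat[0, :], Amat[-1, :] = 0.0, 0.0     # Dirichlet boundary rows
        Amat[0, 0] = Amat[-1, -1] = 1.0
        b[0] = b[-1] = 1.0
        phi = np.linalg.solve(Amat, b)
        print(phi[N // 2])   # screened midplane potential, ~1/cosh(kappa*L/2)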

  15. Models and theories of prescribing decisions: A review and suggested a new model.

    Science.gov (United States)

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the review identifies and uses several valuable perspectives, such as the 'persuasion theory - elaboration likelihood model', the 'stimuli-response marketing model', the 'agency theory', the 'theory of planned behaviour', and the 'social power theory', in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  16. MODELING THERMAL DUST EMISSION WITH TWO COMPONENTS: APPLICATION TO THE PLANCK HIGH FREQUENCY INSTRUMENT MAPS

    International Nuclear Information System (INIS)

    Meisner, Aaron M.; Finkbeiner, Douglas P.

    2015-01-01

    We apply the Finkbeiner et al. two-component thermal dust emission model to the Planck High Frequency Instrument maps. This parameterization of the far-infrared dust spectrum as the sum of two modified blackbodies (MBBs) serves as an important alternative to the commonly adopted single-MBB dust emission model. Analyzing the joint Planck/DIRBE dust spectrum, we show that two-component models provide a better fit to the 100-3000 GHz emission than do single-MBB models, though by a lesser margin than found by Finkbeiner et al. based on FIRAS and DIRBE. We also derive full-sky 6.1′ resolution maps of dust optical depth and temperature by fitting the two-component model to Planck 217-857 GHz along with DIRBE/IRAS 100 μm data. Because our two-component model matches the dust spectrum near its peak, accounts for the spectrum's flattening at millimeter wavelengths, and specifies dust temperature at 6.1′ FWHM, our model provides reliable, high-resolution thermal dust emission foreground predictions from 100 to 3000 GHz. We find that, in diffuse sky regions, our two-component 100-217 GHz predictions are on average accurate to within 2.2%, while extrapolating the Planck Collaboration et al. single-MBB model systematically underpredicts emission by 18.8% at 100 GHz, 12.6% at 143 GHz, and 7.9% at 217 GHz. We calibrate our two-component optical depth to reddening, and compare with reddening estimates based on stellar spectra. We find the dominant systematic problems in our temperature/reddening maps to be zodiacal light on large angular scales and the cosmic infrared background anisotropy on small angular scales.
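
    The functional form being fitted is a sum of two modified blackbodies, I_ν ∝ τ [f (ν/ν₀)^β₁ B_ν(T₁) + (1−f) (ν/ν₀)^β₂ B_ν(T₂)]. A sketch of evaluating such an SED at the bands named above; the amplitudes, emissivity indices, and temperatures below are stand-in values, not the fitted Planck/DIRBE parameters:

        import numpy as np

        h, k, c = 6.626e-34, 1.381e-23, 3.0e8   # SI constants

        def planck(nu, T):
            # Planck function B_nu(T) in SI units
            return 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))

        def two_mbb(nu, tau=1e-5, f1=0.04, beta1=1.7, T1=9.5,
                    beta2=2.7, T2=16.0, nu0=3.0e12):
            # cold + warm modified blackbodies (all parameters assumed)
            cold = f1 * (nu / nu0)**beta1 * planck(nu, T1)
            warm = (1 - f1) * (nu / nu0)**beta2 * planck(nu, T2)
            return tau * (cold + warm)

        nu = np.array([100e9, 217e9, 857e9, 3000e9])   # sample bands [Hz]
        print(two_mbb(nu))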

  17. Effective theory analysis for vector-like quark model

    Science.gov (United States)

    Morozumi, Takuya; Shimizu, Yusuke; Takahashi, Shunya; Umeeda, Hiroyuki

    2018-04-01

    We study a model with a down-type SU(2) singlet vector-like quark (VLQ) as a minimal extension of the standard model (SM). In this model, flavor-changing neutral currents (FCNCs) arise at tree level and the unitarity of the 3×3 Cabibbo-Kobayashi-Maskawa (CKM) matrix does not hold. In this paper, we constrain the FCNC coupling from b → s transitions, especially the B_s → μ⁺μ⁻ and B̄ → X_s γ processes. In order to analyze these processes we derive an effective Lagrangian that is valid below the electroweak symmetry breaking scale. For this purpose, we first integrate out the VLQ field and derive an effective theory by matching Wilson coefficients up to one-loop level. Using the effective theory, we construct the effective Lagrangian for b → s γ^(*). It includes the effects of the SM quarks and the violation of CKM unitarity. We show the constraints on the magnitude of the FCNC coupling and its phase by taking account of the current experimental data on ΔM_{B_s}, Br[B_s → μ⁺μ⁻], Br[B̄ → X_s γ], and the CKM matrix elements, as well as theoretical uncertainties. We find that the constraint from Br[B_s → μ⁺μ⁻] is more stringent than that from Br[B̄ → X_s γ]. We also obtain a bound on the mass of the VLQ and the strength of the Yukawa couplings related to the FCNC coupling of the b → s transition. Using CKM elements that satisfy the above constraints, we show how unitarity is violated in the complex plane.

  18. A Conceptual Model of Interpersonal Attraction (Centers' Instrumental Theory) Useful in Marriage and Family Counseling.

    Science.gov (United States)

    Kilgo, Reese D.

    Based upon Maslow's hierarchy of human needs, interpersonal attraction (any personal relationship characterized by love and affection; husband-wife, parent-child, friendship) can be seen as the mutual meeting of emotional needs, especially at the fourth level (love needs) and the fifth level (esteem needs). These levels are differentiated into 10…

  19. Use of a life-size three-dimensional-printed spine model for pedicle screw instrumentation training.

    Science.gov (United States)

    Park, Hyun Jin; Wang, Chenyu; Choi, Kyung Ho; Kim, Hyong Nyun

    2018-04-16

    Training beginners in the pedicle screw instrumentation technique in the operating room is limited by issues of patient safety and surgical efficiency. Three-dimensional (3D) printing enables training or simulation surgery on a real-size replica of a deformed spine, which is difficult with the usual cadaver or surrogate plastic models. The purpose of this study was to evaluate the educational effect of using a real-size 3D-printed spine model for training beginners in the free-hand pedicle screw instrumentation technique. We asked whether the use of a 3D spine model can improve (1) screw instrumentation accuracy and (2) length of procedure. Twenty life-size 3D-printed lumbar spine models were made from 10 volunteers (two models for each volunteer). Two novice surgeons who had no experience with the free-hand pedicle screw instrumentation technique were instructed by an experienced surgeon, and each inserted 10 pedicle screws per lumbar spine model. Computed tomography scans of the spine models were obtained to evaluate screw instrumentation accuracy, and the time taken to complete the procedure was recorded. The results of the latter 10 spine models were compared with those of the former 10 models to evaluate the learning effect. A total of 37/200 screws (18.5%) perforated the pedicle cortex, with a mean perforation of 1.7 mm (range, 1.2-3.3 mm). However, the latter half of the models had significantly fewer violations than the former half (10/100 vs. 27/100, p < 0.05). A life-size 3D-printed spine model can be an excellent tool for training beginners in the free-hand pedicle screw instrumentation technique.

  20. Lorentz Violation of the Photon Sector in Field Theory Models

    Directory of Open Access Journals (Sweden)

    Lingli Zhou

    2014-01-01

    Full Text Available We compare the Lorentz violation terms of the pure photon sector between two field theory models, namely, the minimal standard model extension (SME) and the standard model supplement (SMS). From the requirement of the identity of the intersection for the two models, we find that the free photon sector of the SMS can be a subset of the photon sector of the minimal SME. We not only obtain some relations between the SME parameters but also get some constraints on the SMS parameters from the SME parameters. The CPT-odd coefficients (k_AF)^α of the SME are predicted to be zero. There are 15 degrees of freedom in the Lorentz violation matrix Δ^{αβ} of free photons of the SMS, related to the same number of degrees of freedom in the tensor coefficients (k_F)^{αβμν}, which are independent from each other in the minimal SME but are interrelated in the intersection of the SMS and the minimal SME. With the related degrees of freedom, we obtain conservative constraints (2σ) on the elements of the photon Lorentz violation matrix. The detailed structure of the photon Lorentz violation matrix suggests some applications to Lorentz violation experiments for photons.