WorldWideScience

Sample records for modeling evidence theory

  1. Compositional models and conditional independence in evidence theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Vejnarová, Jiřina

    2011-01-01

    Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords : Evidence theory * Conditional independence * multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf

  2. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
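
As a rough illustration of the kind of sampling-based propagation described above (not the paper's actual algorithm), the sketch below pushes a Dempster-Shafer structure on two inputs through a cheap stand-in model: it samples within each joint focal element to bound the response, then accumulates belief and plausibility for an output event. The model function, intervals, masses and threshold are all hypothetical.

```python
import itertools
import numpy as np

def model(x1, x2):
    # Hypothetical cheap stand-in for an expensive simulation code.
    return x1 ** 2 + np.sin(x2)

# Dempster-Shafer structures for two epistemically uncertain inputs:
# each focal element is an interval carrying a basic probability assignment.
x1_focal = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
x2_focal = [((-1.0, 0.0), 0.3), ((0.0, 1.5), 0.7)]

rng = np.random.default_rng(0)
threshold = 2.0     # output event of interest: y <= threshold
n_samples = 200     # samples used to bound y on each joint focal element

belief = plausibility = 0.0
for ((a1, b1), m1), ((a2, b2), m2) in itertools.product(x1_focal, x2_focal):
    xs1 = rng.uniform(a1, b1, n_samples)
    xs2 = rng.uniform(a2, b2, n_samples)
    y = model(xs1, xs2)
    mass = m1 * m2
    if y.max() <= threshold:   # focal element lies entirely inside the event
        belief += mass
    if y.min() <= threshold:   # focal element at least intersects the event
        plausibility += mass

print(f"Bel(y <= {threshold}) = {belief:.2f}, Pl(y <= {threshold}) = {plausibility:.2f}")
```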

  3. Structural reliability analysis under evidence theory using the active learning kriging model

    Science.gov (United States)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.

  4. Critical evidence for the prediction error theory in associative learning.

    Science.gov (United States)

    Terao, Kanta; Matsumoto, Yukihisa; Mizunami, Makoto

    2015-03-10

In associative learning in mammals, it is widely accepted that the discrepancy, or error, between actual and predicted reward determines whether learning occurs. Complete evidence for the prediction error theory, however, has not been obtained in any learning system: prediction error theory stems from the finding of the blocking phenomenon, but blocking can also be accounted for by other theories, such as the attentional theory. We demonstrated blocking in classical conditioning in crickets and obtained evidence to reject the attentional theory. To obtain further evidence supporting the prediction error theory and rejecting alternative theories, we constructed a neural model to match the prediction error theory, by modifying our previous model of learning in crickets, and we tested a prediction from the model: the model predicts that pharmacological intervention in octopaminergic transmission during appetitive conditioning impairs learning but not the formation of the reward prediction itself, and it thus predicts no learning in subsequent training. We observed such "auto-blocking", which could be accounted for by the prediction error theory but not by other competing theories of blocking. This study unambiguously demonstrates the validity of the prediction error theory in associative learning.
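
The prediction error idea the abstract builds on is usually formalised with the Rescorla-Wagner update, in which associative strength changes in proportion to the difference between received and predicted reward. The sketch below (a generic textbook formulation, not the authors' cricket model; parameters are hypothetical) reproduces the blocking effect that motivates the theory.

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Rescorla-Wagner learning: associative strength changes in proportion to
    the prediction error (obtained reward minus the summed prediction of all
    cues present on the trial)."""
    V = {}
    for cues, rewarded in trials:
        prediction = sum(V.get(c, 0.0) for c in cues)
        error = (lam if rewarded else 0.0) - prediction
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Hypothetical blocking schedule: cue A is trained alone, then A and X together.
pretraining = [({"A"}, True)] * 20
compound = [({"A", "X"}, True)] * 20
print(rescorla_wagner(pretraining + compound))  # X gains little strength: blocking
```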

  5. GARCH Option Valuation: Theory and Evidence

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Jacobs, Kris; Ornthanalai, Chayawat

We survey the theory and empirical evidence on GARCH option valuation models. Our treatment includes the range of functional forms available for the volatility dynamic, multifactor models, and nonnormal shock distributions, as well as the styles of pricing kernels typically used. Various strategies for empirical implementation are laid out, and we also discuss the links between GARCH and stochastic volatility models. In the appendix we provide Matlab computer code for option pricing via Monte Carlo simulation for nonaffine models as well as Fourier inversion for affine models.
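
The appendix code mentioned in the record is in Matlab; as a rough Python sketch of the same idea under stated assumptions (a nonaffine NGARCH-style variance update with normal shocks and purely illustrative daily parameters), Monte Carlo pricing of a European call might look like this.

```python
import numpy as np

def garch_mc_call_price(S0, K, r, T_days, omega, alpha, beta, lam, h0,
                        n_paths=100_000, seed=1):
    """Monte Carlo price of a European call under a risk-neutral GARCH(1,1)
    dynamic with normal shocks (NGARCH-style leverage parameter lam)."""
    rng = np.random.default_rng(seed)
    log_s = np.full(n_paths, np.log(S0))
    h = np.full(n_paths, h0)                 # conditional variance per path
    for _ in range(T_days):
        z = rng.standard_normal(n_paths)
        log_s += r - 0.5 * h + np.sqrt(h) * z
        h = omega + beta * h + alpha * h * (z - lam) ** 2
    payoff = np.maximum(np.exp(log_s) - K, 0.0)
    return np.exp(-r * T_days) * payoff.mean()

# Hypothetical daily parameters, 60 trading days to maturity.
print(garch_mc_call_price(S0=100, K=100, r=0.0002, T_days=60,
                          omega=1e-6, alpha=0.05, beta=0.90, lam=0.5, h0=1e-4))
```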

  6. Theory- and Evidence- Based Intervention: Practice-Based Evidence--Integrating Positive Psychology into a Clinical Psychological Assessment and Intervention Model and How to Measure Outcome

    Science.gov (United States)

    Nissen, Poul

    2011-01-01

In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence-based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning, which includes recognition of the influence of community, school, peers, family and the…

  7. Using the Dynamic Model to develop an evidence-based and theory-driven approach to school improvement

    NARCIS (Netherlands)

    Creemers, B.P.M.; Kyriakides, L.

    2010-01-01

    This paper refers to a dynamic perspective of educational effectiveness and improvement stressing the importance of using an evidence-based and theory-driven approach. Specifically, an approach to school improvement based on the dynamic model of educational effectiveness is offered. The recommended

  8. A review of the evidence linking adult attachment theory and chronic pain: presenting a conceptual model.

    Science.gov (United States)

    Meredith, Pamela; Ownsworth, Tamara; Strong, Jenny

    2008-03-01

    It is now well established that pain is a multidimensional phenomenon, affected by a gamut of psychosocial and biological variables. According to diathesis-stress models of chronic pain, some individuals are more vulnerable to developing disability following acute pain because they possess particular psychosocial vulnerabilities which interact with physical pathology to impact negatively upon outcome. Attachment theory, a theory of social and personality development, has been proposed as a comprehensive developmental model of pain, implicating individual adult attachment pattern in the ontogenesis and maintenance of chronic pain. The present paper reviews and critically appraises studies which link adult attachment theory with chronic pain. Together, these papers offer support for the role of insecure attachment as a diathesis (or vulnerability) for problematic adjustment to pain. The Attachment-Diathesis Model of Chronic Pain developed from this body of literature, combines adult attachment theory with the diathesis-stress approach to chronic pain. The evidence presented in this review, and the associated model, advances our understanding of the developmental origins of chronic pain conditions, with potential application in guiding early pain intervention and prevention efforts, as well as tailoring interventions to suit specific patient needs.

  9. Local computations in Dempster-Shafer theory of evidence

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2012-01-01

Roč. 53, č. 8 (2012), s. 1155-1167 ISSN 0888-613X Grant - others:GA ČR(CZ) GAP403/12/2175 Program:GA Institutional support: RVO:67985556 Keywords : Discrete belief functions * Dempster-Shafer theory * conditional independence * decomposable model Subject RIV: IN - Informatics, Computer Science Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf

  10. Stock portfolio selection using Dempster–Shafer evidence theory

    Directory of Open Access Journals (Sweden)

    Gour Sundar Mitra Thakur

    2018-04-01

Full Text Available Markowitz’s return–risk model for stock portfolio selection is based on the historical return data of assets. In addition to historical returns, many other critical factors directly or indirectly influence the stock market. We first use the fuzzy Delphi method to identify the critical factors; factors with lower correlation coefficients are then retained for further analysis. The critical factors and historical data are used with Dempster–Shafer evidence theory to rank the stocks, and a portfolio selection model that prefers higher-ranked stocks is proposed. The approach is illustrated using stocks listed on the Bombay Stock Exchange (BSE), with simulation carried out by Ant Colony Optimization. The performance of the resulting portfolio is satisfactory when compared with the recent performance of the assets. Keywords: Stock portfolio selection, Ranking, Dempster–Shafer evidence theory, Ant Colony Optimization, Fuzzy Delphi method

  11. An Integrated Decision-Making Model for Transformer Condition Assessment Using Game Theory and Modified Evidence Combination Extended by D Numbers

    Directory of Open Access Journals (Sweden)

    Lingjie Sun

    2016-08-01

Full Text Available The power transformer is one of the most critical and expensive components for the stable operation of the power system. Hence, assessing the health condition of transformers is of great importance for power utilities. Multi-attribute decision-making (MADM), due to its ability to handle multi-source information problems, has become an effective tool for evaluating the health condition of transformers. Currently, the analytic hierarchy process (AHP) and Dempster–Shafer theory are two popular methods for solving MADM problems; however, these techniques rarely address the one-sidedness of a single weighting method or the exclusiveness hypothesis of Dempster–Shafer theory. To overcome these limitations, this paper introduces a novel decision-making model, which integrates the merits of fuzzy set theory, game theory and modified evidence combination extended by D numbers, to evaluate the health condition of transformers. A four-level framework, which includes three factors and seventeen sub-factors, is put forward to facilitate the evaluation model. The model proceeds as follows: first, fuzzy set theory is employed to obtain the original basic probability assignments for all indices. Second, the subjective and objective weights of the indices, calculated by fuzzy AHP and the entropy weight method respectively, are integrated into comprehensive weights based on game theory. Finally, building on these two steps, the modified evidence combination extended by D numbers, which avoids the limitation of the exclusiveness hypothesis in the application of Dempster–Shafer theory, is used to obtain the final assessment results for the transformers. Case studies are given to demonstrate the proposed modeling process. The results show the effectiveness and engineering practicability of the model in transformer condition assessment.
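
One ingredient of the model, the entropy weight method used for the objective index weights, is simple enough to sketch. The example below is a generic implementation with a hypothetical score matrix; it is not the paper's full fuzzy-AHP/game-theory combination.

```python
import numpy as np

def entropy_weights(scores):
    """Entropy weight method: indices whose values vary more across the
    alternatives carry more information and receive larger objective weights.
    Assumes strictly positive scores (no zero entries)."""
    X = np.asarray(scores, dtype=float)
    P = X / X.sum(axis=0)                     # column-wise proportions
    n = X.shape[0]
    entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)
    divergence = 1.0 - entropy
    return divergence / divergence.sum()

# Hypothetical matrix: four transformers (rows) scored on three indices (columns).
scores = [[0.8, 0.6, 0.9],
          [0.7, 0.4, 0.8],
          [0.9, 0.5, 0.7],
          [0.6, 0.7, 0.6]]
print(entropy_weights(scores))
```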

  12. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    Directory of Open Access Journals (Sweden)

    Kaijuan Yuan

    2016-01-01

Full Text Available Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis because it efficiently combines evidence from different sensors. However, when the evidence is highly conflicting, it may yield counterintuitive results. To address this issue, a new method is proposed in this paper that takes both the static and the dynamic sensor reliability into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to the evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion illustrates the efficiency of the proposed method: the accuracy of fault diagnosis improves from 81.19% to 89.48% compared with existing methods.
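
The baseline operation the paper modifies is Dempster's rule of combination. A minimal sketch of the classical rule for two sensor reports is given below; the fault hypotheses and mass values are hypothetical, and the paper's reliability-based weighting of evidence is not shown.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal elements and
    renormalise away the mass assigned to conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined.")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical sensor reports over fault hypotheses F1, F2, F3.
sensor1 = {frozenset({"F1"}): 0.6,
           frozenset({"F1", "F2"}): 0.3,
           frozenset({"F1", "F2", "F3"}): 0.1}
sensor2 = {frozenset({"F1"}): 0.5,
           frozenset({"F2"}): 0.3,
           frozenset({"F1", "F2", "F3"}): 0.2}
print(dempster_combine(sensor1, sensor2))
```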

  13. A spatial Mankiw-Romer-Weil model: Theory and evidence

    OpenAIRE

    Fischer, Manfred M.

    2009-01-01

    This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...

  14. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

Full Text Available Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the aim of enabling rational decision making. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
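
For the classical (non-intuitionistic) case, a common way to turn a belief function into a probability distribution for decision making is the pignistic transform, which spreads each focal element's mass uniformly over its members. The sketch below uses a hypothetical BPA and is only an analogue of the probability estimation discussed in the paper, not its IF-specific method.

```python
def pignistic_probability(bpa):
    """Pignistic transform: distribute each focal element's mass equally
    among its members (assumes no mass is assigned to the empty set)."""
    prob = {}
    for focal, mass in bpa.items():
        share = mass / len(focal)
        for element in focal:
            prob[element] = prob.get(element, 0.0) + share
    return prob

# Hypothetical BPA over three hypotheses.
bpa = {frozenset({"a"}): 0.5,
       frozenset({"a", "b"}): 0.3,
       frozenset({"a", "b", "c"}): 0.2}
print(pignistic_probability(bpa))  # a ~0.72, b ~0.22, c ~0.07
```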

  15. Book Review: Market Liquidity: Theory, Evidence, and Policy

    DEFF Research Database (Denmark)

    Boscan, Luis

    2014-01-01

Review of: Market Liquidity: Theory, Evidence, and Policy / by Thierry Foucault, Marco Pagano and Ailsa Röell. Oxford University Press. April 2013.

  16. Combat Risk and Pay: Theory and Some Evidence

    Science.gov (United States)

    2011-10-01

…(1776) theory of compensating differences, and Rosen (1986) devised what has become the standard neoclassical economic theory relating wages to the… [Remainder of record is report front matter: Institute for Defense Analyses, IDA Paper P-4774, October 2011, "Combat Risk and Pay: Theory and Some Evidence," Curtis J. Simon.]

  17. A theory of evidence for undeclared nuclear activities

    International Nuclear Information System (INIS)

    King, J.L.

    1995-01-01

    The IAEA has recently explored techniques to augment and improve its existing safeguards information systems as part of Program 93 + 2 in order to address the detection of undeclared activities. Effective utilization of information on undeclared activities requires a formulation of the relationship between the information being gathered and the resulting safeguards assurance. The process of safeguards is represented as the gathering of evidence to provide assurance that no undeclared activities take place. It is shown that the analysis of this process can be represented by a theory grounded in the Dempster-Shafer theory of evidence and the concept of possibility. This paper presents the underlying evidence theory required to support a new information system tool for the analysis of information with respect to undeclared activities. The Dempster-Shafer theory serves as the calculus for the combination of diverse sources of evidence, and when applied to safeguards information, provides a basis for interpreting the result of safeguards indicators and measurements -- safeguards assurance

  18. Evidence Combination From an Evolutionary Game Theory Perspective.

    Science.gov (United States)

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. The theory provides Dempster's rule of combination to synthesize multiple pieces of evidence from various information sources. However, in some cases, counter-intuitive results may be obtained from that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results from perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper takes a biological and evolutionary perspective on the combination of evidence. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidence, and we use replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
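
The evolutionary ingredient of the ECR is replicator dynamics, under which the share of each proposition grows when its fitness exceeds the population average. A generic discrete-time sketch is shown below, with a hypothetical payoff matrix standing in for the paper's Jaccard matrix game; it is not the authors' exact rule.

```python
import numpy as np

def replicator_dynamics(payoff, x0, steps=200, dt=0.05):
    """Discrete-time replicator dynamics: a proposition's share grows when its
    fitness (payoff against the current mix) exceeds the population average."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        fitness = payoff @ x
        x = x + dt * x * (fitness - x @ fitness)
        x = np.clip(x, 0.0, None)
        x /= x.sum()
    return x

# Hypothetical payoff matrix (e.g., pairwise similarity between propositions)
# and initial support levels taken from the evidence.
payoff = np.array([[1.0, 0.6, 0.1],
                   [0.6, 1.0, 0.2],
                   [0.1, 0.2, 1.0]])
print(replicator_dynamics(payoff, x0=[0.4, 0.4, 0.2]))
```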

  19. Evidence accumulation as a model for lexical selection.

    Science.gov (United States)

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories treat lexical selection as the process of selecting a lexical target from a number of alternatives, each of which has a varying activation (or signal support) largely resulting from initial stimulus recognition. We present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related to or combined with conventional psycholinguistic theories and their simulatory instantiations (generally, neural network models). Then, with a demonstrative application to a large new real data set, we establish how the empirical evidence accumulation approach provides parameter estimates that are informative to leading psycholinguistic theory and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
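
A minimal way to see how evidence accumulation can implement lexical selection is a set of noisy racing accumulators, one per candidate word, with the first to reach a threshold winning. The sketch below uses hypothetical drift rates and is a simplified stand-in for the sequential-sampling models discussed in the paper.

```python
import numpy as np

def race_to_threshold(drift_rates, threshold=1.0, dt=0.01, noise_sd=0.1, seed=0):
    """Noisy racing accumulators: each candidate word gathers evidence at its
    own drift rate, and the first accumulator to reach the threshold wins."""
    rng = np.random.default_rng(seed)
    drifts = np.array(drift_rates, dtype=float)
    evidence = np.zeros(len(drifts))
    t = 0.0
    while evidence.max() < threshold:
        evidence += drifts * dt + rng.normal(0.0, noise_sd * np.sqrt(dt), len(drifts))
        t += dt
    return int(evidence.argmax()), t  # selected candidate and response time

# Hypothetical drift rates for a target word and two competitors.
choice, rt = race_to_threshold([1.2, 0.8, 0.5])
print(f"selected candidate {choice} after {rt:.2f} s")
```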

  20. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  1. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of placing preventive fissure sealants.

    Science.gov (United States)

    Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne

    2010-04-08

    Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour

  2. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of placing preventive fissure sealants

    Directory of Open Access Journals (Sweden)

    Maclennan Graeme

    2010-04-01

Full Text Available Abstract Background Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Methods Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Results Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%; CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. Summary The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for

  3. Self Modeling: Expanding the Theories of Learning

    Science.gov (United States)

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  4. Computational mate choice: theory and empirical evidence.

    Science.gov (United States)

    Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo

    2012-06-01

    The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify which are the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules), and review the normative models of mate sampling tactics associated to them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. Since they have a robust biological basis, we think they may represent a useful theoretical tool for

  5. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  6. Condition Evaluation of Storage Equipment Based on Improved D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Zhang Xiao-yu

    2017-01-01

Full Text Available Assessment and prediction of storage equipment condition has always been a difficult aspect of PHM technology. Current condition evaluation of equipment lacks graded state levels, and a single test datum cannot reflect changes in the equipment's state. To solve this problem, this paper proposes an evaluation method based on improved D-S evidence theory. First, the analytic hierarchy process (AHP) is used to establish a hierarchical structure model of the equipment and to divide the qualified state into four grades. The test data are then compared with the previous test value, the historical test mean and the standard value, respectively. A triangular fuzzy membership function is used to calculate the membership degree of each index, and D-S evidence theory fuses the information from multiple sources to achieve real-time state assessment of the equipment. Finally, the model is applied to a servo mechanism. The results show that this method performs well in condition evaluation for storage equipment.
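
The triangular fuzzy membership step mentioned above is straightforward to sketch; the grade boundaries below are hypothetical, not the paper's.

```python
def triangular_membership(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set with support [a, c]
    and peak at b (a < b < c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical "good condition" grade defined over a normalised test score.
print(triangular_membership(0.72, a=0.5, b=0.75, c=1.0))  # ~0.88
```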

  7. Five roles for using theory and evidence in the design and testing of behavior change interventions.

    Science.gov (United States)

    Bartholomew, L Kay; Mullen, Patricia Dolan

    2011-01-01

    The prevailing wisdom in the field of health-related behavior change is that well-designed and effective interventions are guided by theory. Using the framework of intervention mapping, we describe and provide examples of how investigators can effectively select and use theory to design, test, and report interventions. We propose five roles for theory and evidence about theories: a) identification of behavior and determinants of behavior related to a specified health problem (i.e., the logic model of the problem); b) explication of a causal model that includes theoretical constructs for producing change in the behavior of interest (i.e., the logic model of change); c) selection of intervention methods and delivery of practical applications to achieve changes in health behavior; d) evaluation of the resulting intervention including theoretical mediating variables; and e) reporting of the active ingredients of the intervention together with the evaluation results. In problem-driven applied behavioral or social science, researchers use one or multiple theories, empiric evidence, and new research, both to assess a problem and to solve or prevent a problem. Furthermore, the theories for description of the problem may differ from the theories for its solution. In an applied approach, the main focus is on solving problems regarding health behavior change and improvement of health outcomes, and the criteria for success are formulated in terms of the problem rather than the theory. Resulting contributions to theory development may be quite useful, but they are peripheral to the problem-solving process.

  8. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
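
The step of folding an aleatory variable into a Dempster-Shafer structure by discretizing its distribution into intervals can be sketched as follows; the normal distribution, interval count and coverage are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy.stats import norm

def discretize_to_ds_structure(mean, sd, n_intervals=5, coverage=0.999):
    """Approximate an aleatory (normal) variable by a Dempster-Shafer structure:
    equal-probability slices of the truncated distribution, each interval
    carrying its (renormalised) probability as mass."""
    probs = np.linspace((1 - coverage) / 2, (1 + coverage) / 2, n_intervals + 1)
    edges = norm.ppf(probs, loc=mean, scale=sd)
    masses = np.diff(probs) / coverage       # renormalise the clipped tails away
    return [((edges[i], edges[i + 1]), masses[i]) for i in range(n_intervals)]

# Hypothetical aleatory input: N(10, 2) discretized into five focal intervals.
for (lo, hi), m in discretize_to_ds_structure(10.0, 2.0):
    print(f"[{lo:.2f}, {hi:.2f}]  mass = {m:.3f}")
```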

  9. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 1: Building a Foundation

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-06-01

    Full Text Available Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice.Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of

  10. A Novel Evaluation Model for Hybrid Power System Based on Vague Set and Dempster-Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Dongxiao Niu

    2012-01-01

Full Text Available Because clean energy and traditional energy have different advantages and disadvantages, evaluating the comprehensive benefits of hybrid power systems is of great significance. Based on a thorough analysis of the important characteristics of hybrid power systems, an index system covering security, economic benefit, environmental benefit, and social benefit is established in this paper. Because vague sets are well suited to processing abundant uncertain and fuzzy information, they are used to determine the decision matrix. The vague decision matrix is converted to a real-valued one by the vague combination rule, the uncertainty degrees of the indexes are determined by grey incidence analysis, and the mass functions of the comment sets for each index are then obtained. Information is fused according to the Dempster-Shafer (D-S) combination rule, and the evaluation result is obtained by combining vague sets with D-S evidence theory. A simulation of a hybrid power system in China comprising thermal, wind, and photovoltaic power demonstrates the effectiveness and potential of the proposed scheme. The uncertainties in decision making are markedly reduced compared with existing methods in the literature. The results illustrate that the proposed index system and evaluation model based on vague sets and D-S evidence theory are effective and practical for evaluating the comprehensive benefit of hybrid power systems.

  11. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems

  12. Intervention planning for a digital intervention for self-management of hypertension: a theory-, evidence- and person-based approach.

    Science.gov (United States)

    Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy

    2017-02-23

    This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). Causal

  13. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  14. State of the evidence regarding behavior change theories and strategies in nutrition counseling to facilitate health and food behavior change.

    Science.gov (United States)

    Spahn, Joanne M; Reeves, Rebecca S; Keim, Kathryn S; Laquatra, Ida; Kellogg, Molly; Jortberg, Bonnie; Clark, Nicole A

    2010-06-01

    Behavior change theories and models, validated within the field of dietetics, offer systematic explanations for nutrition-related behavior change. They are integral to the nutrition care process, guiding nutrition assessment, intervention, and outcome evaluation. The American Dietetic Association Evidence Analysis Library Nutrition Counseling Workgroup conducted a systematic review of peer-reviewed literature related to behavior change theories and strategies used in nutrition counseling. Two hundred fourteen articles were reviewed between July 2007 and March 2008, and 87 studies met the inclusion criteria. The workgroup systematically evaluated these articles and formulated conclusion statements and grades based upon the available evidence. Strong evidence exists to support the use of a combination of behavioral theory and cognitive behavioral theory, the foundation for cognitive behavioral therapy (CBT), in facilitating modification of targeted dietary habits, weight, and cardiovascular and diabetes risk factors. Evidence is particularly strong in patients with type 2 diabetes receiving intensive, intermediate-duration (6 to 12 months) CBT, and long-term (>12 months duration) CBT targeting prevention or delay in onset of type 2 diabetes and hypertension. Few studies have assessed the application of the transtheoretical model on nutrition-related behavior change. Little research was available documenting the effectiveness of nutrition counseling utilizing social cognitive theory. Motivational interviewing was shown to be a highly effective counseling strategy, particularly when combined with CBT. Strong evidence substantiates the effectiveness of self-monitoring and meal replacements and/or structured meal plans. Compelling evidence exists to demonstrate that financial reward strategies are not effective. Goal setting, problem solving, and social support are effective strategies, but additional research is needed in more diverse populations. Routine documentation

  15. Applying psychological theories to evidence-based clinical practice: Identifying factors predictive of managing upper respiratory tract infections without antibiotics

    Directory of Open Access Journals (Sweden)

    Glidewell Elizabeth

    2007-08-01

Full Text Available Abstract Background Psychological models can be used to understand and predict behaviour in a wide range of settings. However, they have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. The aim of this study was to explore the usefulness of a range of psychological theories to predict health professional behaviour relating to management of upper respiratory tract infections (URTIs) without antibiotics. Methods Psychological measures were collected by postal questionnaire survey from a random sample of general practitioners (GPs) in Scotland. The outcome measures were clinical behaviour (using antibiotic prescription rates as a proxy indicator), behavioural simulation (scenario-based decisions to manage URTI with or without antibiotics) and behavioural intention (general intention to manage URTI without antibiotics). Explanatory variables were the constructs within the following theories: Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model (SM), and knowledge (a non-theoretical construct). For each outcome measure, multiple regression analysis was used to examine the predictive value of each theoretical model individually. Following this 'theory level' analysis, a 'cross theory' analysis was conducted to investigate the combined predictive value of all significant individual constructs across theories. Results When predicting behaviour, at the theory level, OLT explained 6% of the variance and, in a cross theory analysis, OLT 'evidence of habitual behaviour' also explained 6%. When predicting behavioural simulation, at the theory level, the proportion of variance explained was: TPB, 31%; SCT, 26%; II, 6%; OLT, 24%. GPs who reported having already decided to change their management to

  16. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the six social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  17. Taking Root: a grounded theory on evidence-based nursing implementation in China.

    Science.gov (United States)

    Cheng, L; Broome, M E; Feng, S; Hu, Y

    2018-06-01

Evidence-based nursing is widely recognized as the critical foundation for quality care. The aim was to develop a middle-range theory on the process of evidence-based nursing implementation in the Chinese context. A grounded theory study using unstructured in-depth individual interviews was conducted with 56 participants who were involved in 24 evidence-based nursing implementation projects in Mainland China from September 2015 to September 2016. A middle-range grounded theory of 'Taking Root' was developed. The theory describes the evidence implementation process as consisting of four components (driving forces, process, outcome, sustainment/regression), three approaches (top-down, bottom-up and outside-in), four implementation strategies (patient-centred, nurses at the heart of change, reaching agreement, collaboration) and two patterns (transformational and adaptive implementation). Certain perspectives may not have been captured, as the retrospective nature of the interviewing technique did not allow for 'real-time' assessment of the actual implementation process. The transferability of the findings requires further exploration, as few participants with negative experiences were recruited. This is the first study to explore the evidence-based implementation process, strategies, approaches and patterns in the Chinese nursing practice context to inform international nursing and health policymaking. The theory of Taking Root describes various approaches to evidence implementation and how implementation can be transformational for the nurses and the settings in which they work. Nursing educators, managers and researchers should work together to improve nurses' readiness for evidence implementation. Healthcare systems need to optimize internal mechanisms and external collaborations to promote nursing practice in line with evidence and achieve clinical outcomes and sustainability. © 2017 International Council of Nurses.

  18. Faster-X evolution: Theory and evidence from Drosophila.

    Science.gov (United States)

    Charlesworth, Brian; Campos, José L; Jackson, Benjamin C

    2018-02-12

    A faster rate of adaptive evolution of X-linked genes compared with autosomal genes can be caused by the fixation of recessive or partially recessive advantageous mutations, due to the full expression of X-linked mutations in hemizygous males. Other processes, including recombination rate and mutation rate differences between X chromosomes and autosomes, may also cause faster evolution of X-linked genes. We review population genetics theory concerning the expected relative values of variability and rates of evolution of X-linked and autosomal DNA sequences. The theoretical predictions are compared with data from population genomic studies of several species of Drosophila. We conclude that there is evidence for adaptive faster-X evolution of several classes of functionally significant nucleotides. We also find evidence for potential differences in mutation rates between X-linked and autosomal genes, due to differences in mutational bias towards GC to AT mutations. Many aspects of the data are consistent with the male hemizygosity model, although not all possible confounding factors can be excluded. © 2018 John Wiley & Sons Ltd.

  19. Multi-Sensor Building Fire Alarm System with Information Fusion Technology Based on D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Qian Ding

    2014-10-01

Full Text Available Multi-sensor information fusion technology based on Dempster-Shafer evidence theory is applied to a building fire alarm system to realize early detection and alarm. By using multiple sensors to monitor the parameters of the fire process, such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. The D-S evidence theory is then applied to fuse the information from the multiple sensors with a specific fire model, making the fire alarm more accurate and timely. The proposed method can effectively avoid failures of the monitoring data, deal robustly with conflicting evidence from the multiple sensors, and significantly improve the reliability of fire warning.

  20. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  1. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  2. Collateral and the limits of debt capacity: theory and evidence

    NARCIS (Netherlands)

    Giambona, E.; Mello, A.S.; Riddiough, T.

    2012-01-01

    This paper considers how collateral is used to finance a going concern, and demonstrates with theory and evidence that there are effective limits to debt capacity and the kinds of claims that are issued to deploy that debt capacity. The theory shows that firms with (unobservably) better quality

  3. A Model for Evidence Accumulation in the Lexical Decision Task

    Science.gov (United States)

    Wagenmakers, Eric-Jan; Steyvers, Mark; Raaijmakers, Jeroen G. W.; Shiffrin, Richard M.; van Rijn, Hedderik; Zeelenberg, Rene

    2004-01-01

    We present a new model for lexical decision, REM-LD, that is based on REM theory (e.g., Shiffrin & Steyvers, 1997). REM-LD uses a principled (i.e., Bayes' rule) decision process that simultaneously considers the diagnosticity of the evidence for the 'WORD' response and the 'NONWORD' response. The model calculates the odds ratio that the presented…

  4. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…

  5. Organizational dimensions of relationship-centered care. Theory, evidence, and practice.

    Science.gov (United States)

    Safran, Dana Gelb; Miller, William; Beckman, Howard

    2006-01-01

    Four domains of relationship have been highlighted as the cornerstones of relationship-centered health care. Of these, clinician-patient relationships have been most thoroughly studied, with a rich empirical literature illuminating significant linkages between clinician-patient relationship quality and a wide range of outcomes. This paper explores the realm of clinician-colleague relationships, which we define to include the full array of relationships among clinicians, staff, and administrators in health care organizations. Building on a stream of relevant theories and empirical literature that have emerged over the past decade, we synthesize available evidence on the role of organizational culture and relationships in shaping outcomes, and posit a model of relationship-centered organizations. We conclude that turning attention to relationship-centered theory and practice in health care holds promise for advancing care to a new level, with breakthroughs in quality of care, quality of life for those who provide it, and organizational performance.

  6. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can satisfy all the phenomenological constraints while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.

  7. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    Science.gov (United States)

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Children Balance Theories and Evidence in Exploration, Explanation, and Learning

    Science.gov (United States)

    Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J. P.; Friel, Daniel; Schulz, Laura

    2012-01-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who…

  9. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  10. Assessing landslide susceptibility by applying fuzzy sets, possibility evidence-based theories

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-01-01

    Full Text Available A landslide susceptibility model was developed for the city of Manizales, Colombia; landslides have been the city’s main environmental problem. Fuzzy sets and possibility and evidence-based theories were used to construct the model due to the set of circumstances and uncertainty involved in the modelling; uncertainty particularly concerned the lack of representative data and the need for systematically coordinating subjective information. Susceptibility and the uncertainty were estimated via data processing; the model contained data concerning mass vulnerability and uncertainty. Output data was expressed on a map defined by linguistic categories or uncertain labels as having low, medium, high and very high susceptibility; this was considered appropriate for representing susceptibility. A fuzzy spectrum was developed for classifying susceptibility levels according to perception and expert opinion. The model showed levels of susceptibility in the study area, ranging from low to high susceptibility (medium susceptibility being more frequent). This article shows the details concerning systematic data processing by presenting theories and tools regarding uncertainty. The concept of fuzzy parameters is introduced; this is useful in modelling phenomena regarding uncertainty, complexity and nonlinear performance, showing that susceptibility modelling can be feasible. The paper also shows the great convenience of incorporating uncertainty into modelling and decision-making. However, quantifying susceptibility is not suitable when modelling identified uncertainty because model output information cannot be reduced to exact or real numerical quantities when the nature of the variables is particularly uncertain. The latter concept is applicable to risk assessment.
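
    As a concrete illustration of the kind of linguistic classification the abstract describes, the sketch below maps a numeric susceptibility score onto the four labels through trapezoidal fuzzy membership functions. The break points and the example score are hypothetical placeholders, not the calibrated fuzzy spectrum elicited from the experts in the study.

    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear in between."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    # Hypothetical fuzzy spectrum over a susceptibility score in [0, 1];
    # the break points are placeholders, not the expert-elicited values.
    LABELS = {
        "low":       (-0.10, 0.00, 0.20, 0.35),
        "medium":    (0.20, 0.35, 0.55, 0.70),
        "high":      (0.55, 0.70, 0.85, 0.95),
        "very high": (0.85, 0.95, 1.00, 1.10),
    }

    def classify(score):
        """Return the degree of membership of a score in each linguistic label."""
        return {label: round(trapezoid(score, *pts), 2) for label, pts in LABELS.items()}

    print(classify(0.62))   # partial membership in "medium" and "high"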

  11. How Often Is the Misfit of Item Response Theory Models Practically Significant?

    Science.gov (United States)

    Sinharay, Sandip; Haberman, Shelby J.

    2014-01-01

    Standard 3.9 of the Standards for Educational and Psychological Testing (1999) demands evidence of model fit when item response theory (IRT) models are employed to data from tests. Hambleton and Han (2005) and Sinharay (2005) recommended the assessment of practical significance of misfit of IRT models, but…

  12. Some empirical evidence for ecological dissonance theory.

    Science.gov (United States)

    Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E

    2000-04-01

    Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions which were tested in 10 studies. This summary of the studies suggests operationally defined measures of ecological dissonance may correlate with workers' satisfaction with their jobs, involvement with their jobs, alienation from their work, and to a lesser extent, workers' conflict resolution behavior and communication style.

  13. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    Science.gov (United States)

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. © The Author(s) 2015.

  14. Dark energy observational evidence and theoretical models

    CERN Document Server

    Novosyadlyj, B; Shtanov, Yu; Zhuk, A

    2013-01-01

    The book elucidates the current state of the dark energy problem and presents the results of the authors, who work in this area. It describes the observational evidence for the existence of dark energy, the methods and results of constraining its parameters, modeling of dark energy by scalar fields, the space-times with extra spatial dimensions, especially Kaluza-Klein models, the braneworld models with a single extra dimension, as well as the problems of positive definition of gravitational energy in General Relativity, energy conditions and consequences of their violation in the presence of dark energy. This monograph is intended for science professionals, educators and graduate students specializing in general relativity, cosmology, field theory and particle physics.

  15. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  16. A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators

    Science.gov (United States)

    Bowen, Danny Ray

    2012-01-01

    The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…

  17. Theory, evidence and Intervention Mapping to improve behavior nutrition and physical activity interventions.

    NARCIS (Netherlands)

    J. Brug (Hans); A. Oenema (Anke); A. Ferreira (Isabel)

    2005-01-01

    BACKGROUND: The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. DISCUSSION: Since behavior theory is a reflection of the compiled evidence of behavior research, theory is

  18. TIM Series: Theory, Evidence and the Pragmatic Manager

    Directory of Open Access Journals (Sweden)

    Steven Muegge

    2008-08-01

    Full Text Available On July 2, 2008, Steven Muegge from Carleton University delivered a presentation entitled "Theory, Evidence and the Pragmatic Manager". This section provides the key messages from the lecture. The scope of this lecture spanned several topics, including management decision making, forecasting and its limitations, the psychology of expertise, and the management of innovation.

  19. From Theory to Practice: One Agency's Experience with Implementing an Evidence-Based Model.

    Science.gov (United States)

    Murray, Maureen; Culver, Tom; Farmer, Betsy; Jackson, Leslie Ann; Rixon, Brian

    2014-07-01

    As evidence-based practice becomes integrated into children's mental health services as a means of improving outcomes for children and youth with severe behavioral and emotional problems, therapeutic foster care (TFC), a specialized treatment program for such youth, is one of the few community-based programs considered to be evidence-based. "Together Facing the Challenge" (TFTC), which was developed as a component of a randomized trial of TFC, has been identified as an evidence-based model. We describe the experiences reported by one of the agencies that participated in our study and how it has incorporated TFTC into its ongoing practice. The agency highlights key implementation strategies, challenges faced, and lessons learned as it moved toward full implementation of TFTC throughout the organization.

  20. The Influence of Emotion on Fairness-Related Decision Making: A Critical Review of Theories and Evidence

    Directory of Open Access Journals (Sweden)

    Ya Zheng

    2017-09-01

    Full Text Available Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.

  1. The Influence of Emotion on Fairness-Related Decision Making: A Critical Review of Theories and Evidence

    Science.gov (United States)

    Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun

    2017-01-01

    Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm. PMID:28974937

  2. The Influence of Emotion on Fairness-Related Decision Making: A Critical Review of Theories and Evidence.

    Science.gov (United States)

    Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun

    2017-01-01

    Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.

  3. Little Hans and attachment theory: Bowlby's hypothesis reconsidered in light of new evidence from the Freud Archives.

    Science.gov (United States)

    Wakefield, Jerome C

    2007-01-01

    Bowlby (1973), applying attachment theory to Freud's case of Little Hans, hypothesized that Hans's anxiety was a manifestation of anxious attachment. However, Bowlby's evidence was modest; Hans was threatened by his mother with abandonment, expressed fear of abandonment prior to symptom onset, and was separated from his mother for a short time a year before. Bowlby's hypothesis is reassessed in light of a systematic review of the case record as well as new evidence from recently derestricted interviews with Hans's father and Hans in the Freud Archives. Bowlby's hypothesis is supported by multiple additional lines of evidence regarding both triggers of separation anxiety preceding the phobia (e.g., a funeral, sibling rivalry, moving, getting his own bedroom) and background factors influencing his working model of attachment (mother's psychopathology, intense marital conflict, multiple suicides in mother's family) that would make him more vulnerable to such anxiety. Bowlby's hypothesis is also placed within the context of subsequent developments in attachment theory.

  4. Conceptual Foundations of Quantum Mechanics:. the Role of Evidence Theory, Quantum Sets, and Modal Logic

    Science.gov (United States)

    Resconi, Germano; Klir, George J.; Pessa, Eliano

    Recognizing that syntactic and semantic structures of classical logic are not sufficient to understand the meaning of quantum phenomena, we propose in this paper a new interpretation of quantum mechanics based on evidence theory. The connection between these two theories is obtained through a new language, quantum set theory, built on a suggestion by J. Bell. Further, we give a modal logic interpretation of quantum mechanics and quantum set theory by using Kripke's semantics of modal logic based on the concept of possible worlds. This is grounded on previous work of a number of researchers (Resconi, Klir, Harmanec) who showed how to represent evidence theory and other uncertainty theories in terms of modal logic. Moreover, we also propose a reformulation of the many-worlds interpretation of quantum mechanics in terms of Kripke's semantics. We thus show how three different theories — quantum mechanics, evidence theory, and modal logic — are interrelated. This opens, on one hand, the way to new applications of quantum mechanics within domains different from the traditional ones, and, on the other hand, the possibility of building new generalizations of quantum mechanics itself.

  5. Load theory behind the wheel: an experimental application of a cognitive model to simulated driving

    OpenAIRE

    Murphy, Gillian

    2017-01-01

    Load Theory is a prominent model of selective attention first proposed over twenty years ago and supported by a great many experimental and neuroimaging studies. There is, however, little evidence that Load Theory can be applied to real-world attention, though it has great practical potential. Driving, as an everyday task where failures of attention can have profound consequences, stands to benefit from the understanding of selective attention that Load Theory provides. The aim of ...

  6. Towards an understanding of the large-U Hubbard model and a theory for high-temperature superconductors

    International Nuclear Information System (INIS)

    Hsu, T.C.T.

    1989-01-01

    This thesis describes work on a large-U Hubbard model theory for high-temperature superconductors. After an introduction to recent developments in the field, the author reviews experimental results. At the same time he introduces the holon-spinon model and comments on its successes and shortcomings. Using this heuristic model he then describes a holon pairing theory of superconductivity and lists some experimental evidence for this interlayer coupling theory. The latter part of the thesis is devoted to projected fermion mean field theories. They are introduced by applying this theory and some recently developed computational techniques to anisotropic antiferromagnets. This scheme is shown to give quantitatively good results for the two-dimensional square lattice Heisenberg AFM. The results have definite implications for a spinon theory of quantum antiferromagnets. Finally, he studies flux phases and other variational prescriptions for obtaining low-lying states of the Hubbard model

  7. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory, which permits the modeling of item difficulty and examinee ability, and from signal detection theory, which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
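
    The abstract does not spell out the SD-IRT parameterization, so the sketch below only illustrates the two ingredients it links: an equal-variance Gaussian signal detection model of old/new recognition (discrimination d' and criterion c) and a two-parameter logistic item response function (ability theta, discrimination a, difficulty b). All parameter values are hypothetical.

    from math import erf, exp, sqrt

    def phi(z):
        """Standard normal cumulative distribution function."""
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # Equal-variance Gaussian signal detection: familiarity of an "old" item is
    # N(d', 1), of a "new" item N(0, 1); the examinee says "old" above criterion c.
    def hit_rate(d_prime, c):
        return 1.0 - phi(c - d_prime)

    def false_alarm_rate(d_prime, c):
        return 1.0 - phi(c)

    # Two-parameter logistic item response function: probability of a correct
    # response given ability theta, item discrimination a and difficulty b.
    def p_2pl(theta, a, b):
        return 1.0 / (1.0 + exp(-a * (theta - b)))

    print(round(hit_rate(1.5, 0.5), 3), round(false_alarm_rate(1.5, 0.5), 3))
    print(round(p_2pl(theta=0.8, a=1.2, b=0.0), 3))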

  8. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  9. A Synthetic Fusion Rule for Salient Region Detection under the Framework of DS-Evidence Theory

    Directory of Open Access Journals (Sweden)

    Naeem Ayoub

    2018-05-01

    Full Text Available Saliency detection is one of the most valuable research topics in computer vision. It focuses on the detection of the most significant objects/regions in images and reduces the computational time cost of getting the desired information from salient regions. Local saliency detection or common pattern discovery schemes have been actively used by researchers to overcome saliency detection problems. In this paper, we propose a bottom-up saliency fusion method by taking into consideration the importance of DS-Evidence (Dempster–Shafer, DS) theory. Firstly, we calculate saliency maps from different algorithms based on pixel-level, patch-level and region-level methods. Secondly, we fuse the pixels based on the foreground and background information under the framework of DS-Evidence theory (evidence theory allows one to combine evidence from different sources and arrive at a degree of belief that takes into account all the available evidence). Framing image saliency detection in terms of DS-Evidence theory gives better results for saliency prediction. Experiments are conducted on four different publicly available datasets (MSRA, ECSSD, DUT-OMRON and PASCAL-S). Our saliency detection method performs well and shows prominent results compared with state-of-the-art algorithms.

  10. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Dudas, E [Orsay, LPT (France)

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  11. Chern-Simons Theory, Matrix Models, and Topological Strings

    International Nuclear Information System (INIS)

    Walcher, J

    2006-01-01

    This book is a find. Marino meets the challenge of filling in less than 200 pages the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U (∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may

  12. Theory, evidence and Intervention Mapping to improve behavior nutrition and physical activity interventions.

    OpenAIRE

    Brug, Hans; Oenema, Anke; Ferreira, Isabel

    2005-01-01

    Background: The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. Discussion: Since behavior theory is a reflection of the compiled evidence of behavior research, theory is the only foothold we have for the development of behavioral nutrition and physical activity interventions. Application of theory should improve the effectiveness of interventions. However, some of the the...

  13. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  14. Experimental evidence and modelling of drought induced alternative stable soil moisture states

    Science.gov (United States)

    Robinson, David; Jones, Scott; Lebron, Inma; Reinsch, Sabine; Dominguez, Maria; Smith, Andrew; Marshal, Miles; Emmett, Bridget

    2017-04-01

    The theory of alternative stable states in ecosystems is well established in ecology; however, evidence from manipulation experiments supporting the theory is limited. Developing the evidence base is important because it has profound implications for ecosystem management. Here we show evidence of the existence of alternative stable soil moisture states induced by drought in an upland wet heath. We used a long-term (15 yr) climate change manipulation experiment with moderate sustained drought, which degraded the soil structure and thereby reduced the ability of the soil to retain moisture. Moreover, natural intense droughts superimposed themselves on the experiment, causing an unexpected additional alternative soil moisture state to develop in both the drought-manipulation and control plots; this prevented the soil from rewetting in winter. Our results show the coexistence of three stable states. Using modelling with the Hydrus-1D software package, we are able to show the circumstances under which shifts in soil moisture states are likely to occur. Given this new understanding, a challenge remains: how should feedbacks, particularly those related to soil structure, be incorporated into soil flow and transport models?
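
    The abstract mentions Hydrus-1D but not the retention parameters; as a hedged illustration of how a loss of soil structure can be represented in such a model, the sketch below evaluates a van Genuchten water-retention curve (a standard retention function available in Hydrus-1D) for a hypothetical intact soil and a hypothetical degraded soil. All parameter values are invented for illustration.

    def van_genuchten(h, theta_r, theta_s, alpha, n):
        """Volumetric water content at pressure head h (h < 0 means suction, in cm)."""
        if h >= 0:
            return theta_s                                  # saturated soil
        m = 1.0 - 1.0 / n
        return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m

    # Hypothetical parameter sets: structural degradation is represented here
    # simply as a shift in the retention parameters.
    intact   = dict(theta_r=0.10, theta_s=0.55, alpha=0.01, n=1.6)
    degraded = dict(theta_r=0.08, theta_s=0.48, alpha=0.03, n=1.3)

    for h in (-10.0, -100.0, -1000.0):
        print(h, round(van_genuchten(h, **intact), 3), round(van_genuchten(h, **degraded), 3))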

  15. Modeling self on others: An import theory of subjectivity and selfhood.

    Science.gov (United States)

    Prinz, Wolfgang

    2017-03-01

    This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it then becomes imported from other minds to own minds whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices gives rise to model self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Saleur, H.

    1988-01-01

    Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied

  17. Qigong in Cancer Care: Theory, Evidence-Base, and Practice

    Directory of Open Access Journals (Sweden)

    Penelope Klein

    2017-01-01

    Full Text Available Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to wide-spread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle, mind/body exercise integral within Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL, fatigue, immune function and cortisol levels, and cognition for individuals with cancer. There is indirect, scientific evidence suggesting that qigong practice may positively influence cancer prevention and survival. No one Qigong exercise regimen has been established as superior. Effective protocols do have common elements: slow mindful exercise, easy to learn, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Wide-spread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and work force capacity.

  18. Children balance theories and evidence in exploration, explanation, and learning

    NARCIS (Netherlands)

    Bonawitz, E.B.; van Schijndel, T.J.P.; Friel, D.; Schulz, L.

    2012-01-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children,

  19. Gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Witten, E.

    1989-01-01

    Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)

  20. An Ensemble Deep Convolutional Neural Network Model with Improved D-S Evidence Fusion for Bearing Fault Diagnosis.

    Science.gov (United States)

    Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun

    2017-07-28

    Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we propose IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory-based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps from the FFT (Fast Fourier Transform) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via a distance matrix computed from the bodies of evidence and a modified Gini index. Extensive evaluations of the IDSCNN on the Case Western Reserve Dataset showed that our IDSCNN algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidence from different models and sensors and adapting to different load conditions.
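
    The paper's exact distance and Gini-index formulas are not reproduced in this record, so the sketch below only shows the generic pattern that distance-weighted D-S fusion schemes follow: score each body of evidence by its closeness to the others, average the masses with those credibility weights, and then combine with Dempster's rule. The class names, mass values, and the simple L1 distance are all illustrative assumptions, not the method of the paper.

    from itertools import product
    from functools import reduce

    FAULTS = ("ball", "inner_race", "outer_race")           # hypothetical classes

    def single(c):
        return frozenset({c})

    def dempster(m1, m2):
        """Dempster's rule for BPAs given as {frozenset: mass} dictionaries."""
        out, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                out[inter] = out.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
        return {s: v / (1.0 - conflict) for s, v in out.items()}

    def distance(m1, m2):
        """Simple L1 distance between BPAs (a stand-in for the Jousselme distance)."""
        keys = set(m1) | set(m2)
        return 0.5 * sum(abs(m1.get(k, 0.0) - m2.get(k, 0.0)) for k in keys)

    def credibility_weighted_average(bpas):
        """Weight each BPA by its closeness to the others, then average the masses."""
        sims = [sum(1.0 - distance(m, other) for other in bpas if other is not m) for m in bpas]
        weights = [s / sum(sims) for s in sims]
        keys = set().union(*bpas)
        return {k: sum(w * m.get(k, 0.0) for w, m in zip(weights, bpas)) for k in keys}

    # Hypothetical evidences from three classifier/sensor combinations; the second
    # conflicts with the other two and is therefore down-weighted before fusion.
    m1 = {single("ball"): 0.80, single("inner_race"): 0.10, frozenset(FAULTS): 0.10}
    m2 = {single("inner_race"): 0.70, single("ball"): 0.20, frozenset(FAULTS): 0.10}
    m3 = {single("ball"): 0.75, single("inner_race"): 0.15, frozenset(FAULTS): 0.10}

    avg = credibility_weighted_average([m1, m2, m3])
    fused = reduce(dempster, [avg] * 3)      # combine the average with itself n - 1 times
    print({tuple(sorted(s)): round(v, 3) for s, v in fused.items()})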

  1. What is the Role of Legal Systems in Financial Intermediation? Theory and Evidence

    NARCIS (Netherlands)

    Bottazzi, L.; Da Rin, M.; Hellmann, T.

    2008-01-01

    We develop a theory and empirical test of how the legal system affects the relationship between venture capitalists and entrepreneurs. The theory uses a double moral hazard framework to show how optimal contracts and investor actions depend on the quality of the legal system. The empirical evidence

  2. The Current Evidence for Hayek’s Cultural Group Selection Theory

    Directory of Open Access Journals (Sweden)

    Brad Lowell Stone

    2010-12-01

    Full Text Available In this article I summarize Friedrich Hayek’s cultural group selection theory and describe the evidence gathered by current cultural group selection theorists within the behavioral and social sciences supporting Hayek’s main assertions. I conclude with a few comments on Hayek and libertarianism.

  3. Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic

    DEFF Research Database (Denmark)

    Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe

    2008-01-01

    Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results...

  4. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
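
    The published f-theory models come with fitted, temperature-dependent friction functions that are not reproduced in this record; the sketch below is only a minimal illustration of the structure the abstract refers to, splitting an SRK pressure into repulsive and attractive terms and assembling a friction viscosity from them. The friction constants and fluid parameters are placeholders, not values from the papers.

    R = 8.314462618                     # universal gas constant, J/(mol K)

    def srk_pressure_terms(T, v, a, b):
        """Repulsive and attractive pressure contributions of the SRK equation of state."""
        p_rep = R * T / (v - b)
        p_att = -a / (v * (v + b))
        return p_rep, p_att

    def f_theory_viscosity(T, v, a, b, eta0, k_r, k_a, k_rr):
        """Friction-theory structure: eta = eta0 + k_r*p_rep + k_a*p_att + k_rr*p_rep**2.

        eta0 is the dilute-gas viscosity; k_r, k_a, k_rr stand in for the tuned,
        temperature-dependent friction functions of the published models.
        """
        p_rep, p_att = srk_pressure_terms(T, v, a, b)
        return eta0 + k_r * p_rep + k_a * p_att + k_rr * p_rep ** 2

    # Purely illustrative numbers, not a fitted fluid characterization:
    print(f_theory_viscosity(T=350.0, v=1.2e-4, a=1.5, b=8.0e-5,
                             eta0=9.0e-6, k_r=1.0e-9, k_a=4.0e-10, k_rr=1.0e-18))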

  5. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  6. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study is focused on common theories that influenced the history of program evaluation and introduces an educational program evaluation proposal format based on the updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education regarding current evaluation models and theories. We included all study designs in our study. We found 810 articles related to our topic and finally included 63 full-text articles. We compared documents and used expert consensus for selecting the best model. Results: We found that complexity theory, using the logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. Common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  7. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

    Full Text Available Our paper is centered on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that are increasingly postulated by the classics of institutional theory and that find their way into the foundations of institutional economics. We propose scientific ideas concerning new, innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research to formalize our results and maximise the practical outcome of the paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institution-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  8. Integrating Norm Activation Model and Theory of Planned Behavior to Understand Sustainable Transport Behavior: Evidence from China.

    Science.gov (United States)

    Liu, Yuwei; Sheng, Hong; Mundorf, Norbert; Redding, Colleen; Ye, Yinjiao

    2017-12-18

    With increasing urbanization in China, many cities are facing serious environmental problems due to continuous and substantial increase in automobile transportation. It is becoming imperative to examine effective ways to reduce individual automobile use to facilitate sustainable transportation behavior. Empirical, theory-based research on sustainable transportation in China is limited. In this research, we propose an integrated model based on the norm activation model and the theory of planned behavior by combining normative and rational factors to predict individuals' intention to reduce car use. Data from a survey of 600 car drivers in China's three metropolitan areas was used to test the proposed model and hypotheses. Results showed that three variables, perceived norm of car-transport reduction, attitude towards reduction, and perceived behavior control over car-transport reduction, significantly affected the intention to reduce car-transport. Personal norms mediated the relationship between awareness of consequences of car-transport, ascription of responsibility of car-transport, perceived subjective norm for car-transport reduction, and intention to reduce car-transport. The results of this research not only contribute to theory development in the area of sustainable transportation behavior, but also provide a theoretical frame of reference for relevant policy-makers in urban transport management.

  9. Integrating Norm Activation Model and Theory of Planned Behavior to Understand Sustainable Transport Behavior: Evidence from China

    Directory of Open Access Journals (Sweden)

    Yuwei Liu

    2017-12-01

    Full Text Available With increasing urbanization in China, many cities are facing serious environmental problems due to continuous and substantial increase in automobile transportation. It is becoming imperative to examine effective ways to reduce individual automobile use to facilitate sustainable transportation behavior. Empirical, theory-based research on sustainable transportation in China is limited. In this research, we propose an integrated model based on the norm activation model and the theory of planned behavior by combining normative and rational factors to predict individuals’ intention to reduce car use. Data from a survey of 600 car drivers in China’s three metropolitan areas was used to test the proposed model and hypotheses. Results showed that three variables, perceived norm of car-transport reduction, attitude towards reduction, and perceived behavior control over car-transport reduction, significantly affected the intention to reduce car-transport. Personal norms mediated the relationship between awareness of consequences of car-transport, ascription of responsibility of car-transport, perceived subjective norm for car-transport reduction, and intention to reduce car-transport. The results of this research not only contribute to theory development in the area of sustainable transportation behavior, but also provide a theoretical frame of reference for relevant policy-makers in urban transport management.

  10. Application of the perturbation theory to a two-channel model for sensitivity calculations in PWR cores

    International Nuclear Information System (INIS)

    Oliveira, A.C.J.G. de; Andrade Lima, F.R. de

    1989-01-01

    The present work is an application of perturbation theory (matricial formalism) to a simplified two-channel model for sensitivity calculations in PWR cores. Expressions for some sensitivity coefficients of thermohydraulic interest were developed from the proposed model. The code CASNUR.FOR was written in FORTRAN to evaluate these sensitivity coefficients. The comparison of results obtained from the matricial formalism of perturbation theory with those obtained directly from the two-channel model makes evident the efficiency and potential of this perturbation method for sensitivity calculations in nuclear reactor cores. (author) [pt
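
    The record derives its sensitivity coefficients analytically through the matricial perturbation formalism; as a hedged point of reference, the sketch below estimates a relative sensitivity coefficient S = (dR/R)/(dp/p) by brute-force central differences on a toy two-channel energy balance. The model, parameter names and numbers are invented for illustration and are unrelated to CASNUR.FOR.

    def outlet_temperatures(p):
        """Toy steady-state energy balance for two parallel coolant channels."""
        return {
            "T_out1": p["T_in"] + p["q1"] / (p["w1"] * p["cp"]),
            "T_out2": p["T_in"] + p["q2"] / (p["w2"] * p["cp"]),
        }

    def relative_sensitivity(params, p_name, response, rel_step=1e-4):
        """Estimate S = (dR/R)/(dp/p) with a central finite difference."""
        up, down = dict(params), dict(params)
        up[p_name] *= 1.0 + rel_step
        down[p_name] *= 1.0 - rel_step
        r0 = outlet_temperatures(params)[response]
        r_up = outlet_temperatures(up)[response]
        r_dn = outlet_temperatures(down)[response]
        return ((r_up - r_dn) / r0) / (2.0 * rel_step)

    # Hypothetical operating point (temperatures in K, power in W, flow in kg/s).
    params = {"T_in": 560.0, "q1": 2.0e5, "q2": 1.6e5, "w1": 90.0, "w2": 85.0, "cp": 5.5e3}
    print(round(relative_sensitivity(params, "w1", "T_out1"), 5))
    # small negative coefficient: higher flow slightly lowers the outlet temperature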

  11. Warped models in string theory

    International Nuclear Information System (INIS)

    Acharya, B.S.; Benini, F.; Valandro, R.

    2006-12-01

    Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)

  12. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  13. On low rank classical groups in string theory, gauge theory and matrix models

    International Nuclear Information System (INIS)

    Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun

    2004-01-01

    We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature

  14. Testing a self-determination theory model of children's physical activity motivation: a cross-sectional study.

    Science.gov (United States)

    Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L

    2013-09-26

    Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical

  15. An evidence accumulation model for conflict detection performance in a simulated air traffic control task.

    Science.gov (United States)

    Neal, Andrew; Kwantes, Peter J

    2009-04-01

    The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
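
    A minimal Python sketch of the accumulation-to-threshold mechanism described above, with a decision bound that collapses as task urgency grows; the drift values, collapse rule, and parameter settings are illustrative assumptions, not the fitted model from this study.

```python
import numpy as np

def conflict_decision_trial(drift, threshold0=2.0, urgency=0.03, noise=1.0,
                            dt=0.05, max_t=120.0, rng=None):
    """One trial of an evidence-accumulation model with a decision bound
    that collapses over time as task urgency grows. Positive drift pushes
    toward a 'conflict' response, negative drift toward 'no conflict'."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        bound = max(threshold0 - urgency * t, 0.2)  # collapsing (urgency-driven) bound
        if x >= bound:
            return "conflict", t
        if x <= -bound:
            return "no conflict", t
    return "no response", max_t

# Hypothetical usage: a small distance at closest approach gives strong
# positive drift, a clear non-conflict gives negative drift.
print(conflict_decision_trial(drift=0.15))
print(conflict_decision_trial(drift=-0.15))
```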

  16. Barriers and facilitators to replicating an evidence-based palliative care model.

    Science.gov (United States)

    Davis, E Maxwell; Jamison, Paula; Brumley, Richard; Enguídanos, Susan

    2006-01-01

    Recognition of the difficulties involved in replicating evidence-based interventions is well documented in the literature within the medical field. Promising research findings are often not translated into practice, and if they are, there is a significant time gap between study conclusion and practice adoption. The purpose of this article is to describe the barriers and facilitators encountered by two managed care organizations while replicating an evidence-based end of life in-home palliative care model. Using Diffusion of Innovation Theory as a theoretical framework, results from focus groups and interviews with the project's clinical, administrative and research teams are presented and recommendations made for improving translational efforts. The process of replicating the end of life in-home palliative care model clearly illustrated the key elements required for successfully diffusing innovation. These key elements include marketing and communication, leadership, organizational support and training and mentorship. This qualitative process study provides clear, real world perspectives of the myriad of challenges encountered in replicating an evidence-based project.

  17. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  18. Designing theoretically-informed implementation interventions: Fine in theory, but evidence of effectiveness in practice is needed

    Directory of Open Access Journals (Sweden)

    Reeves Scott

    2006-02-01

    Full Text Available Abstract The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence is available showing the superiority of theory-based interventions, the use of theory should not be a basis for assessing the value of implementation studies by research funders, ethics committees, editors or policy decision makers.

  19. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  20. Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient

    Directory of Open Access Journals (Sweden)

    Fengjian Shi

    2017-10-01

    Full Text Available In order to meet higher accuracy and system reliability requirements, information fusion for multi-sensor systems is of increasing concern. Dempster–Shafer evidence theory (D–S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that different pieces of evidence are independent of each other, which is often unrealistic. Ignoring the relationship between pieces of evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D–S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model based on the rank correlation coefficient is proposed to deal with dependent evidence. The model first uses the rank correlation coefficient to measure the dependence degree between different pieces of evidence. Then, a total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of the evidence. Finally, the discount evidence fusion model is presented. An example illustrates the use and effectiveness of the proposed method.

  1. Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient.

    Science.gov (United States)

    Shi, Fengjian; Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua

    2017-10-16

    In order to meet higher accuracy and system reliability requirements, information fusion for multi-sensor systems is of increasing concern. Dempster-Shafer evidence theory (D-S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that different pieces of evidence are independent of each other, which is often unrealistic. Ignoring the relationship between pieces of evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D-S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model based on the rank correlation coefficient is proposed to deal with dependent evidence. The model first uses the rank correlation coefficient to measure the dependence degree between different pieces of evidence. Then, a total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of the evidence. Finally, the discount evidence fusion model is presented. An example illustrates the use and effectiveness of the proposed method.
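
    The following Python sketch illustrates the general idea: measure the dependence between two bodies of evidence with a rank correlation coefficient, convert it into a discount factor, and apply classical Shafer discounting before fusion. The mapping from correlation to discount factor shown here is a placeholder, not the paper's total discount coefficient (which also incorporates evidence reliability), and the mass values are hypothetical.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no tie handling, for brevity)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

def shafer_discount(mass, alpha, frame):
    """Classical Shafer discounting: keep a fraction alpha of each mass
    and move the remainder to total ignorance (the whole frame)."""
    out = {A: alpha * v for A, v in mass.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

# Hypothetical mass functions from two sensors over singletons {a}, {b}, {c}.
frame = frozenset({"a", "b", "c"})
focal = [frozenset({"a"}), frozenset({"b"}), frozenset({"c"})]
m1 = dict(zip(focal, [0.7, 0.2, 0.1]))
m2 = dict(zip(focal, [0.6, 0.3, 0.1]))

rho = spearman_rho(list(m1.values()), list(m2.values()))
# Placeholder mapping from dependence to a discount factor; the paper derives
# a total discount coefficient that also folds in the reliability of evidence.
alpha = 1.0 - 0.5 * max(rho, 0.0)
m2_discounted = shafer_discount(m2, alpha, frame)
print(rho, m2_discounted)
```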

  2. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  3. The pipe model theory half a century on: a review.

    Science.gov (United States)

    Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick

    2018-01-23

    More than a half century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the moment of its publication: (1) What determines tree form and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass in an individual plant, in a stand or at even larger scales be estimated? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount rather than the whole of it. The facts that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might cause the commonly

  4. Using health psychology to help patients: theories of behaviour change.

    Science.gov (United States)

    Barley, Elizabeth; Lawson, Victoria

    2016-09-08

    Behaviour change theories and related research evidence highlight the complexity of making and sticking to health-related behaviour changes. These theories make explicit factors that influence behaviour change, such as health beliefs, past behaviour, intention, social influences, perceived control and the context of the behaviour. Nurses can use this information to understand why a particular patient may find making recommended health behaviour changes difficult and to determine factors that may help them. This article outlines five well-established theories of behaviour change: the health belief model, the theory of planned behaviour, the stages of change model, self-determination theory, and temporal self-regulation theory. The evidence for interventions that are informed by these theories is then explored and appraised. The extent and quality of evidence varies depending on the type of behaviour and patients targeted, but evidence from randomised controlled trials indicates that interventions informed by theory can result in behaviour change.

  5. An integrated model of clinical reasoning: dual-process theory of cognition and metacognition.

    Science.gov (United States)

    Marcum, James A

    2012-10-01

    Clinical reasoning is an important component for providing quality medical care. The aim of the present paper is to develop a model of clinical reasoning that integrates both the non-analytic and analytic processes of cognition, along with metacognition. The dual-process theory of cognition (system 1 non-analytic and system 2 analytic processes) and the metacognition theory are used to develop an integrated model of clinical reasoning. In the proposed model, clinical reasoning begins with system 1 processes in which the clinician assesses a patient's presenting symptoms, as well as other clinical evidence, to arrive at a differential diagnosis. Additional clinical evidence, if necessary, is acquired and analysed utilizing system 2 processes to assess the differential diagnosis, until a clinical decision is made diagnosing the patient's illness and then how best to proceed therapeutically. Importantly, the outcome of these processes feeds back, in terms of metacognition's monitoring function, either to reinforce or to alter cognitive processes, which, in turn, enhances synergistically the clinician's ability to reason quickly and accurately in future consultations. The proposed integrated model has distinct advantages over other models proposed in the literature for explicating clinical reasoning. Moreover, it has important implications for addressing the paradoxical relationship between experience and expertise, as well as for designing a curriculum to teach clinical reasoning skills. © 2012 Blackwell Publishing Ltd.

  6. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integration.

  7. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  8. Testing a self-determination theory model of children’s physical activity motivation: a cross-sectional study

    Science.gov (United States)

    2013-01-01

    Background Understanding children’s physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children’s physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Methods Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. Results The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children’s minutes in moderate-to-vigorous physical activity. Conclusions The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children’s motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity.

  9. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  10. Neutron Star Models in Alternative Theories of Gravity

    Science.gov (United States)

    Manolidis, Dimitrios

    We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3 (central pressure exceeding one-third of the central density), both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.

  11. Matrix String Theory

    CERN Document Server

    Dijkgraaf, R; Verlinde, Herman L

    1997-01-01

    Via compactification on a circle, the matrix model of M-theory proposed by Banks et al suggests a concrete identification between the large N limit of two-dimensional N=8 supersymmetric Yang-Mills theory and type IIA string theory. In this paper we collect evidence that supports this identification. We explicitly identify the perturbative string states and their interactions, and describe the appearance of D-particle and D-membrane states.

  12. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    Science.gov (United States)

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most of the existing specific emitter identification methods have not taken the processing of uncertain information into account. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which has the ability to deal with uncertain information in the process of specific emitter identification. In this paper, each radar generates a group of evidence based on the information it obtains, and the main task is to fuse the multiple groups of evidence into a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between pieces of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and get a reasonable recognition result.
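
    The core combination step underlying such a fusion scheme is Dempster's rule. A minimal Python sketch is given below for two radars reporting mass functions over emitter types; the numbers are hypothetical, and the paper's correlation-based and quantum-mechanical weighting steps are not reproduced.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the same frame of discernment."""
    fused, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            inter = A & B
            if inter:
                fused[inter] = fused.get(inter, 0.0) + a * b
            else:
                conflict += a * b
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {A: v / (1.0 - conflict) for A, v in fused.items()}

# Hypothetical example: two radars report evidence about an emitter's type.
E1, E2 = frozenset({"E1"}), frozenset({"E2"})
theta = frozenset({"E1", "E2", "E3"})          # frame of discernment
radar_a = {E1: 0.6, E2: 0.1, theta: 0.3}
radar_b = {E1: 0.5, E2: 0.2, theta: 0.3}
print(dempster_combine(radar_a, radar_b))      # mass concentrates on {E1}
```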

  13. Aligning Theory and Design: The Development of an Online Learning Intervention to Teach Evidence-based Practice for Maximal Reach.

    Science.gov (United States)

    Delagran, Louise; Vihstadt, Corrie; Evans, Roni

    2015-09-01

    Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.

  14. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  15. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  16. Substandard model? At last, a good reason to opt for a sexier theory of particle physics

    CERN Multimedia

    Cho, A

    2001-01-01

    According to experimenters at Brookhaven, a tiny discrepancy in the magnetism of the muon may signal a crack in the Standard Model. The deviation could be the first piece of hard evidence for a more complete theory called supersymmetry (1 page).

  17. Vertical Integration of Hospitals and Physicians: Economic Theory and Empirical Evidence on Spending and Quality.

    Science.gov (United States)

    Post, Brady; Buchmueller, Tom; Ryan, Andrew M

    2017-08-01

    Hospital-physician vertical integration is on the rise. While increased efficiencies may be possible, emerging research raises concerns about anticompetitive behavior, spending increases, and uncertain effects on quality. In this review, we bring together several of the key theories of vertical integration that exist in the neoclassical and institutional economics literatures and apply these theories to the hospital-physician relationship. We also conduct a literature review of the effects of vertical integration on prices, spending, and quality in the growing body of evidence (n = 15) to evaluate which of these frameworks have the strongest empirical support. We find some support for vertical foreclosure as a framework for explaining the observed results. We suggest a conceptual model and identify directions for future research. Based on our analysis, we conclude that vertical integration poses a threat to the affordability of health services and merits special attention from policymakers and antitrust authorities.

  18. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
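
    For a flavour of how evidence theory bounds a prediction without committing to a single probability, the sketch below computes belief and plausibility of an event from focal intervals with masses. The intervals and masses are hypothetical and stand in for the epistemic spread induced by different modeling choices; this is an illustration, not the paper's combined evidence-theory/Bayesian-model-averaging procedure.

```python
def bel_pl(masses, event):
    """Belief and plausibility of an interval event, given focal intervals.
    masses: list of ((lo, hi), m) focal elements; event: (lo, hi)."""
    e_lo, e_hi = event
    bel = sum(m for (lo, hi), m in masses
              if e_lo <= lo and hi <= e_hi)          # focal set contained in event
    pl = sum(m for (lo, hi), m in masses
             if not (hi < e_lo or lo > e_hi))        # focal set intersects event
    return bel, pl

# Hypothetical evidence: three focal intervals for a predicted coefficient,
# reflecting the spread across different modeling choices.
evidence = [((0.40, 0.55), 0.5), ((0.45, 0.60), 0.3), ((0.35, 0.65), 0.2)]
print(bel_pl(evidence, (0.30, 0.60)))   # bounds on P(coefficient in [0.30, 0.60])
```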

  19. The exercise and affect relationship: evidence for the dual-mode model and a modified opponent process theory.

    Science.gov (United States)

    Markowitz, Sarah M; Arent, Shawn M

    2010-10-01

    This study examined the relationship between exertion level and affect using the framework of opponent-process theory and the dual-mode model, with the Activation-Deactivation Adjective Checklist and the State Anxiety Inventory among 14 active and 14 sedentary participants doing 20 min of treadmill exercise at speeds of 5% below, 5% above, and at lactate threshold (LT). We found a significant effect of time, condition, Time × Condition, and Time × Group, but no group, Group × Condition, or Time × Group × Condition effects, such that the 5% above LT condition produced a worsening of affect in-task compared with all other conditions whereas, across conditions, participants experienced in-task increases in energy and tension, and in-task decreases in tiredness and calmness relative to baseline. Posttask, participants experienced mood improvement (decreased tension, anxiety, and increased calmness) across conditions, with a 30-min delay in the above LT condition. These results partially support the dual-mode model and a modified opponent-process theory.

  20. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    Science.gov (United States)

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  1. From 6D superconformal field theories to dynamic gauged linear sigma models

    Science.gov (United States)

    Apruzzi, Fabio; Hassler, Falk; Heckman, Jonathan J.; Melnikov, Ilarion V.

    2017-09-01

    Compactifications of six-dimensional (6D) superconformal field theories (SCFTs) on four-manifolds generate a large class of novel two-dimensional (2D) quantum field theories. We consider in detail the case of the rank-one simple non-Higgsable cluster 6D SCFTs. On the tensor branch of these theories, the gauge group is simple and there are no matter fields. For compactifications on suitably chosen Kähler surfaces, we present evidence that this provides a method to realize 2D SCFTs with N = (0,2) supersymmetry. In particular, we find that reduction on the tensor branch of the 6D SCFT yields a description of the same 2D fixed point that is described in the UV by a gauged linear sigma model (GLSM) in which the parameters are promoted to dynamical fields, that is, a "dynamic GLSM" (DGLSM). Consistency of the model requires the DGLSM to be coupled to additional non-Lagrangian sectors obtained from reduction of the antichiral two-form of the 6D theory. These extra sectors include both chiral and antichiral currents, as well as spacetime filling noncritical strings of the 6D theory. For each candidate 2D SCFT, we also extract the left- and right-moving central charges in terms of data of the 6D SCFT and the compactification manifold.

  2. On the algebraic theory of kink sectors: Application to quantum field theory models and collision theory

    International Nuclear Information System (INIS)

    Schlingemann, D.

    1996-10-01

    Several two dimensional quantum field theory models have more than one vacuum state. An investigation of super selection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models, for which the existence of kink states has been proven, are the Sine-Gordon and the φ^4_2-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which are sufficient conditions a pair of vacuum states has to fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory which includes, for example, the P(φ)_2-models. We identify a large class of vacuum states, including the vacua of the P(φ)_2-models, the Yukawa_2-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)

  3. Pathogenesis of chronic pancreatitis: an evidence-based review of past theories and recent developments.

    Science.gov (United States)

    Stevens, Tyler; Conwell, Darwin L; Zuccaro, Gregory

    2004-11-01

    In the past several decades, four prominent theories of chronic pancreatitis pathogenesis have emerged: the toxic-metabolic theory, the oxidative stress hypothesis, the stone and duct obstruction theory, and the necrosis-fibrosis hypothesis. Although these traditional theories are formulated based on compelling scientific observations, substantial contradictory data also exist for each. Furthermore, the basic premises of some of these theories are directly contradictory. Because of the recent scientific progress in the underlying genetic, cellular, and molecular pathophysiology, there have been substantial advances in the understanding of chronic pancreatitis pathogenesis. This paper will provide an evidence-based review and critique of the traditional pathogenic theories, followed by a discussion of the new advances in pancreatic fibrogenesis. Moreover, we will discuss plausible pathogenic sequences applied to each of the known etiologies.

  4. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 2: Building a Culture of Inquiry

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-09-01

    Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action

  5. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp

    2011-01-01

    Full Text Available We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  6. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    Science.gov (United States)

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…

  7. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  8. Non-linear σ-models and string theories

    International Nuclear Information System (INIS)

    Sen, A.

    1986-10-01

    The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs

  9. Theory, evidence and Intervention Mapping to improve behavior nutrition and physical activity interventions

    Directory of Open Access Journals (Sweden)

    Ferreira Isabel

    2005-04-01

    Full Text Available Abstract Background The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. Discussion Since behavior theory is a reflection of the compiled evidence of behavior research, theory is the only foothold we have for the development of behavioral nutrition and physical activity interventions. Application of theory should improve the effectiveness of interventions. However, some of the theories we use lack a strong empirical foundation, and the available theories are not always used in the most effective way. Furthermore, many of the commonly-used theories provide at best information on what needs to be changed to promote healthy behavior, but not on how changes can be induced. Finally, many theories explain behavioral intentions or motivation rather well, but are less well-suited to explaining or predicting actual behavior or behavior change. For more effective interventions, behavior change theory needs to be further developed in stronger research designs and such change-theory should especially focus on how to promote action rather than mere motivation. Since voluntary behavior change requires motivation, ability as well as the opportunity to change, further development of behavior change theory should incorporate environmental change strategies. Conclusion Intervention Mapping may help to further improve the application of theories in nutrition and physical activity behavior change.

  10. Statistical field theory of futures commodity prices

    Science.gov (United States)

    Baaquie, Belal E.; Yu, Miao

    2018-02-01

    The statistical theory of commodity prices has been formulated by Baaquie (2013). Further empirical studies of single (Baaquie et al., 2015) and multiple commodity prices (Baaquie et al., 2016) have provided strong evidence in support of the primary assumptions of the statistical formulation. In this paper, the model for spot prices (Baaquie, 2013) is extended to model futures commodity prices using a statistical field theory of futures commodity prices. The futures prices are modeled as a two-dimensional statistical field and a nonlinear Lagrangian is postulated. Empirical studies provide clear evidence in support of the model, with many nontrivial features of the model finding unexpected support from market data.

  11. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd

    2001-01-01

    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  12. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a modal operator ... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  13. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  14. Beyond ROC Curvature: Strength Effects and Response Time Data Support Continuous-Evidence Models of Recognition Memory

    Science.gov (United States)

    Dube, Chad; Starns, Jeffrey J.; Rotello, Caren M.; Ratcliff, Roger

    2012-01-01

    A classic question in the recognition memory literature is whether retrieval is best described as a continuous-evidence process consistent with signal detection theory (SDT), or a threshold process consistent with many multinomial processing tree (MPT) models. Because receiver operating characteristics (ROCs) based on confidence ratings are…

  15. Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model

    International Nuclear Information System (INIS)

    Szabo, Richard J; Tierz, Miguel

    2010-01-01

    We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S^2. We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S^3, and hence to q-deformed Yang-Mills theory on S^2. In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.

  16. A diffusion decision model analysis of evidence variability in the lexical decision task.

    Science.gov (United States)

    Tillman, Gabriel; Osth, Adam F; van Ravenzwaaij, Don; Heathcote, Andrew

    2017-12-01

    The lexical-decision task is among the most commonly used paradigms in psycholinguistics. In both the signal-detection theory and Diffusion Decision Model (DDM; Ratcliff, Gomez, & McKoon, Psychological Review, 111, 159-182, 2004) frameworks, lexical-decisions are based on a continuous source of word-likeness evidence for both words and non-words. The Retrieving Effectively from Memory model of Lexical-Decision (REM-LD; Wagenmakers et al., Cognitive Psychology, 48(3), 332-367, 2004) provides a comprehensive explanation of lexical-decision data and makes the prediction that word-likeness evidence is more variable for words than non-words and that higher frequency words are more variable than lower frequency words. To test these predictions, we analyzed five lexical-decision data sets with the DDM. For all data sets, drift-rate variability changed across word frequency and non-word conditions. For the most part, REM-LD's predictions about the ordering of evidence variability across stimuli in the lexical-decision task were confirmed.
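
    A simple way to see the role of across-trial evidence variability in the DDM is to draw the drift rate anew on each simulated trial. The Python sketch below does this for a hypothetical word/non-word comparison; the parameter values are illustrative assumptions, not fitted values from the analyzed data sets.

```python
import numpy as np

def ddm_trials(mean_drift, drift_sd, n_trials=500, a=1.0, z=0.5,
               noise=1.0, dt=0.001, rng=None):
    """Diffusion decision model trials in which the drift rate is drawn anew
    on every trial from a normal distribution (across-trial evidence
    variability). Returns response times and choices."""
    rng = rng or np.random.default_rng()
    rts, choices = [], []
    for _ in range(n_trials):
        v = rng.normal(mean_drift, drift_sd)      # trial-specific drift rate
        x, t = z * a, 0.0
        while 0.0 < x < a:
            x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t)
        choices.append("word" if x >= a else "nonword")
    return np.array(rts), choices

# Hypothetical comparison: higher-frequency words with more variable evidence
# versus lower-frequency words with less variable evidence.
hf_rts, _ = ddm_trials(mean_drift=2.0, drift_sd=1.2)
lf_rts, _ = ddm_trials(mean_drift=1.0, drift_sd=0.8)
print(hf_rts.mean(), lf_rts.mean())
```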

  17. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  18. Remarks on “A new non-specificity measure in evidence theory based on belief intervals”

    Directory of Open Access Journals (Sweden)

    Joaquín ABELLÁN

    2018-03-01

    Full Text Available Two types of uncertainty co-exist in the theory of evidence: discord and non-specificity. Since the 1990s, many mathematical expressions have arisen to quantify these two parts in a body of evidence. An important aspect of each measure presented is the verification of a coherent set of properties. Regarding non-specificity, so far only one measure verifies an important set of those properties. Very recently, a new measure of non-specificity based on belief intervals has been presented as an alternative measure that quantifies a similar set of properties (Yang et al., 2016). It is shown that the new measure really does not verify two of those important properties. Some errors have been found in their corresponding proofs in the original publication. Keywords: Additivity, Imprecise probabilities, Non-specificity, Subadditivity, Theory of evidence, Uncertainty measures
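
    For reference, the classical non-specificity measure alluded to here is commonly identified with the generalized Hartley measure, N(m) = Σ_A m(A) log2 |A|, summed over focal sets A. A small Python sketch with a hypothetical mass function:

```python
import math

def non_specificity(mass):
    """Generalized Hartley measure of non-specificity:
    N(m) = sum over focal sets A of m(A) * log2(|A|)."""
    return sum(m * math.log2(len(A)) for A, m in mass.items() if m > 0)

# Hypothetical mass function on the frame {a, b, c}.
theta = frozenset({"a", "b", "c"})
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, theta: 0.2}
print(non_specificity(m))   # 0.5*0 + 0.3*1 + 0.2*log2(3), roughly 0.617
```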

  19. Latent factor modeling of four schizotypy dimensions with theory of mind and empathy.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Bedwell

    Full Text Available Preliminary evidence suggests that theory of mind and empathy relate differentially to factors of schizotypy. The current study assessed 686 undergraduate students and used structural equation modeling to examine links between a four-factor model of schizotypy and performance on measures of theory of mind (Reading the Mind in the Eyes Test [MIE]) and empathy (Interpersonal Reactivity Index [IRI]). Schizotypy was assessed using three self-report measures which were simultaneously entered into the model. Results revealed that the Negative factor of schizotypy showed a negative relationship with the Empathy factor, which was primarily driven by the Empathic Concern subscale of the IRI and the No Close Friends and Constricted Affect subscales of the Schizotypal Personality Questionnaire. These findings are consistent with a growing body of literature suggesting a relatively specific relationship between negative schizotypy and empathy, and are consistent with several previous studies that found no relationship between MIE performance and schizotypy.

  20. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  1. Poisson-Boltzmann theory of charged colloids: limits of the cell model for salty suspensions

    International Nuclear Information System (INIS)

    Denton, A R

    2010-01-01

    Thermodynamic properties of charge-stabilized colloidal suspensions and polyelectrolyte solutions are commonly modelled by implementing the mean-field Poisson-Boltzmann (PB) theory within a cell model. This approach models a bulk system by a single macroion, together with counterions and salt ions, confined to a symmetrically shaped, electroneutral cell. While easing numerical solution of the nonlinear PB equation, the cell model neglects microion-induced interactions and correlations between macroions, precluding modelling of macroion ordering phenomena. An alternative approach, which avoids the artificial constraints of cell geometry, exploits the mapping of a macroion-microion mixture onto a one-component model of pseudo-macroions governed by effective interparticle interactions. In practice, effective-interaction models are usually based on linear-screening approximations, which can accurately describe strong nonlinear screening only by incorporating an effective (renormalized) macroion charge. Combining charge renormalization and linearized PB theories, in both the cell model and an effective-interaction (cell-free) model, we compute osmotic pressures of highly charged colloids and monovalent microions, in Donnan equilibrium with a salt reservoir, over a range of concentrations. By comparing predictions with primitive model simulation data for salt-free suspensions, and with predictions from nonlinear PB theory for salty suspensions, we chart the limits of both the cell model and linear-screening approximations in modelling bulk thermodynamic properties. Up to moderately strong electrostatic couplings, the cell model proves accurate for predicting osmotic pressures of deionized (counterion-dominated) suspensions. With increasing salt concentration, however, the relative contribution of macroion interactions to the osmotic pressure grows, leading predictions from the cell and effective-interaction models to deviate. No evidence is found for a liquid
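
    Within the linearized-screening (effective-interaction) picture mentioned above, macroion pairs are often described by a DLVO-type Yukawa potential. The Python sketch below evaluates that standard form for hypothetical parameter values; it is illustrative only and is not the specific charge-renormalization scheme compared in the paper.

```python
import numpy as np

def yukawa_pair_potential(r, Z_eff, a, lambda_B, microions):
    """DLVO-type effective pair potential (in units of kT) between two charged
    colloids of radius a and renormalized charge Z_eff, within linearized
    Poisson-Boltzmann screening theory (valid for r > 2a).
    microions: list of (number density, valence) for all microion species;
    lengths and densities must use consistent units (here: nm, nm^-3)."""
    kappa = np.sqrt(4.0 * np.pi * lambda_B *
                    sum(n * z**2 for n, z in microions))   # inverse screening length
    prefactor = Z_eff**2 * lambda_B * (np.exp(kappa * a) / (1.0 + kappa * a))**2
    return prefactor * np.exp(-kappa * r) / r

# Hypothetical numbers: Z_eff = 500, a = 50 nm, Bjerrum length 0.72 nm (water),
# ~1 mM 1:1 salt (about 6.0e-4 ions per nm^3 for each species).
print(yukawa_pair_potential(r=150.0, Z_eff=500.0, a=50.0, lambda_B=0.72,
                            microions=[(6.0e-4, +1), (6.0e-4, -1)]))
```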

  2. Weighted score-level feature fusion based on Dempster-Shafer evidence theory for action recognition

    Science.gov (United States)

    Zhang, Guoliang; Jia, Songmin; Li, Xiuzhi; Zhang, Xiangyin

    2018-01-01

    The majority of human action recognition methods use a multifeature fusion strategy to improve the classification performance, where the contribution of different features to a specific action has not received enough attention. We present an extendible and universal weighted score-level feature fusion method using the Dempster-Shafer (DS) evidence theory based on the pipeline of bag-of-visual-words. First, the partially distinctive samples in the training set are selected to construct the validation set. Then, local spatiotemporal features and pose features are extracted from these samples to obtain evidence information. The DS evidence theory and the proposed rule of survival of the fittest are employed to achieve evidence combination and calculate optimal weight vectors of every feature type belonging to each action class. Finally, the recognition results are deduced via the weighted summation strategy. The performance of the established recognition framework is evaluated on the Penn Action dataset and a subset of the joint-annotated human motion database (sub-JHMDB). The experiment results demonstrate that the proposed feature fusion method can adequately exploit the complementarity among multiple features and improve upon most of the state-of-the-art algorithms on Penn Action and sub-JHMDB datasets.
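
    The final weighted-summation step described above reduces to multiplying each feature type's per-class scores by its learned per-class weights and summing. A minimal Python sketch with hypothetical scores and weights follows; the DS-based procedure that learns the weights on the validation set is not reproduced here.

```python
import numpy as np

def weighted_score_fusion(scores, weights):
    """Score-level fusion: for each action class, sum per-feature-type scores
    weighted by that feature type's per-class weight.
    scores:  dict feature_name -> per-class score array
    weights: dict feature_name -> per-class weight array (same class order)."""
    fused = None
    for name, s in scores.items():
        contrib = np.asarray(s, dtype=float) * np.asarray(weights[name], dtype=float)
        fused = contrib if fused is None else fused + contrib
    return fused

# Hypothetical per-class scores from two feature types for three action classes.
scores = {"spatiotemporal": [0.5, 0.3, 0.2], "pose": [0.2, 0.6, 0.2]}
weights = {"spatiotemporal": [0.7, 0.4, 0.5], "pose": [0.3, 0.6, 0.5]}
fused = weighted_score_fusion(scores, weights)
print(fused, fused.argmax())   # argmax gives the predicted class index
```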

  3. Narrative theories as computational models: reader-oriented theory and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, P.

    1983-12-01

    In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory shares many common interests and problems, and both fields might benefit from an exchange of ideas. 11 references.

  4. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  5. Optimization models using fuzzy sets and possibility theory

    CERN Document Server

    Orlovski, S

    1987-01-01

    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...

  6. Application of Dempster-Shafer theory of evidence model to geoelectric and hydraulic parameters for groundwater potential zonation

    Directory of Open Access Journals (Sweden)

    Kehinde Anthony Mogaji

    2018-06-01

    Full Text Available This study presents the application of a GIS-based Dempster-Shafer data-driven model, the evidential belief function (EBF) methodology, to groundwater potential conditioning factors (GPCFs) derived from geophysical and hydrogeological data sets for assessing groundwater potentiality. The method's efficacy in managing the degree of uncertainty in spatial predictive models motivated this research. The procedure entails, first, constructing a database containing groundwater data records (bore well location inventory, hydrogeological data records, etc.) and geophysical measurement data. From the database, different factors influencing groundwater occurrence, namely aquifer layer thickness, aquifer layer resistivity, overburden material resistivity, overburden material thickness, aquifer hydraulic conductivity and aquifer transmissivity, were extracted and prepared. The bore well location inventories were then partitioned randomly in a ratio of 70% (19 wells) for model training and 30% (9 wells) for model testing. Synthesizing the GPCFs with the DS-EBF model algorithms produced the groundwater productivity potential index (GPPI) map, which demarcated the area into low-medium, medium, medium-high and high potential zones. The analyzed percentage degree of uncertainty is >10% for the predicted low potential zone classes and <10% for the medium and high potential zone classes. Validation of the DS theory model-based GPPI map through the ROC approach established a prediction rate accuracy of 88.8%. Subsequently, the transverse resistance (TR) values in the range of 1280 to 30,000 Ω m2, determined through Dar-Zarrouk parameter analysis for the geoelectrically delineated aquifer units of the predicted potential zones, quantitatively confirm the DS theory modelling prediction results. These results expand the capability of the DS-EBF model in predictive

  7. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
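
    A toy illustration of the sum-product message passing that the junction tree algorithm generalizes to arbitrary tree-structured graphical models (the chain of binary sensor states, the transition table and the observation likelihood below are all invented):

        import numpy as np

        # Chain X1 - X2 - X3 of binary sensor states with a noisy reading y2 observed at X2.
        prior_x1 = np.array([0.7, 0.3])                    # p(x1)
        trans = np.array([[0.9, 0.1], [0.2, 0.8]])         # p(x_{i+1} | x_i)
        like_x2 = np.array([0.8, 0.3])                     # p(y2 | x2)

        # Forward messages: mu(x_{i+1}) = sum_{x_i} mu(x_i) * evidence(x_i) * p(x_{i+1} | x_i)
        msg12 = prior_x1 @ trans                           # message from X1 into X2
        msg23 = (msg12 * like_x2) @ trans                  # fold in the observation at X2

        posterior_x3 = msg23 / msg23.sum()                 # p(x3 | y2), exact on a tree
        print(posterior_x3)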

  8. Constructing a New Theory from Old Ideas and New Evidence

    Science.gov (United States)

    Rhodes, Marjorie; Wellman, Henry

    2013-01-01

    A central tenet of constructivist models of conceptual development is that children's initial conceptual level constrains how they make sense of new evidence and thus whether exposure to evidence will prompt conceptual change. Yet little experimental evidence directly examines this claim for the case of sustained, fundamental conceptual…

  9. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website.

    Science.gov (United States)

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L

    2016-06-10

    According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes

  10. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  11. From Landau's hydrodynamical model to field theory models of multiparticle production: a tribute to Peter

    International Nuclear Information System (INIS)

    Cooper, F.

    1996-01-01

    We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations

  12. Purchasing power parity theory in three East Asian economies: New evidence

    OpenAIRE

    Ahmad, Mahyudin; Marwan, Nur Fakhzan

    2012-01-01

    To an otherwise extensive literature with mixed findings on the long-run Purchasing Power Parity (PPP) theory, this paper extends the evidence against the PPP hypothesis in three East Asian economies, namely Indonesia, Malaysia, and Thailand, based on quarterly data spanning forty years (1968:Q1-2008:Q1). The testing of the PPP hypothesis in this study employs two methods, namely the Engle-Granger procedure and the Johansen multivariate cointegration method.
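
    A minimal sketch of the residual-based Engle-Granger cointegration test mentioned above, using the implementation in statsmodels (the two series are simulated placeholders rather than the actual exchange-rate and relative-price data):

        import numpy as np
        from statsmodels.tsa.stattools import coint

        rng = np.random.default_rng(0)
        n = 160                                                   # roughly forty years of quarterly data
        rel_prices = np.cumsum(rng.normal(size=n))                # simulated log relative price level
        exch_rate = rel_prices + rng.normal(scale=0.5, size=n)    # cointegrated with prices by construction

        t_stat, p_value, _ = coint(exch_rate, rel_prices)
        print(f"Engle-Granger t-statistic = {t_stat:.2f}, p-value = {p_value:.3f}")
        # Long-run PPP is supported if the null of no cointegration is rejected.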

  13. Studying emotion theories through connectivity analysis: Evidence from generalized psychophysiological interactions and graph theory.

    Science.gov (United States)

    Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu

    2018-05-15

    Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  15. M-theory model-building and proton stability

    International Nuclear Information System (INIS)

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory

  16. Extended Nambu models: Their relation to gauge theories

    Science.gov (United States)

    Escobar, C. A.; Urrutia, L. F.

    2017-05-01

    Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.

  17. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  18. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions

  19. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  20. An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    International Nuclear Information System (INIS)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-01-01

    We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  1. Matrix models as non-commutative field theories on R3

    International Nuclear Information System (INIS)

    Livine, Etera R

    2009-01-01

    In the context of spin foam models for quantum gravity, group field theories are a useful tool allowing on the one hand a non-perturbative formulation of the partition function and on the other hand admitting an interpretation as generalized matrix models. Focusing on 2d group field theories, we review their explicit relation to matrix models and show their link to a class of non-commutative field theories invariant under a quantum-deformed 3d Poincare symmetry. This provides a simple relation between matrix models and non-commutative geometry. Moreover, we review the derivation of effective 2d group field theories with non-trivial propagators from Boulatov's group field theory for 3d quantum gravity. Besides the fact that this gives a simple and direct derivation of non-commutative field theories for the matter dynamics coupled to (3d) quantum gravity, these effective field theories can be expressed as multi-matrix models with a non-trivial coupling between matrices of different sizes. It should be interesting to analyze this new class of theories, both from the point of view of matrix models as integrable systems and for the study of non-commutative field theories.

  2. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  3. Brief Report: An Independent Replication and Extension of Psychometric Evidence Supporting the Theory of Mind Inventory

    Science.gov (United States)

    Greenslade, Kathryn J.; Coggins, Truman E.

    2016-01-01

    This study presents an independent replication and extension of psychometric evidence supporting the "Theory of Mind Inventory" ("ToMI"). Parents of 20 children with ASD (4;1-6;7 years;months) and 20 with typical development (3;1-6;5) rated their child's theory of mind abilities in everyday situations. Other parent report…

  4. Evidence for the epistemic view of quantum states: A toy theory

    International Nuclear Information System (INIS)

    Spekkens, Robert W.

    2007-01-01

    We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. Many quantum phenomena are found to have analogues within this toy theory. These include the noncommutativity of measurements, interference, the multiplicity of convex decompositions of a mixed state, the impossibility of discriminating nonorthogonal states, the impossibility of a universal state inverter, the distinction between bipartite and tripartite entanglement, the monogamy of pure entanglement, no cloning, no broadcasting, remote steering, teleportation, entanglement swapping, dense coding, mutually unbiased bases, and many others. The diversity and quality of these analogies is taken as evidence for the view that quantum states are states of incomplete knowledge rather than states of reality. A consideration of the phenomena that the toy theory fails to reproduce, notably, violations of Bell inequalities and the existence of a Kochen-Specker theorem, provides clues for how to proceed with this research program

  5. Combining morphometric evidence from multiple registration methods using Dempster-Shafer theory

    Science.gov (United States)

    Rajagopalan, Vidya; Wyatt, Christopher

    2010-03-01

    In tensor-based morphometry (TBM), group-wise differences in brain structure are measured using high degree-of-freedom registration and some form of statistical test. However, it is known that TBM results are sensitive to both the registration method and the statistical test used. Given the lack of an objective model of group variation, it is difficult to determine a best registration method for TBM. The use of statistical tests is also problematic given the corrections required for multiple testing and the notorious difficulty of selecting and interpreting significance values. This paper presents an approach to address both of these issues by combining multiple registration methods using Dempster-Shafer evidence theory to produce belief maps of categorical changes between groups. This approach is applied to the comparison of brain morphometry in aging, a typical application of TBM, using the determinant of the Jacobian as a measure of volume change. We show that the Dempster-Shafer combination produces a unique and easy-to-interpret belief map of regional changes between and within groups without the complications associated with hypothesis testing.

  6. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that, by embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.
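
    A minimal sketch of the selection strategy described above, under assumptions of my own (simulated data, OLS, a 1% significance threshold): the theory variables x are always retained, and selection by statistical significance is applied only to the additional candidate regressors w.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n, k = 500, 8
        x = rng.normal(size=(n, 2))                      # theory variables, never subject to selection
        w = rng.normal(size=(n, k))                      # candidate variables, possibly irrelevant
        y = 1.0 + x @ np.array([0.5, -0.3]) + rng.normal(size=n)

        def fit(cols):
            regressors = np.column_stack([x, w[:, cols]]) if cols else x
            return sm.OLS(y, sm.add_constant(regressors)).fit()

        full = fit(list(range(k)))
        keep = [j for j in range(k) if full.pvalues[3 + j] < 0.01]   # indices 0-2 are the constant and the two x's
        final = fit(keep)
        print(final.params[:3])   # theory-parameter estimates, unaffected in distribution by the selection step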

  7. Theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena

    Science.gov (United States)

    Jin, Yongmei M.; Wang, Yu U.; Ren, Yang

    2015-12-01

    Pre-martensitic phenomena, also called martensite precursor effects, have been known for decades yet remain outstanding issues. This paper addresses pre-martensitic phenomena from new theoretical and experimental perspectives. A statistical mechanics-based Grüneisen-type phonon theory is developed. On the basis of deformation-dependent incompletely softened low-energy phonons, the theory predicts a lattice instability and pre-martensitic transition into elastic-phonon domains via 'phonon spinodal decomposition.' The phase transition lifts phonon degeneracy in the cubic crystal and has the nature of a phonon pseudo-Jahn-Teller lattice instability. The theory and notion of phonon domains consistently explain the ubiquitous pre-martensitic anomalies as natural consequences of incomplete phonon softening. The phonon domains are characterised by broken dynamic symmetry of lattice vibrations and deform through internal phonon relaxation in response to stress (a particular case of Le Chatelier's principle), leading to a previously unexplored domain phenomenon. Experimental evidence of phonon domains is obtained by in situ three-dimensional phonon diffuse scattering and Bragg reflection using high-energy synchrotron X-ray single-crystal diffraction, which reveals an exotic domain phenomenon fundamentally different from the usual ferroelastic domain switching phenomenon. In light of the theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena, currently existing alternative opinions on martensitic precursor phenomena are revisited.

  8. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit

  9. Polyacetylene and relativistic field-theory models

    International Nuclear Information System (INIS)

    Bishop, A.R.; Campbell, D.K.; Fesser, K.

    1981-01-01

    Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the phi^4 model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed.

  10. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
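
    A minimal sketch, under assumptions of my own, of training a first-order discrete Markov chain on an action sequence and scoring the interaction against it; a falling cross-entropy (prediction error) is one way to operationalize increasing routinization:

        import numpy as np

        def train_markov(actions, n_states):
            counts = np.ones((n_states, n_states))           # Laplace smoothing
            for a, b in zip(actions[:-1], actions[1:]):
                counts[a, b] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        def cross_entropy(actions, P):
            logs = [np.log2(P[a, b]) for a, b in zip(actions[:-1], actions[1:])]
            return -float(np.mean(logs))                     # bits per transition; lower = more routinized

        # Invented action indices (e.g. 0..3 could stand for 'move', 'aim', 'shoot', 'reload').
        early = [0, 1, 2, 3, 0, 2, 1, 3, 0, 1, 3, 2]         # varied, exploratory play
        late = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]          # repetitive, routinized play
        for name, window in [("early", early), ("late", late)]:
            model = train_markov(window, 4)
            print(name, round(cross_entropy(window, model), 2))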

  11. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping

    OpenAIRE

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2013-01-01

    Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Interv...

  12. Contemporary Cognitive Behavior Therapy: A Review of Theory, History, and Evidence.

    Science.gov (United States)

    Thoma, Nathan; Pilecki, Brian; McKay, Dean

    2015-09-01

    Cognitive behavior therapy (CBT) has come to be a widely practiced psychotherapy throughout the world. The present article reviews theory, history, and evidence for CBT. It is meant as an effort to summarize the forms and scope of CBT to date for the uninitiated. Elements of CBT such as cognitive therapy, behavior therapy, and so-called "third wave" CBT, such as dialectical behavior therapy (DBT) and acceptance and commitment therapy (ACT) are covered. The evidence for the efficacy of CBT for various disorders is reviewed, including depression, anxiety disorders, personality disorders, eating disorders, substance abuse, schizophrenia, chronic pain, insomnia, and child/adolescent disorders. The relative efficacy of medication and CBT, or their combination, is also briefly considered. Future directions for research and treatment development are proposed.

  13. Application of perturbation theory to sensitivity calculations of PWR type reactor cores using the two-channel model

    International Nuclear Information System (INIS)

    Oliveira, A.C.J.G. de.

    1988-12-01

    Sensitivity calculations are very important in the design and safety analysis of nuclear reactor cores. Large codes with a great number of physical considerations have been used to perform sensitivity studies. However, these codes require long computation times and involve high costs. Perturbation theory constitutes an efficient and economical method for performing sensitivity analysis. The present work is an application of perturbation theory (matricial formalism) to a simplified model of DNB (Departure from Nucleate Boiling) analysis to perform sensitivity calculations in PWR cores. Expressions to calculate the sensitivity coefficients of enthalpy and coolant velocity with respect to coolant density and hot channel area were developed from the proposed model. The CASNUR.FOR code to evaluate these sensitivity coefficients was written in Fortran. The comparison of results obtained from the matricial formalism of perturbation theory with those obtained directly from the proposed model makes evident the efficiency and potentiality of this perturbation method for sensitivity calculations in nuclear reactor cores (author). 23 refs, 4 figs, 7 tabs
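
    For orientation, the relative sensitivity coefficients evaluated in this kind of analysis have the generic form below (the response and parameter symbols are placeholders of mine, not taken from the report):

        % relative sensitivity of a response R (e.g. enthalpy or coolant velocity)
        % to an input parameter p (e.g. coolant density or hot-channel area)
        S_{p}^{R} \;=\; \frac{p}{R}\,\frac{\partial R}{\partial p} \;\approx\; \frac{\Delta R / R}{\Delta p / p} .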

  14. Integrating the context-appropriate balanced attention model and reinforcement sensitivity theory: Towards a domain-general personality process model.

    Science.gov (United States)

    Collins, Michael D; Jackson, Chris J; Walker, Benjamin R; O'Connor, Peter J; Gardiner, Elliroma

    2017-01-01

    Over the last 40 years or more the personality literature has been dominated by trait models based on the Big Five (B5). Trait-based models describe personality at the between-person level but cannot explain the within-person mental mechanisms responsible for personality. Nor can they adequately account for variations in emotion and behavior experienced by individuals across different situations and over time. An alternative, yet understated, approach to personality architecture can be found in neurobiological theories of personality, most notably reinforcement sensitivity theory (RST). In contrast to static trait-based personality models like the B5, RST provides a more plausible basis for a personality process model, namely, one that explains how emotions and behavior arise from the dynamic interaction between contextual factors and within-person mental mechanisms. In this article, the authors review the evolution of a neurobiologically based personality process model based on RST, the response modulation model and the context-appropriate balanced attention model. They argue that by integrating this complex literature, and by incorporating evidence from personality neuroscience, one can meaningfully explain personality at both the within- and between-person levels. This approach achieves a domain-general architecture based on RST and self-regulation that can be used to align within-person mental mechanisms, neurobiological systems and between-person measurement models. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory

    Science.gov (United States)

    Pei, Di; Yue, Jianhai; Jiao, Jing

    2017-10-01

    This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to obtain bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data to improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.

  16. Relative Thinking Theory

    OpenAIRE

    Ofer H. Azar

    2005-01-01

    The article presents a theory that I denote "Relative Thinking Theory," which claims that people consider relative differences and not only absolute differences when making various economic decisions, even in those cases where the rational model dictates that people should consider only absolute differences. The article reviews experimental evidence for this behavior, summarizing briefly several experiments I conducted, as well as some earlier related literature. It then discusses how we can...

  17. Evidence-based selection of theories for designing behaviour change interventions: using methods based on theoretical construct domains to understand clinicians' blood transfusion behaviour.

    Science.gov (United States)

    Francis, Jill J; Stockton, Charlotte; Eccles, Martin P; Johnston, Marie; Cuthbertson, Brian H; Grimshaw, Jeremy M; Hyde, Chris; Tinmouth, Alan; Stanworth, Simon J

    2009-11-01

    Many theories of behaviour are potentially relevant to predictive and intervention studies but most studies investigate a narrow range of theories. Michie et al. (2005) agreed on 12 'theoretical domains' from 33 theories that explain behaviour change. They developed a 'Theoretical Domains Interview' (TDI) for identifying relevant domains for specific clinical behaviours, but the framework has not been used for selecting theories for predictive studies. It was used here to investigate clinicians' transfusion behaviour in intensive care units (ICUs). Evidence suggests that red blood cell transfusion could be reduced for some patients without reducing quality of care. (1) To identify the domains relevant to transfusion practice in ICUs and neonatal intensive care units (NICUs), using the TDI. (2) To use the identified domains to select appropriate theories for a study predicting transfusion behaviour. An adapted TDI about managing a patient with borderline haemoglobin by watching and waiting instead of transfusing red blood cells was used to conduct semi-structured, one-to-one interviews with 18 intensive care consultants and neonatologists across the UK. Relevant theoretical domains were: knowledge, beliefs about capabilities, beliefs about consequences, social influences, and behavioural regulation. Further analysis at the construct level resulted in the selection of seven theoretical approaches relevant to this context: the Knowledge-Attitude-Behaviour Model, Theory of Planned Behaviour, Social Cognitive Theory, Operant Learning Theory, Control Theory, Normative Model of Work Team Effectiveness and Action Planning Approaches. This study illustrated the use of the TDI to identify relevant domains in a complex area of inpatient care. This approach is potentially valuable for selecting theories relevant to predictive studies and resulted in greater breadth of potential explanations than would be achieved if a single theoretical model had been adopted.

  18. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
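
    One classical result of this theory, quoted here from memory as an illustration rather than from the book itself, is the waiting-time decomposition for an M/G/1 queue with multiple vacations and exhaustive service: the mean wait equals the ordinary Pollaczek-Khinchine term plus the mean residual vacation time,

        E[W] \;=\; \frac{\lambda\,E[S^{2}]}{2\,(1-\rho)} \;+\; \frac{E[V^{2}]}{2\,E[V]} , \qquad \rho = \lambda\,E[S] < 1 ,

    where S is the service time and V the vacation length.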

  19. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its stru… Furthermore, it is demonstrated how other controversial hypotheses, such as Rational Expectations, can be formulated directly as restrictions on the CVAR parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well. Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.

  20. An Evolutionary Game Theory Model of Spontaneous Brain Functioning.

    Science.gov (United States)

    Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano

    2017-11-22

    Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profiles and constitute a potential marker in neurological and psychiatric conditions, making their understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture the brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in the BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of an initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
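
    A schematic numerical sketch of replicator-type dynamics on a network, in the spirit of the EGN framework described above (the connectivity matrix, the 2x2 payoff matrix and all parameters are invented; this is not the authors' calibrated model):

        import numpy as np

        rng = np.random.default_rng(2)
        n_nodes, dt, steps = 6, 0.01, 2000
        A = (rng.random((n_nodes, n_nodes)) < 0.5).astype(float)   # random region-to-region connectivity
        np.fill_diagonal(A, 0.0)
        B = np.array([[1.0, 0.2], [0.3, 1.0]])                     # toy 2x2 game payoff matrix
        x = rng.dirichlet(np.ones(2), size=n_nodes)                # mixed strategy (activation state) per node

        for _ in range(steps):
            payoff = A @ (x @ B.T)                     # payoff[i, s] = sum_j A_ij (B x_j)_s
            avg = (x * payoff).sum(axis=1, keepdims=True)
            x += dt * x * (payoff - avg)               # replicator equation evaluated node by node
            x = np.clip(x, 1e-12, None)
            x /= x.sum(axis=1, keepdims=True)          # keep each node's state on the simplex

        print(x.round(3))                              # asymptotic activation pattern across nodes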

  1. Toda theories, W-algebras, and minimal models

    International Nuclear Information System (INIS)

    Mansfield, P.; Spence, B.

    1991-01-01

    We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)

  2. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    Science.gov (United States)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes, which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and have to ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored

  3. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations.

  4. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    Science.gov (United States)

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  5. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models that have a larger symmetry than the conformal symmetry. After a review of conformal theory methods, the author makes a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A program of rational theory classification is described, linking rational conformal theories and spin integrable models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights

  6. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  7. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Full Text Available Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  8. Unresolved issues in theories of autoimmune disease using myocarditis as a framework.

    Science.gov (United States)

    Root-Bernstein, Robert; Fairweather, DeLisa

    2015-06-21

    Many theories of autoimmune disease have been proposed since the discovery that the immune system can attack the body. These theories include the hidden or cryptic antigen theory, modified antigen theory, T cell bypass, T cell-B cell mismatch, epitope spread or drift, the bystander effect, molecular mimicry, anti-idiotype theory, antigenic complementarity, and dual-affinity T cell receptors. We critically review these theories and relevant mathematical models as they apply to autoimmune myocarditis. All theories share the common assumption that autoimmune diseases are triggered by environmental factors such as infections or chemical exposure. Most, but not all, theories and mathematical models are unifactorial assuming single-agent causation of disease. Experimental and clinical evidence and mathematical models exist to support some aspects of most theories, but evidence/models that support one theory almost invariably supports other theories as well. More importantly, every theory (and every model) lacks the ability to account for some key autoimmune disease phenomena such as the fundamental roles of innate immunity, sex differences in disease susceptibility, the necessity for adjuvants in experimental animal models, and the often paradoxical effect of exposure timing and dose on disease induction. We argue that a more comprehensive and integrated theory of autoimmunity associated with new mathematical models is needed and suggest specific experimental and clinical tests for each major theory that might help to clarify how they relate to clinical disease and reveal how theories are related. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Unresolved issues in theories of autoimmune disease using myocarditis as a framework

    Science.gov (United States)

    Root-Bernstein, Robert; Fairweather, DeLisa

    2014-01-01

    Many theories of autoimmune disease have been proposed since the discovery that the immune system can attack the body. These theories include the hidden or cryptic antigen theory, modified antigen theory, T cell bypass, T cell-B cell mismatch, epitope spread or drift, the bystander effect, molecular mimicry, anti-idiotype theory, antigenic complementarity, and dual-affinity T cell receptors. We critically review these theories and relevant mathematical models as they apply to autoimmune myocarditis. All theories share the common assumption that autoimmune diseases are triggered by environmental factors such as infections or chemical exposure. Most, but not all, theories and mathematical models are unifactorial assuming single-agent causation of disease. Experimental and clinical evidence and mathematical models exist to support some aspects of most theories, but evidence/models that support one theory almost invariably supports other theories as well. More importantly, every theory (and every model) lacks the ability to account for some key autoimmune disease phenomena such as the fundamental roles of innate immunity, sex differences in disease susceptibility, the necessity for adjuvants in experimental animal models, and the often paradoxical effect of exposure timing and dose on disease induction. We argue that a more comprehensive and integrated theory of autoimmunity associated with new mathematical models is needed and suggest specific experimental and clinical tests for each major theory that might help to clarify how they relate to clinical disease and reveal how theories are related. PMID:25484004

  10. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  11. Using Evidence Credibility Decay Model for dependence assessment in human reliability analysis

    International Nuclear Information System (INIS)

    Guo, Xingfeng; Zhou, Yanhui; Qian, Jin; Deng, Yong

    2017-01-01

    Highlights: • A new computational model is proposed for dependence assessment in HRA. • The factors “CT”, “TR” and “SP” are combined within Dempster–Shafer theory. • The BBA of “SP” is discounted over time based on the ECDM. • Simulation experiments illustrate the efficiency of the proposed method. - Abstract: Dependence assessment among human errors plays an important role in human reliability analysis. When dependence exists between two sequential tasks, failure of the preceding task raises the failure probability of the following task. Typically, three major factors are considered: “Closeness in Time” (CT), “Task Relatedness” (TR) and “Similarity of Performers” (SP). Assuming TR is unchanged, both SP and CT influence the degree of dependence; in this paper, SP is discounted over time as a way of combining the two factors. A new computational model is proposed based on the Dempster–Shafer Evidence Theory (DSET) and the Evidence Credibility Decay Model (ECDM) to assess the dependence between tasks in human reliability analysis. First, the influencing factors among human tasks are identified and the basic belief assignments (BBAs) of each factor are constructed based on expert evaluation. Then, the BBA of SP is discounted using the ECDM, and the factors are integrated into a fused BBA. Finally, the dependence level is calculated from the fused BBA. Experimental results demonstrate that the proposed model not only quantitatively describes how the input factors influence the dependence level, but also shows how the dependence level changes systematically with different combinations of the input factors.
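    As a rough illustration of the ingredients the abstract names (BBAs, discounting, and fusion), the sketch below discounts a BBA by a credibility factor and combines two BBAs with Dempster's rule. The frame of discernment, the numeric masses, and the decay factor are made-up examples, not the paper's ECDM parameters.

```python
# Hedged sketch of Dempster-Shafer basic belief assignment (BBA) discounting and combination.
# Frame, masses, and the decay factor alpha are illustrative, not taken from the paper.
from itertools import product

FRAME = frozenset({"low", "medium", "high"})   # hypothetical dependence-level frame

def discount(bba, alpha):
    """Shafer discounting: scale masses by alpha, move the remainder to the whole frame."""
    out = {A: alpha * m for A, m in bba.items() if A != FRAME}
    out[FRAME] = (1.0 - alpha) + alpha * bba.get(FRAME, 0.0)
    return out

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination with normalization of the conflict mass."""
    fused, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            fused[inter] = fused.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in fused.items()}

# Expert-style BBAs for "Similarity of Performers" and "Closeness in Time" (made up)
m_SP = {frozenset({"high"}): 0.6, frozenset({"high", "medium"}): 0.3, FRAME: 0.1}
m_CT = {frozenset({"medium"}): 0.5, frozenset({"medium", "high"}): 0.3, FRAME: 0.2}

m_SP_decayed = discount(m_SP, alpha=0.7)       # ECDM-style credibility decay, alpha assumed
print(dempster_combine(m_SP_decayed, m_CT))    # fused BBA over dependence levels
```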

  12. Analytical Model for the End-Bearing Capacity of Tapered Piles Using Cavity Expansion Theory

    Directory of Open Access Journals (Sweden)

    Suman Manandhar

    2012-01-01

    On the basis of evidence from model tests showing increased end-bearing behavior of tapered piles in the load-settlement curve, this paper proposes an analytical spherical cavity expansion theory to evaluate the end-bearing capacity. The angle of tapering is incorporated in the proposed model. The results of the proposed model in different types of sand and at different relative densities show good performance compared to conventional straight piles, and the end-bearing capacity increases with increasing tapering angle. The paper then extends the model to prototype and full-scale pile tests to predict and validate the end-bearing capacity.

  13. Irreducible integrable theories form tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  14. Evidence-Based Theory of Market Manipulation And Application: The Malaysian Case

    OpenAIRE

    Heong, Yin Yun

    2010-01-01

    According to Part IX Division 1 in Securities Industry Act 1983 of Malaysia Law, stock market manipulation is defined as unlawful action taken either direct or indirectly by any person, to affect the price of securities of the corporation on a stock market in Malaysia for the purpose which may include the purpose of inducing other persons. Extending the framework of Allen and Gale (1992), the Author presents a theory based on the empirical evidence from prosecuted stock market manipulation ca...

  15. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  16. Models and theories of prescribing decisions: A review and suggested a new model.

    Science.gov (United States)

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review attempts to suggest a conceptual model that explains the theoretical linkages between marketing efforts, patient and pharmacist influences, and the physician's decision to prescribe a drug. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, it identifies and draws on several valuable perspectives, such as persuasion theory (the elaboration likelihood model), the stimulus-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This model has the potential for use in further research.

  17. Standard model and chiral gauge theories on the lattice

    International Nuclear Information System (INIS)

    Smit, J.

    1990-01-01

    A review is given of developments in lattice formulations of chiral gauge theories. There is now evidence that the unwanted fermion doublers can be decoupled satisfactorily by giving them masses of the order of the cutoff. (orig.)

  18. Collectivism and coping: current theories, evidence, and measurements of collective coping.

    Science.gov (United States)

    Kuo, Ben C H

    2013-01-01

    A burgeoning body of cultural coping research has begun to identify the prevalence and the functional importance of collective coping behaviors among culturally diverse populations in North America and internationally. These emerging findings are highly significant as they evidence culture's impacts on the stress-coping process via collectivistic values and orientation. They provide a critical counterpoint to the prevailing Western, individualistic stress and coping paradigm. However, current research and understanding about collective coping appear to be piecemeal and not well integrated. To address this issue, this review attempts to comprehensively survey, summarize, and evaluate existing research related to collective coping and its implications for coping research with culturally diverse populations from multiple domains. Specifically, this paper reviews relevant research and knowledge on collective coping in terms of: (a) operational definitions; (b) theories; (c) empirical evidence based on studies of specific cultural groups and broad cultural values/dimensions; (d) measurements; and (e) implications for future cultural coping research. Overall, collective coping behaviors are conceived as a product of the communal/relational norms and values of a cultural group across studies. They also encompass a wide array of stress responses ranging from value-driven to interpersonally based to culturally conditioned emotional/cognitive to religion- and spirituality-grounded coping strategies. In addition, this review highlights: (a) the relevance and the potential of cultural coping theories to guide future collective coping research; (b) growing evidence for the prominence of collective coping behaviors particularly among Asian nationals, Asian Americans/Canadians and African Americans/Canadians; (c) preference for collective coping behaviors as a function of collectivism and interdependent cultural value and orientation; and (d) six cultural coping scales. This

  19. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.

  20. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping

    Science.gov (United States)

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2014-01-01

    Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880

  1. Comparing theories' performance in predicting violence.

    Science.gov (United States)

    Haas, Henriette; Cusson, Maurice

    2015-01-01

    The stakes of choosing the best theory as a basis for violence prevention and offender rehabilitation are high. However, no single theory of violence has ever been universally accepted by a majority of established researchers. Psychiatry, psychology and sociology are each subdivided into different schools relying upon different premises. All theories can produce empirical evidence for their validity, some of them stating the opposite of each other. Calculating different models with multivariate logistic regression on a dataset of N = 21,312 observations and ninety-two influences allowed a direct comparison of the performance of operationalizations of some of the most important schools. The psychopathology model ranked as the best model in terms of predicting violence, right after the comprehensive interdisciplinary model. Next came the rational choice and lifestyle model, and third the differential association and learning theory model. Other models, namely the control theory model, the childhood-trauma model and the social conflict and reaction model, turned out to have low sensitivities for predicting violence. Nevertheless, all models produced acceptable results in predicting a non-violent outcome. Copyright © 2015. Published by Elsevier Ltd.
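    A compressed sketch of the kind of comparison the abstract describes (fitting several theory-motivated predictor sets with logistic regression and comparing their sensitivity for violence) is shown below. The data file, column names, and the grouping of predictors into "schools" are invented placeholders, not the authors' ninety-two variables.

```python
# Hedged sketch: comparing theory-based predictor blocks for violence with logistic regression.
# Data file, column names, and the grouping of predictors into "schools" are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

df = pd.read_csv("violence_survey.csv")   # hypothetical dataset with a binary 'violent' outcome
y = df["violent"]

models = {   # each "school" gets its own operationalization (placeholder columns)
    "psychopathology": ["impulsivity", "hostility", "prior_diagnosis"],
    "rational_choice_lifestyle": ["risky_leisure", "weapon_access", "alcohol_use"],
    "learning_theory": ["delinquent_peers", "violence_exposure"],
}

for name, cols in models.items():
    clf = LogisticRegression(max_iter=1000).fit(df[cols], y)
    sensitivity = recall_score(y, clf.predict(df[cols]))  # in-sample sensitivity, illustration only
    print(f"{name}: sensitivity = {sensitivity:.2f}")
```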

  2. Sworn testimony of the model evidence: Gaussian Mixture Importance (GAME) sampling

    Science.gov (United States)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-07-01

    What is the "best" model? The answer to this question lies in part in the eyes of the beholder; nevertheless a good model must blend rigorous theory with redeeming qualities such as parsimony and quality of fit. Model selection is used to make inferences, via weighted averaging, from a set of K candidate models, Mk, k = 1, …, K, and to help identify which model is most supported by the observed data, Ỹ = (ỹ1, …, ỹn). Here, we introduce a new and robust estimator of the model evidence, p(Ỹ|Mk), which acts as the normalizing constant in the denominator of Bayes' theorem and provides a single quantitative measure of relative support for each hypothesis that integrates model accuracy, uncertainty, and complexity. However, p(Ỹ|Mk) is analytically intractable for most practical modeling problems. Our method, coined GAussian Mixture importancE (GAME) sampling, uses bridge sampling of a mixture distribution fitted to samples of the posterior model parameter distribution derived from MCMC simulation. We benchmark the accuracy and reliability of GAME sampling by application to a diverse set of multivariate target distributions (up to 100 dimensions) with known values of p(Ỹ|Mk) and to hypothesis testing using numerical modeling of the rainfall-runoff transformation of the Leaf River watershed in Mississippi, USA. These case studies demonstrate that GAME sampling provides robust and unbiased estimates of the evidence at a relatively small computational cost, outperforming commonly used estimators. The GAME sampler is implemented in the MATLAB package of DREAM and considerably simplifies scientific inquiry through hypothesis testing and model selection.
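    To make the idea concrete, here is a stripped-down sketch of evidence estimation by fitting a Gaussian mixture to posterior samples and using it as an importance density. It omits the bridge-sampling refinement that distinguishes GAME and uses a made-up toy target, so it is only an illustration of the general approach.

```python
# Hedged sketch: estimating the model evidence p(Y|M) by importance sampling with a
# Gaussian mixture fitted to posterior (e.g. MCMC) samples. The toy target below stands
# in for an actual likelihood x prior; GAME's bridge-sampling step is omitted.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def log_unnorm_post(theta):
    """Toy un-normalized posterior: standard bivariate Gaussian, so the true evidence is 1."""
    return multivariate_normal(mean=np.zeros(2), cov=np.eye(2)).logpdf(theta)

# Stand-in for MCMC draws from the posterior
posterior_samples = rng.normal(size=(5000, 2))

# Fit the Gaussian-mixture importance density to the posterior samples
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(posterior_samples)

# Importance-sampling estimate: Z ~ mean of p(theta) / q(theta) over draws theta ~ q
theta_q, _ = gmm.sample(20000)
log_w = log_unnorm_post(theta_q) - gmm.score_samples(theta_q)
Z_hat = np.exp(log_w - log_w.max()).mean() * np.exp(log_w.max())
print("evidence estimate:", Z_hat)   # should be close to 1 for this toy target
```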

  3. Extensions to a nonlinear finite-element axisymmetric shell model based on Reissner's shell theory

    International Nuclear Information System (INIS)

    Cook, W.A.

    1981-01-01

    Extensions to shell analysis not usually associated with shell theory are described in this paper. These extensions involve thick shells, nonlinear materials, a linear normal stress approximation, and a changing shell thickness. A finite element shell-of-revolution model has been developed to analyze nuclear material shipping containers under severe impact conditions. To establish the limits for this shell model, the basic assumptions used in its development were studied; these are listed in this paper. Several extensions were evident from the study of these limits: a thick shell, a plastic hinge, and a linear normal stress

  4. Comparing theories' performance in predicting violence

    OpenAIRE

    Haas, Henriette; Cusson, Maurice

    2015-01-01

    The stakes of choosing the best theory as a basis for violence prevention and offender rehabilitation are high. However, no single theory of violence has ever been universally accepted by a majority of established researchers. Psychiatry, psychology and sociology are each subdivided into different schools relying upon different premises. All theories can produce empirical evidence for their validity, some of them stating the opposite of each other. Calculating different models wit...

  5. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  6. A Reflection on Research, Theory, Evidence-based Practice, and Quality Improvement

    Directory of Open Access Journals (Sweden)

    Eesa Mohammadi

    2016-04-01

    While each process is associated with its unique characteristics, overlaps are likely to appear between each of the two processes. For instance, in the EBP process, if one discovers (in theory) that evidence is inadequate to implement a certain intervention, it highlights the need for research on that specific subject. Similarly, QI may lead to the identification of new questions, which could be used for research purposes. All the discussed processes, as well as their scientific and professional dimensions, are essential to nursing disciplines in healthcare systems.

  7. A 'theory of everything'? [Extending the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate ''theory of everything''. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  8. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.

  9. Effective field theory and the quark model

    International Nuclear Information System (INIS)

    Durand, Loyal; Ha, Phuoc; Jaczko, Gregory

    2001-01-01

    We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections.
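    As a reminder of what the "additive QM for baryon magnetic moments" means, the standard nonrelativistic quark-model expressions (quoted here as textbook results, not from this paper) write the nucleon moments as sums of constituent-quark moments:

```latex
% Textbook additive quark-model relations (not taken from the paper under review)
\mu_p = \tfrac{1}{3}\left(4\mu_u - \mu_d\right), \qquad
\mu_n = \tfrac{1}{3}\left(4\mu_d - \mu_u\right), \qquad
\mu_q = \frac{e_q}{2 m_q}.
```

    With m_u ≈ m_d these give μ_p/μ_n ≈ -3/2, close to the measured value of about -1.46, which is the "striking success" referred to above.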

  10. Minisuperspace models in histories theory

    International Nuclear Information System (INIS)

    Anastopoulos, Charis; Savvidou, Ntina

    2005-01-01

    We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context

  11. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of Computing and Pioneers of Quantum Computing, and Quantum Simulations of Quantum Spin Systems are introduced. The second part is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking 'Nuclear' physics and the continuum limit of D-theory models. (nowak)

  12. Matrix model as a mirror of Chern-Simons theory

    International Nuclear Information System (INIS)

    Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun

    2004-01-01

    Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)
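    For the simplest case of Chern-Simons theory on S^3, the matrix-model representation alluded to here is usually written (in a schematic normalization, quoted from the general literature rather than from this paper) with a sinh-deformed Vandermonde measure:

```latex
% Schematic Chern-Simons matrix model on S^3 (general-literature form, for orientation)
Z_{\mathrm{CS}}(S^{3}) \;\propto\;
  \int \prod_{i=1}^{N} d\beta_{i}\; e^{-\beta_{i}^{2}/2g_{s}}
  \prod_{i<j} \left[\,2\sinh\!\frac{\beta_{i}-\beta_{j}}{2}\right]^{2},
\qquad g_{s}=\frac{2\pi i}{k+N}.
```

    For lens spaces the measure is modified along the lines described in the abstract; the expression above is only meant to orient the reader.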

  13. A QCD Model Using Generalized Yang-Mills Theory

    International Nuclear Information System (INIS)

    Wang Dianfu; Song Heshan; Kou Lina

    2007-01-01

    Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.

  14. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT an advantage over classical test theory. The fit of an item score pattern to an item response theory (IRT) model is a necessary condition that must be assessed for further use of the items and the models that best fit ...

  15. Restoring primacy in amnesic free recall: evidence for the recency theory of primacy.

    Science.gov (United States)

    Dewar, Michaela; Brown, Gordon D A; Della Sala, Sergio

    2011-09-01

    Primacy and recency effects at immediate recall are thought to reflect the independent functioning of a long-term memory store (primacy) and a short-term memory store (recency). Key evidence for this theory comes from amnesic patients who show severe long-term memory storage deficits, coupled with profoundly attenuated primacy. Here we challenge this dominant dual-store theory of immediate recall by demonstrating that attenuated primacy in amnesic patients can reflect abnormal working memory rehearsal processes. D.A., a patient with severe amnesia, presented with profoundly attenuated primacy when using her preferred atypical noncumulative rehearsal strategy. In contrast, despite her severe amnesia, she showed normal primacy when her rehearsal was matched with that of controls via an externalized cumulative rehearsal schedule. Our data are in keeping with the "recency theory of primacy" and suggest that primacy at immediate recall is dependent upon medial temporal lobe involvement in cumulative rehearsal rather than long-term memory storage.

  16. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity ... are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
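    For readers who want to see what "economic relations as restrictions on the CVAR" looks like in practice, the sketch below fits a small cointegrated VAR (VECM) with statsmodels. The data file, variable names, lag order, and cointegration rank are all illustrative assumptions.

```python
# Hedged sketch: fitting a cointegrated VAR (VECM) and inspecting the cointegrating
# relations; series names, lag order, and rank are assumptions for illustration.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

df = pd.read_csv("macro_data.csv", index_col=0)[["money", "income", "interest_rate"]]

rank_test = select_coint_rank(df, det_order=0, k_ar_diff=2)   # Johansen-type rank selection
print("selected cointegration rank:", rank_test.rank)

vecm = VECM(df, k_ar_diff=2, coint_rank=1, deterministic="ci")
res = vecm.fit()
print(res.beta)    # cointegrating vector(s): candidate long-run economic relation
print(res.alpha)   # adjustment coefficients: near-zero rows suggest weak exogeneity
```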

  17. Reconstructing bidimensional scalar field theory models

    International Nuclear Information System (INIS)

    Flores, Gabriel H.; Svaiter, N.F.

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtained two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ^2 ln^2(θ^2) model and the U(θ) = θ^2 cos^2(ln(θ^2)) model, respectively. (author)

  18. Two-matrix models and c =1 string theory

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong Chuansheng

    1994-05-01

    We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, the discrete tachyon correlation functions and the integrable hierarchy of the c = 1 string theory. (orig.)
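    The "two-matrix model with bilinear coupling" referred to here has the generic partition function below (written schematically for orientation; potentials and normalization are left unspecified and are not quoted from the paper):

```latex
% Schematic two-matrix model with bilinear coupling (generic form, for orientation)
Z \;=\; \int dM_{1}\, dM_{2}\;
  \exp\!\Big[-N\,\mathrm{Tr}\big(V_{1}(M_{1}) + V_{2}(M_{2}) - c\, M_{1} M_{2}\big)\Big],
```

    where M_1 and M_2 are N x N Hermitian matrices, V_1 and V_2 are polynomial potentials, and c is the bilinear coupling constant.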

  19. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and features: integrated modeling perspective utilized in all chapters; fluid dynamics of reacting gases; self-contained introduction to kinetic models; Becker–Döring equations; nonlinear kinetic models with chemical reactions; Kinet...

  20. Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory

    International Nuclear Information System (INIS)

    Chung, S.; Tye, S.H.

    1993-01-01

    The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory.

  1. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  2. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ingale, S. V.; Datta, D.

    2010-01-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
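    A toy sketch of how belief/plausibility bounds on an ingestion dose might be propagated from interval-valued (focal-element) inputs is given below. The simplified dose formula, the intervals, the masses, and the screening threshold are invented for illustration and are not the parameters of the paper's assessment model.

```python
# Hedged sketch: evidence-theory (Dempster-Shafer) bounds on an ingestion dose.
# Simplified dose model: dose = intake_rate * concentration * dose_coefficient.
# Focal intervals and their masses are illustrative assumptions.
from itertools import product

# Each input: list of (interval, mass) pairs with masses summing to 1
intake_rate = [((0.2, 0.4), 0.6), ((0.3, 0.6), 0.4)]          # kg/day
concentration = [((50.0, 80.0), 0.7), ((70.0, 120.0), 0.3)]   # Bq/kg
dose_coeff = [((1.3e-8, 1.6e-8), 1.0)]                        # Sv/Bq

joint = []   # propagate: Cartesian product of focal elements, interval arithmetic on the model
for (a, ma), (b, mb), (c, mc) in product(intake_rate, concentration, dose_coeff):
    lo = a[0] * b[0] * c[0]
    hi = a[1] * b[1] * c[1]
    joint.append(((lo, hi), ma * mb * mc))

threshold = 5e-7   # Sv/day, hypothetical screening level
belief = sum(m for (lo, hi), m in joint if hi <= threshold)        # surely below threshold
plausibility = sum(m for (lo, hi), m in joint if lo <= threshold)  # possibly below threshold
print(f"Bel(dose <= threshold) = {belief:.2f}, Pl(dose <= threshold) = {plausibility:.2f}")
```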

  3. Jonas Olson's Evidence for Moral Error Theory

    NARCIS (Netherlands)

    Evers, Daan

    2016-01-01

    Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although

  4. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions.

    Science.gov (United States)

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models.

  5. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions

    Science.gov (United States)

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Purpose Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Methods Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. Results All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. Conclusion The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models. PMID:29355212

  6. DO TANZANIAN COMPANIES PRACTICE PECKING ORDER THEORY, AGENCY COST THEORY OR TRADE-OFF THEORY? AN EMPIRICAL STUDY IN TANZANIAN LISTED COMPANIES

    Directory of Open Access Journals (Sweden)

    Ntogwa Ng'habi Bundala

    2012-01-01

    The empirical study focused predominantly on validity tests of three theories of capital structure in the Tanzanian context: the static trade-off theory, the pecking order theory (information asymmetry theory), and agency cost theory. The study used secondary data from eight of the non-financial companies listed on the Dar Es Salaam Stock Exchange (DSE) from 2006-2012. The study used a descriptive (quantitative) approach to test the practicality of the theories in Tanzania. A multiple regression model was used to test the theoretical relationship between financial leverage and the characteristics of the company. The research found that there is no strong evidence validating the static trade-off theory and only little support for the pecking order theory, but the agency cost theory is confirmed to be valid and practiced in Tanzania. It is recommended that Tanzanian companies adhere to the determinants of capital structure in the Tanzanian context found by this study.

  7. Crisis in Context Theory: An Ecological Model

    Science.gov (United States)

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  8. Magnetic confinement theory summary

    International Nuclear Information System (INIS)

    Connor, J.W.

    2005-01-01

    A total of 93 papers under the theory, TH, heading were presented at the conference, although a number of experimental papers also contained significant theory elements: only the former are reviewed here. A novel development was the inclusion of a Theory Overview paper, presented by P H Diamond, on the subject of zonal flows, currently a topic of great interest to the fusion community. The remainder of the theory papers were distributed amongst oral presentations (32, with 11 rapporteured and one a post-deadline submission) and 58 posters, one of which was post-deadline. A number of themes, or trends, are evident, all springing from the growing use of numerical approaches to plasma theory. These are: (i) the use of direct numerical simulations to calculate and provide insights into turbulent transport (indeed there were about 30 papers with contributions on this topic), although analytic modelling plays a role in interpreting these 'numerical experiments'; (ii) increasing realism in modelling of geometry and physics in areas such as macroscopic MHD phenomena and radio-frequency heating and current drive, both of which involve modelling of fast-particle distributions; and (iii) a growing emphasis on integrated modelling, bringing together modules that describe interacting aspects of plasma behaviour

  9. Health decision making: lynchpin of evidence-based practice.

    Science.gov (United States)

    Spring, Bonnie

    2008-01-01

    Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers' intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed.

  10. Evidence of quantum phase transition in real-space vacuum entanglement of higher derivative scalar quantum field theories.

    Science.gov (United States)

    Kumar, S Santhosh; Shankaranarayanan, S

    2017-11-17

    In a bipartite set-up, the vacuum state of a free Bosonic scalar field is entangled in real space and satisfies the area law: entanglement entropy scales linearly with the area of the boundary between the two partitions. In this work, we show that the area law is violated in a two spatial dimensional model Hamiltonian having dynamical critical exponent z = 3. The model physically corresponds to next-to-next-to-next nearest neighbour coupling terms on a lattice. The result reported here is the first of its kind of violation of the area law in Bosonic systems in higher dimensions and signals the evidence of a quantum phase transition. We provide evidence for the quantum phase transition both numerically and analytically using quantum information tools like entanglement spectra, quantum fidelity, and the gap in the energy spectrum. We identify the cause of this transition as the accumulation of a large number of angular zero modes around the critical point, which catalyses the change in the ground state wave function due to the next-to-next-to-next nearest neighbour coupling. Lastly, using a Hubbard-Stratonovich transformation, we show that the effective Bosonic Hamiltonian can be obtained from an interacting fermionic theory and provide possible implications for condensed matter systems.
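    For orientation, the two statements the abstract relies on can be written in generic form (standard relations, not the paper's exact expressions, and stated under the assumption that the z = 3 model has the usual higher-derivative dispersion):

```latex
% Generic forms, stated for orientation; not the paper's exact expressions.
S_{\mathrm{ent}} \;\propto\; \left(\frac{L}{\epsilon}\right)^{d-1}
\quad\text{(area law in $d$ spatial dimensions, boundary size $L$, cutoff $\epsilon$)},
\qquad
\omega(k) \;\sim\; |k|^{z}\quad\text{with } z = 3 .
```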

  11. Rewards of bridging the divide between measurement and clinical theory: demonstration of a bifactor model for the Brief Symptom Inventory.

    Science.gov (United States)

    Thomas, Michael L

    2012-03-01

    There is growing evidence that psychiatric disorders maintain hierarchical associations where general and domain-specific factors play prominent roles (see D. Watson, 2005). Standard, unidimensional measurement models can fail to capture the meaningful nuances of such complex latent variable structures. The present study examined the ability of the multidimensional item response theory bifactor model (see R. D. Gibbons & D. R. Hedeker, 1992) to improve construct validity by serving as a bridge between measurement and clinical theories. Archival data consisting of 688 outpatients' psychiatric diagnoses and item-level responses to the Brief Symptom Inventory (BSI; L. R. Derogatis, 1993) were extracted from files at a university mental health clinic. The bifactor model demonstrated superior fit for the internal structure of the BSI and improved overall diagnostic accuracy in the sample (73%) compared with unidimensional (61%) and oblique simple structure (65%) models. Consistent with clinical theory, multiple sources of item variance were drawn from individual test items. Test developers and clinical researchers are encouraged to consider model-based measurement in the assessment of psychiatric distress.
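    The bifactor structure referred to here can be written, in generic multidimensional 2PL form (a standard expression, not a quotation from the article), as each BSI item loading on one general distress factor plus one domain-specific factor:

```latex
% Generic bifactor 2PL item response function (standard form, stated for orientation)
P\!\left(x_{ij}=1 \mid \theta_{G},\,\theta_{s(j)}\right)
  = \frac{1}{1+\exp\!\left[-\left(a_{j}^{G}\theta_{G}
        + a_{j}^{s}\theta_{s(j)} + d_{j}\right)\right]},
```

    where θ_G is the general factor, θ_{s(j)} is the specific factor for the domain containing item j, and a_j^G, a_j^s, d_j are item discriminations and the intercept.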

  12. Family Change and Gender Differences: Implications for Theory and Practice.

    Science.gov (United States)

    Hare-Mustin, Rachel T.

    1988-01-01

    Examines theories of gender differences. Discusses alpha bias, exaggeration of gender opposition, as characteristic of psychodynamic and sex role theories; and beta bias, denial of gender differences, as evident in systems theories. Calls for new model of gender differences which recognizes asymmetry in women's and men's roles and…

  13. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  14. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity ... parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.

  15. Integrable theories that are asymptotically CFT

    CERN Document Server

    Evans, J M; Jonathan M Evans; Timothy J Hollowood

    1995-01-01

    A series of sigma models with torsion are analysed which generate their mass dynamically but whose ultra-violet fixed points are non-trivial conformal field theories -- in fact SU(2) WZW models at level k. In contrast to the more familiar situation of asymptotically free theories in which the fixed points are trivial, the sigma models considered here may be termed ``asymptotically CFT''. These theories have previously been conjectured to be quantum integrable; we confirm this by proposing a factorizable S-matrix to describe their infra-red behaviour and then carrying out a stringent test of this proposal. The test involves coupling the theory to a conserved charge and evaluating the response of the free-energy both in perturbation theory to one loop and directly from the S-matrix via the Thermodynamic Bethe Ansatz with a chemical potential at zero temperature. Comparison of these results provides convincing evidence in favour of the proposed S-matrix; it also yields the universal coefficients of the beta-func...

  16. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
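    As a minimal illustration of the cryptographic building blocks such a model relies on (hashing and digital signatures; trusted time-stamping via an external authority is omitted here), the following sketch signs the hash of an evidence file. It is a generic example using the Python cryptography package, not the PKIDEV protocol itself, and the file path is hypothetical.

```python
# Hedged sketch: hash an evidence file and sign the digest with an Ed25519 key.
# Generic illustration of integrity/authenticity protection, not the PKIDEV protocol.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519

def sha256_file(path: str) -> bytes:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.digest()

private_key = ed25519.Ed25519PrivateKey.generate()   # in practice: a PKI-issued key pair
public_key = private_key.public_key()

digest = sha256_file("evidence.img")                 # hypothetical disk-image path
signature = private_key.sign(digest)                 # bind the digest to the collector's key

# Later, a verifier re-hashes the file and checks the signature
# (cryptography raises InvalidSignature if either the file or the signature was altered)
public_key.verify(signature, sha256_file("evidence.img"))
print("evidence hash verified and signature valid")
```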

  17. Minimal string theories and integrable hierarchies

    Science.gov (United States)

    Iyer, Ramakrishnan

    Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4 m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non

  18. A Leadership Identity Development Model: Applications from a Grounded Theory

    Science.gov (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  19. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  20. Random matrix theory and higher genus integrability: the quantum chiral Potts model

    International Nuclear Information System (INIS)

    Angles d'Auriac, J.Ch.; Maillard, J.M.; Viallet, C.M.

    2002-01-01

    We perform a random matrix theory (RMT) analysis of the quantum four-state chiral Potts chain for different sizes of the chain up to size L = 8. Our analysis gives clear evidence of Gaussian orthogonal ensemble (GOE) statistics, suggesting the existence of a generalized time-reversal invariance. Furthermore, a change from the (generic) GOE distribution to a Poisson distribution occurs when the integrability conditions are met. The chiral Potts model is known to correspond to a (star-triangle) integrability associated with curves of genus higher than zero or one. Therefore, the RMT analysis can also be seen as a detector of 'higher genus integrability'. (author)
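    The GOE-versus-Poisson diagnostic used in the paper can be illustrated with a few lines of generic code: take a spectrum, compute nearest-neighbour spacings after a crude unfolding, and compare their histogram with the Wigner surmise and the Poisson law. The sketch below applies this to a random GOE matrix rather than to the chiral Potts chain.

```python
# Hedged sketch: level-spacing statistics for a random GOE matrix, compared against
# the Wigner surmise (GOE) and the Poisson distribution. Illustrative, not the paper's data.
import numpy as np

rng = np.random.default_rng(1)
N = 2000
A = rng.normal(size=(N, N))
H = (A + A.T) / 2                      # GOE-like real symmetric matrix
E = np.sort(np.linalg.eigvalsh(H))

# Crude unfolding: keep the central part of the spectrum and rescale to unit mean spacing
bulk = E[N // 4 : 3 * N // 4]
s = np.diff(bulk)
s /= s.mean()

hist, edges = np.histogram(s, bins=30, range=(0.0, 3.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2
wigner = (np.pi / 2) * centers * np.exp(-np.pi * centers**2 / 4)   # GOE surmise
poisson = np.exp(-centers)                                          # integrable-case prediction

for c, h, w, p in zip(centers[::5], hist[::5], wigner[::5], poisson[::5]):
    print(f"s={c:4.2f}  empirical={h:4.2f}  Wigner={w:4.2f}  Poisson={p:4.2f}")
```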

  1. Topos models for physics and topos theory

    International Nuclear Information System (INIS)

    Wolters, Sander

    2014-01-01

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos

  2. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping.

    Science.gov (United States)

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2014-04-01

    Fear arousal-vividly showing people the negative health consequences of life-endangering behaviors-is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. © 2013 The Authors. International Journal of Psychology published by John Wiley & Sons Ltd on behalf of International Union of Psychological Science.

  3. Finite Unification: Theory, Models and Predictions

    CERN Document Server

    Heinemeyer, S; Zoupanos, G

    2011-01-01

    All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensionless couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore, developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework, phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...

  4. Scattering and short-distance properties in field theory models

    International Nuclear Information System (INIS)

    Iagolnitzer, D.

    1987-01-01

    The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap such as typically P(φ) in dimension 2, φ^4 in dimension 3 and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non renormalizable) theories that might be defined in a similar way via phase-space analysis.

  5. Statistical test theory for the behavioral sciences

    CERN Document Server

    de Gruijter, Dato N M

    2007-01-01

    Since the development of the first intelligence test in the early 20th century, educational and psychological tests have become important measurement techniques to quantify human behavior. Focusing on this ubiquitous yet fruitful area of research, Statistical Test Theory for the Behavioral Sciences provides both a broad overview and a critical survey of assorted testing theories and models used in psychology, education, and other behavioral science fields. Following a logical progression from basic concepts to more advanced topics, the book first explains classical test theory, covering true score, measurement error, and reliability. It then presents generalizability theory, which provides a framework to deal with various aspects of test scores. In addition, the authors discuss the concept of validity in testing, offering a strategy for evidence-based validity. In the two chapters devoted to item response theory (IRT), the book explores item response models, such as the Rasch model, and applications, incl...

  6. Anisotropic cosmological models in f (R, T) theory of gravitation

    Indian Academy of Sciences (India)

    indirect evidence for the late time accelerated expansion of the Universe. ... Bertolami et al [9] proposed a generalization of f (R) theory of gravity ..... For the purpose of reference, we set the origin of the time coordinate at the bounce of.

  7. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  8. Robust global identifiability theory using potentials--Application to compartmental models.

    Science.gov (United States)

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

    This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3% which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens, to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.
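
    The integral (rather than differential) character of the input-output relations can be illustrated with a toy example. The sketch below (Python/NumPy, an assumed illustration rather than the paper's potential jet space formulation) identifies the rate constant k of a one-compartment model dx/dt = -k*x + u by least squares on the integrated relation x(t) - x(0) = -k*int(x) + int(u), which avoids differentiating noisy measurements.

        # Toy sketch (assumed example, not the paper's potential jet space method):
        # identify k in dx/dt = -k*x + u from noisy samples of x using the
        # integrated input-output relation, which only integrates the data.
        import numpy as np

        rng = np.random.default_rng(1)
        k_true = 0.7
        t = np.linspace(0.0, 10.0, 201)
        u = np.ones_like(t)                      # constant infusion input

        # Simulate the true model with a simple Euler step, then add ~3% noise.
        x = np.zeros_like(t)
        for i in range(1, len(t)):
            dt = t[i] - t[i - 1]
            x[i] = x[i - 1] + dt * (-k_true * x[i - 1] + u[i - 1])
        x_meas = x + 0.03 * x.max() * rng.normal(size=x.shape)

        # Cumulative trapezoid integrals of the measured state and the input.
        int_x = np.concatenate(([0.0], np.cumsum(np.diff(t) * (x_meas[1:] + x_meas[:-1]) / 2)))
        int_u = np.concatenate(([0.0], np.cumsum(np.diff(t) * (u[1:] + u[:-1]) / 2)))

        # Least squares on: x(t) - x(0) - int(u) = -k * int(x)
        lhs = x_meas - x_meas[0] - int_u
        k_hat = -np.linalg.lstsq(int_x.reshape(-1, 1), lhs, rcond=None)[0][0]
        print(f"true k = {k_true}, identified k = {k_hat:.3f}")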

  9. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    Science.gov (United States)

    Chandra, Akhilesh

    This research draws upon a recent innovation in the dynamical systems literature called the theory of self -organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and the external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm as a composite system of many degrees of freedom naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non -linearity is inferred by a change in the nature of relations at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on the stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specifications of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the
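
    The flicker-noise (1/f) hypothesis stated above can in principle be checked by estimating the slope of the log-log power spectrum of the relevant time series. The sketch below (Python/NumPy, an illustrative example on synthetic data, not the study's analysis) shows one simple way to compute such a spectral slope; a value near -1 would suggest 1/f-like behaviour.

        # Minimal sketch (illustrative only): estimate the log-log spectral slope
        # of a time series. Slope near 0 -> white noise, near -1 -> flicker (1/f),
        # near -2 -> random walk.
        import numpy as np

        def spectral_slope(x):
            x = np.asarray(x, dtype=float) - np.mean(x)
            spec = np.abs(np.fft.rfft(x)) ** 2
            freq = np.fft.rfftfreq(len(x))
            mask = freq > 0                      # drop the zero-frequency bin
            slope, _ = np.polyfit(np.log(freq[mask]), np.log(spec[mask]), 1)
            return slope

        rng = np.random.default_rng(2)
        print("white-noise slope (expect ~0):",
              round(spectral_slope(rng.normal(size=4096)), 2))
        print("random-walk slope (expect ~-2):",
              round(spectral_slope(np.cumsum(rng.normal(size=4096))), 2))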

  10. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  11. Modelling non-ignorable missing data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cornelis A.W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  12. Modelling non-ignorable missing-data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cees A. W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  13. Nematic elastomers: from a microscopic model to macroscopic elasticity theory.

    Science.gov (United States)

    Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette

    2008-05-01

    A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.

  14. Alternative banking: theory and evidence from Europe

    Directory of Open Access Journals (Sweden)

    Kurt Von Mettenheim

    2012-12-01

    Since financial liberalization in the 1980s, non-profit maximizing, stakeholder-oriented banks have outperformed private banks in Europe. This article draws on empirical research, banking theory and theories of the firm to explain this apparent anomaly for neo-liberal policy and contemporary market-based banking theory. The realization of competitive advantages by alternative banks (savings banks, cooperative banks and development banks) has significant implications for conceptions of bank change, regulation and political economy.

  15. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game theoretical tools, highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  16. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simplistic, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice, knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  18. Disrupted cortical connectivity theory as an explanatory model for autism spectrum disorders

    Science.gov (United States)

    Kana, Rajesh K.; Libero, Lauren E.; Moore, Marie S.

    2011-12-01

    Recent findings of neurological functioning in autism spectrum disorder (ASD) point to altered brain connectivity as a key feature of its pathophysiology. The cortical underconnectivity theory of ASD (Just et al., 2004) provides an integrated framework for addressing these new findings. This theory suggests that weaker functional connections among brain areas in those with ASD hamper their ability to accomplish complex cognitive and social tasks successfully. We will discuss this theory, but will modify the term underconnectivity to ‘disrupted cortical connectivity’ to capture patterns of both under- and over-connectivity in the brain. In this paper, we will review the existing literature on ASD to marshal supporting evidence for hypotheses formulated on the disrupted cortical connectivity theory. These hypotheses are: 1) underconnectivity in ASD is manifested mainly in long-distance cortical as well as subcortical connections rather than in short-distance cortical connections; 2) underconnectivity in ASD is manifested only in complex cognitive and social functions and not in low-level sensory and perceptual tasks; 3) functional underconnectivity in ASD may be the result of underlying anatomical abnormalities, such as problems in the integrity of white matter; 4) the ASD brain adapts to underconnectivity through compensatory strategies such as overconnectivity mainly in frontal and in posterior brain areas. This may be manifested as deficits in tasks that require frontal-parietal integration. While overconnectivity can be tested by examining the cortical minicolumn organization, long-distance underconnectivity can be tested by cognitively demanding tasks; and 5) functional underconnectivity in brain areas in ASD will be seen not only during complex tasks but also during task-free resting states. We will also discuss some empirical predictions that can be tested in future studies, such as: 1) how disrupted connectivity relates to cognitive impairments in skills

  19. Developing Theory to Guide Building Practitioners' Capacity to Implement Evidence-Based Interventions.

    Science.gov (United States)

    Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C; Escoffery, Cam T; Herrmann, Alison K; Thatcher, Esther; Hartman, Marieke A; Fernandez, Maria E

    2017-02-01

    Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.

  20. An anthology of theories and models of design philosophy, approaches and empirical explorations

    CERN Document Server

    Blessing, Lucienne

    2014-01-01

    While investigations into both theories and models have remained a major strand of engineering design research, current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models, and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; current models of design, from a function behavior structure model to an integrated model; important empirical research findings from studies into design; and philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...

  1. A Multidimensional Theory of Suicide.

    Science.gov (United States)

    Leenaars, Antoon A; Dieserud, Gudrun; Wenckstern, Susanne; Dyregrov, Kari; Lester, David; Lyke, Jennifer

    2018-04-05

    Theory is the foundation of science; this is true in suicidology. Over decades of studies of suicide notes, Leenaars developed a multidimensional model of suicide, with international (cross-cultural) studies and independent verification. The aim was to corroborate Leenaars's theory with a psychological autopsy (PA) study, examining the age and sex of the decedent and the survivor's relationship to the deceased. A PA study in Norway, with 120 survivors/informants, was undertaken. Leenaars' theoretical-conceptual (protocol) analysis was undertaken of the survivors' narratives and in-depth interviews combined. Substantial interjudge reliability was noted (κ = .632). Overall, there was considerable confirmatory evidence of Leenaars's intrapsychic and interpersonal factors in suicide survivors' narratives. Differences were found in the age of the decedent, but not in sex, nor in the survivor's closeness of the relationship. Older deceased people were perceived to exhibit more heightened unbearable intrapsychic pain, associated with the suicide. Leenaars's theory has corroborative verification, through the decedents' suicide notes and the survivors' narratives. However, the multidimensional model needs further testing to develop a better evidence-based way of understanding suicide.

  2. Spin foam model for pure gauge theory coupled to quantum gravity

    International Nuclear Information System (INIS)

    Oriti, Daniele; Pfeiffer, Hendryk

    2002-01-01

    We propose a spin foam model for pure gauge fields coupled to Riemannian quantum gravity in four dimensions. The model is formulated for the triangulation of a four-manifold which is given merely combinatorially. The Riemannian Barrett-Crane model provides the gravity sector of our model and dynamically assigns geometric data to the given combinatorial triangulation. The gauge theory sector is a lattice gauge theory living on the same triangulation and obtains from the gravity sector the geometric information which is required to calculate the Yang-Mills action. The model is designed so that one obtains a continuum approximation of the gauge theory sector at an effective level, similarly to the continuum limit of lattice gauge theory, when the typical length scale of gravity is much smaller than the Yang-Mills scale

  3. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  4. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  5. A trust evaluation algorithm for wireless sensor networks based on node behaviors and D-S evidence theory.

    Science.gov (United States)

    Feng, Renjian; Xu, Xiaofeng; Zhou, Xiang; Wan, Jiangwen

    2011-01-01

    For wireless sensor networks (WSNs), many factors, such as mutual interference of wireless links, battlefield applications and nodes exposed to the environment without good physical protection, result in the sensor nodes being more vulnerable to attack and compromise. In order to address this network security problem, a novel trust evaluation algorithm defined as NBBTE (Node Behavioral Strategies Banding Belief Theory of the Trust Evaluation Algorithm) is proposed, which integrates the approach of node behavioral strategies with modified evidence theory. According to the behaviors of sensor nodes, a variety of trust factors and coefficients related to the network application are established to obtain direct and indirect trust values by calculating a weighted average of trust factors. Meanwhile, the fuzzy set method is applied to form the basic input vector of evidence. On this basis, the evidence difference between the indirect and direct trust values is calculated and fed into the revised D-S evidence combination rule to synthesize the integrated trust value of each node. The simulation results show that NBBTE can effectively identify malicious nodes and reflects the characteristic of trust values being 'hard to acquire and easy to lose'. Furthermore, it is obvious that the proposed scheme has an outstanding advantage in terms of illustrating the real contribution of different nodes to trust evaluation.
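
    The core fusion step, combining a direct-trust and an indirect-trust basic probability assignment over the frame {trustworthy, untrustworthy}, can be sketched with the standard (unrevised) Dempster rule. The Python example below is an assumed illustration only; the NBBTE-specific trust factors, fuzzy membership construction and revised combination rule are not reproduced.

        # Minimal sketch (assumed illustration, not the NBBTE algorithm itself):
        # Dempster's rule over the frame {T (trustworthy), U (untrustworthy)};
        # mass on "TU" expresses ignorance about the whole frame.
        from itertools import product

        def combine(m1, m2):
            """Combine two basic probability assignments with Dempster's rule."""
            combined, conflict = {}, 0.0
            for (a, ma), (b, mb) in product(m1.items(), m2.items()):
                inter = "".join(sorted(set(a) & set(b)))
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb           # mass assigned to the empty set
            if conflict >= 1.0:
                raise ValueError("total conflict: evidence cannot be combined")
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        # Hypothetical direct and indirect trust evidence for one neighbour node.
        m_direct = {"T": 0.6, "U": 0.1, "TU": 0.3}
        m_indirect = {"T": 0.5, "U": 0.2, "TU": 0.3}
        print(combine(m_direct, m_indirect))      # fused masses for T, U and TU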

  6. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    This paper shows that game theory and the linear programming problem are closely related subjects, since any computing method devised for ...
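
    The relationship alluded to can be made concrete with the standard reduction of a two-player zero-sum matrix game to a linear program. The sketch below (Python with scipy.optimize.linprog, a textbook construction rather than anything taken from the paper) maximizes the row player's guaranteed value.

        # Standard sketch: solve a two-player zero-sum matrix game as a linear
        # program. The payoff matrix A is from the row player's point of view.
        import numpy as np
        from scipy.optimize import linprog

        A = np.array([[1.0, -1.0],                 # example: matching pennies
                      [-1.0, 1.0]])
        m, n = A.shape

        # Variables: mixed strategy x (m entries) and the game value v.
        c = np.zeros(m + 1)
        c[-1] = -1.0                               # maximize v == minimize -v
        A_ub = np.hstack([-A.T, np.ones((n, 1))])  # v - sum_i x_i A[i, j] <= 0
        b_ub = np.zeros(n)
        A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)   # probabilities sum to 1
        b_eq = [1.0]
        bounds = [(0, None)] * m + [(None, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("optimal row strategy:", res.x[:m], "game value:", res.x[-1])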

  7. sigma model approach to the heterotic string theory

    International Nuclear Information System (INIS)

    Sen, A.

    1985-09-01

    Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs

  8. Theory summary

    International Nuclear Information System (INIS)

    Tang, W.M.

    2001-01-01

    This is a summary of the advances in magnetic fusion energy theory research presented at the 17th International Atomic Energy Agency Fusion Energy Conference from 19-24 October 1998 in Yokohama, Japan. Theory and simulation results from this conference provided encouraging evidence of significant progress in understanding the physics of thermonuclear plasmas. Indeed, the grand challenge for this field is to acquire the basic understanding that can readily enable the innovations which would make fusion energy practical. In this sense, research in fusion energy is increasingly able to be categorized as fitting well the 'Pasteur's Quadrant' paradigm, where the research strongly couples basic science ('Bohr's Quadrant') to technological impact ('Edison's Quadrant'). As supported by some of the work presented at this conference, this trend will be further enhanced by advanced simulations. Eventually, realistic three-dimensional modeling capabilities, when properly combined with rapid and complete data interpretation of results from both experiments and simulations, can contribute to a greatly enhanced cycle of understanding and innovation. Plasma science theory and simulation have provided reliable foundations for this improved modeling capability, and the exciting advances in high-performance computational resources have further accelerated progress. There were 68 papers presented at this conference in the area of magnetic fusion energy theory.

  9. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  10. Planar N = 4 gauge theory and the Hubbard model

    International Nuclear Information System (INIS)

    Rej, Adam; Serban, Didina; Staudacher, Matthias

    2006-01-01

    Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model

  11. Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector

    NARCIS (Netherlands)

    Dentoni, D.; Ross, R.

    2013-01-01

    Part Two of our Special Issue on wicked problems in agribusiness, “Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector,” will contribute to four open questions in the broader fields of management and policy: why, when, which and

  12. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

    The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should better start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it contains also some unphysical terms linear in the gauge field. Advantageously we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as gauge field propagator. Furthermore we where able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations showing the expected behavior. The next step is to show renormalizability of the model, where some hints towards this direction will also be given. (author) [de

  13. A Novel Wide-Area Backup Protection Based on Fault Component Current Distribution and Improved Evidence Theory

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2014-01-01

    In order to solve the problems of the existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to master station the amplitudes of fault component currents of transmission lines which are the closest to fault element. Then master substation identifies suspicious faulty lines according to the distribution characteristics of fault component current. After that, the master substation will identify the actual faulty line with improved D-S evidence theory based on the action states of traditional protections and direction components of these suspicious faulty lines. The simulation examples based on IEEE 10-generator-39-bus system show that the proposed WABP algorithm has an excellent performance. The algorithm has low requirement of sampling synchronization, small wide-area communication flow, and high fault tolerance.

  14. A Novel Wide-Area Backup Protection Based on Fault Component Current Distribution and Improved Evidence Theory

    Science.gov (United States)

    Zhang, Zhe; Kong, Xiangping; Yin, Xianggen; Yang, Zengli; Wang, Lijun

    2014-01-01

    In order to solve the problems of the existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to master station the amplitudes of fault component currents of transmission lines which are the closest to fault element. Then master substation identifies suspicious faulty lines according to the distribution characteristics of fault component current. After that, the master substation will identify the actual faulty line with improved D-S evidence theory based on the action states of traditional protections and direction components of these suspicious faulty lines. The simulation examples based on IEEE 10-generator-39-bus system show that the proposed WABP algorithm has an excellent performance. The algorithm has low requirement of sampling synchronization, small wide-area communication flow, and high fault tolerance. PMID:25050399
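
    A minimal sketch of the final decision step, fusing several independent evidence sources over a set of candidate faulty lines and selecting the line with the largest combined mass, is given below (Python, an illustration with made-up masses; the paper's improved combination rule and the construction of the masses from fault-current distribution and protection states are not reproduced).

        # Minimal sketch (illustrative only; not the paper's improved D-S rule):
        # fuse several evidence sources over candidate faulty lines {L1, L2, L3}
        # with Dempster's rule restricted to singleton hypotheses, then pick the
        # line with the largest combined mass.
        def combine_singletons(m1, m2):
            keys = set(m1) | set(m2)
            raw = {k: m1.get(k, 0.0) * m2.get(k, 0.0) for k in keys}
            total = sum(raw.values())              # equals 1 - conflict
            return {k: v / total for k, v in raw.items()}

        # Hypothetical masses from fault-current distribution, protection action
        # states and directional components (values invented for illustration).
        sources = [
            {"L1": 0.7, "L2": 0.2, "L3": 0.1},
            {"L1": 0.6, "L2": 0.3, "L3": 0.1},
            {"L1": 0.5, "L2": 0.4, "L3": 0.1},
        ]
        fused = sources[0]
        for m in sources[1:]:
            fused = combine_singletons(fused, m)
        print("fused masses:", fused, "-> faulty line:", max(fused, key=fused.get))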

  15. Integrable lambda models and Chern-Simons theories

    International Nuclear Information System (INIS)

    Schmidtt, David M.

    2017-01-01

    In this note we reveal a connection between the phase space of lambda models on S^1 × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS_5 × S^5 lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  16. Mixing methodology, nursing theory and research design for a practice model of district nursing advocacy.

    Science.gov (United States)

    Reed, Frances M; Fitzgerald, Les; Rae, Melanie

    2016-01-01

    To highlight philosophical and theoretical considerations for planning a mixed methods research design that can inform a practice model to guide rural district nursing end of life care. Conceptual models of nursing in the community are general and lack guidance for rural district nursing care. A combination of pragmatism and nurse agency theory can provide a framework for ethical considerations in mixed methods research in the private world of rural district end of life care. Reflection on experience gathered in a two-stage qualitative research phase, involving rural district nurses who use advocacy successfully, can inform a quantitative phase for testing and complementing the data. Ongoing data analysis and integration result in generalisable inferences to achieve the research objective. Mixed methods research that creatively combines philosophical and theoretical elements to guide design in the particular ethical situation of community end of life care can be used to explore an emerging field of interest and test the findings for evidence to guide quality nursing practice. Combining philosophy and nursing theory to guide mixed methods research design increases the opportunity for sound research outcomes that can inform a nursing model of care.

  17. Stochastic quantization of field theories on the lattice and supersymmetrical models

    International Nuclear Information System (INIS)

    Aldazabal, Gerardo.

    1984-01-01

    Several aspects of the stochastic quantization method are considered. Specifically, field theories on the lattice and supersymmetrical models are studied. A non-linear sigma model is studied first, and it is shown that it is possible to obtain evolution equations written directly for invariant quantities. These ideas are generalized to obtain Langevin equations for the Wilson loops of non-abelian lattice gauge theories U(N) and SU(N). In order to write these equations, some different ways of introducing the constraints which the fields must satisfy are discussed. It is natural to have a strong coupling expansion in these equations. The correspondence with quantum field theory is established, and it is noticed that at all orders in the perturbation theory, Langevin equations reduce to Schwinger-Dyson equations. From another point of view, stochastic quantization is applied to large N matrix models on the lattice. As a result, a simple and systematic way of building reduced models is found. Referring to stochastic quantization in supersymmetric theories, a simple supersymmetric model is studied. It is shown that it is possible to write an evolution equation for the superfield which leads to quantum field theory results in equilibrium. As the Langevin equation preserves supersymmetry, the property of dimensional reduction known for the quantum model is shown to be valid at all times. (M.E.L.) [es

  18. Spin foam models of Yang-Mills theory coupled to gravity

    International Nuclear Information System (INIS)

    Mikovic, A

    2003-01-01

    We construct a spin foam model of Yang-Mills theory coupled to gravity by using a discretized path integral of the BF theory with polynomial interactions and the Barrett-Crane ansatz. In the Euclidean gravity case, we obtain a vertex amplitude which is determined by a vertex operator acting on a simple spin network function. The Euclidean gravity results can be straightforwardly extended to the Lorentzian case, so that we propose a Lorentzian spin foam model of Yang-Mills theory coupled to gravity

  19. Orbifolds of M-theory and type II string theories in two dimensions

    International Nuclear Information System (INIS)

    Roy, S.

    1997-01-01

    We consider several orbifold compactifications of M-theory and their corresponding type II duals in two space-time dimensions. In particular, we show that while the orbifold compactification of M-theory on T^9/J_9 is dual to the orbifold compactification of type IIB string theory on T^8/I_8, the same orbifold T^8/I_8 of type IIA string theory is dual to M-theory compactified on a smooth product manifold K3 × T^5. Similarly, while the orbifold compactification of M-theory on (K3 × T^5)/σ·J_5 is dual to the orbifold compactification of type IIB string theory on (K3 × T^4)/σ·I_4, the same orbifold of type IIA string theory is dual to the orbifold T^4 × (K3 × S^1)/σ·J_1 of M-theory. The spectrum of various orbifold compactifications of M-theory and type II string theories on both sides are compared, giving evidence in favor of these duality conjectures. We also comment on a connection between the Dasgupta-Mukhi-Witten conjecture and the Dabholkar-Park-Sen conjecture for the six-dimensional orbifold models of type IIB string theory and M-theory. (orig.)

  20. Semantic Modelling of Digital Forensic Evidence

    Science.gov (United States)

    Kahvedžić, Damir; Kechadi, Tahar

    The reporting of digital investigation results is traditionally carried out in prose and in a large investigation may require successive communication of findings between different parties. Popular forensic suites aid in the reporting process by storing provenance and positional data but do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application independent and automatically organised. The encoded descriptions aim to help the investigation in the task of report writing and evidence communication and can be used in addition to existing evidence management techniques.

  1. Working memory: theories, models, and controversies.

    Science.gov (United States)

    Baddeley, Alan

    2012-01-01

    I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.

  2. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.

  3. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models
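
    For contrast with full-distribution approaches, the sketch below (Python/SciPy, a generic textbook second-moment calculation, not the paper's seismic-risk model) computes the reliability index beta = (mu_R - mu_S)/sqrt(sigma_R^2 + sigma_S^2) from the means and standard deviations of a capacity R and a demand S, and the failure probability that follows only under an additional normality assumption.

        # Minimal illustration of a second-moment reliability calculation (generic
        # textbook form, not the paper's model): capacity R and demand S are
        # described only by their means and standard deviations.
        from math import sqrt
        from scipy.stats import norm

        mu_R, sigma_R = 10.0, 1.5      # hypothetical capacity statistics
        mu_S, sigma_S = 6.0, 2.0       # hypothetical demand (load) statistics

        beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
        p_f = norm.cdf(-beta)          # failure probability, valid only if R - S is normal
        print(f"beta = {beta:.2f}, Pf (normal assumption) = {p_f:.3e}")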

  4. Matrix models and stochastic growth in Donaldson-Thomas theory

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)

    2012-10-15

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  5. Matrix models and stochastic growth in Donaldson-Thomas theory

    International Nuclear Information System (INIS)

    Szabo, Richard J.; Tierz, Miguel

    2012-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  6. Soliton excitations in polyacetylene and relativistic field theory models

    International Nuclear Information System (INIS)

    Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM

    1982-01-01

    A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin 1/2 and charge ±e. In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological φ^4 and continuum electron-phonon models, and the structure of the fully quantum versus mean field theories. (orig.)

  7. Envisaging the use of evidence-based practice (EBP): how nurse academics facilitate EBP use in theory and practice across Australian undergraduate programmes.

    Science.gov (United States)

    Malik, Gulzar; McKenna, Lisa; Griffiths, Debra

    2017-09-01

    This paper is drawn from a grounded theory study that aimed to investigate processes undertaken by academics when integrating evidence-based practice into undergraduate curricula. This paper focuses on how nurse academics facilitated students to apply evidence-based practice in theory and practice. Facilitating undergraduate nursing students to develop skills within an evidence-based practice framework is vital to achieving evidence-based care. Studies on evidence-based practice conducted globally suggests that there is a need to investigate approaches used by nurse academics in facilitating students' understanding and use of evidence-based practice during their nurse education. Employing constructivist grounded theory approach, 23 nurse academics across Australian universities were interviewed and nine observed during their teaching. Some study participants shared their unit guides to enrich analysis. Data analysis was performed by following Charmaz's approach of coding procedures; as a result, four categories were constructed. This paper focuses on the category conceptualised as Envisaging the use of evidence-based practice. Findings revealed that most academics-assisted students to use evidence in academic-related activities. Recognising the importance of evidence-based practice in practice, some also expected students to apply evidence-based practice during clinical experiences. However, the level of students' appreciation for evidence-based practice during clinical experiences was unknown to participants and was influenced by practice-related barriers. Acknowledging these challenges, academics were engaged in dialogue with students and suggested the need for academia-practice collaboration in combating the cited barriers. Ensuring academics are supported to emphasise clinical application of evidence-based practice requires strategies at school and practice levels. Faculty development, engagement of clinical nurses with evidence-based practice, supportive

  8. A survey on the modeling and applications of cellular automata theory

    Science.gov (United States)

    Gong, Yimin

    2017-09-01

    The Cellular Automata Theory is a discrete model which is now widely used in scientific researches and simulations. The model is comprised of some cells which changes according to a specific rule over time. This paper provides a survey of the Modeling and Applications of Cellular Automata Theory, which focus on the program realization of Cellular Automata Theory and the application of Cellular Automata in each field, such as road traffic, land use, and cutting machines. Each application is further explained, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.
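
    The basic update scheme of such a discrete model can be sketched in a few lines. The Python example below (an illustrative example, not taken from the survey) evolves a one-dimensional binary cellular automaton under Wolfram's elementary rule 110 with periodic boundaries.

        # Minimal sketch of a one-dimensional cellular automaton: binary cells are
        # updated synchronously by a fixed local rule over each cell and its two
        # neighbours (Wolfram's elementary rule 110 is used here as an example).
        import numpy as np

        def step(cells, rule=110):
            # Encode each (left, centre, right) neighbourhood as a number 0..7 and
            # look up the new state in the rule number's binary expansion.
            left, right = np.roll(cells, 1), np.roll(cells, -1)   # periodic boundary
            idx = 4 * left + 2 * cells + right
            table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
            return table[idx]

        cells = np.zeros(64, dtype=np.uint8)
        cells[32] = 1                            # single seed cell
        for _ in range(20):
            print("".join("#" if c else "." for c in cells))
            cells = step(cells)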

  9. Brief Report: An Independent Replication and Extension of Psychometric Evidence Supporting the Theory of Mind Inventory.

    Science.gov (United States)

    Greenslade, Kathryn J; Coggins, Truman E

    2016-08-01

    This study presents an independent replication and extension of psychometric evidence supporting the Theory of Mind Inventory (ToMI). Parents of 20 children with ASD (4; 1-6; 7 years; months) and 20 with typical development (3; 1-6; 5), rated their child's theory of mind abilities in everyday situations. Other parent report and child behavioral assessments included the Social Responsiveness Scale-2, Vineland Adaptive Behavior Scales-2, Peabody Picture Vocabulary Test-4, and Clinical Evaluation of Language Fundamentals-Preschool, 2. Results revealed high internal consistency, expected developmental changes in children with typical development, expected group differences between children with and without ASD, and strong correlations with other measures of social and communication abilities. The ToMI demonstrates strong psychometrics, suggesting considerable utility in identifying theory of mind deficits in children with ASD.

  10. Introduction to zeolite theory and modelling

    NARCIS (Netherlands)

    Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.

    2001-01-01

    A review. Some of the recent advances in zeolite theory and modeling are present. In particular the current status of computational chem. in Bronsted acid zeolite catalysis, mol. dynamics simulations of mols. adsorbed in zeolites, and novel Monte Carlo technique are discussed to simulate the

  11. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

    The goal of this lecture is to present a unifying view on the exactly soluble models. There exist several reasons arguing in favor of the 1+1 dimensional models: every exact solution of a field-theoretical model can teach about the ability of quantum field theory to describe spectrum and scattering; some 1+1 d models have physical applications in the solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via Bethe Ansatz, via inverse scattering method. Fundamental Poisson bracket relation FPR and/or fundamental commutation relations FCR play fundamental role. General classification of FPR is given with promizing generalizations to FCR

  12. Integrable lambda models and Chern-Simons theories

    Energy Technology Data Exchange (ETDEWEB)

    Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)

    2017-05-03

    In this note we reveal a connection between the phase space of lambda models on S¹×ℝ and the phase space of double Chern-Simons theories on D×ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅×S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  13. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  14. A dynamical theory for the Rishon model

    International Nuclear Information System (INIS)

    Harari, H.; Seiberg, N.

    1980-09-01

    We propose a composite model for quarks and leptons based on an exact SU(3)_C x SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C x SU(2)_L x SU(2)_R x U(1)_(B-L) gauge theory emerges at the composite level. The theory is 'natural', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several 'technicolor' mechanisms are automatically present. (Author)

  15. A Trust Evaluation Algorithm for Wireless Sensor Networks Based on Node Behaviors and D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Jiangwen Wan

    2011-01-01

    For wireless sensor networks (WSNs), many factors, such as mutual interference of wireless links, battlefield applications and nodes exposed to the environment without good physical protection, make sensor nodes more vulnerable to attack and compromise. In order to address this network security problem, a novel trust evaluation algorithm defined as NBBTE (Node Behavioral Strategies Banding Belief Theory of the Trust Evaluation Algorithm) is proposed, which integrates node behavioral strategies with a modified evidence theory. According to the behaviors of sensor nodes, a variety of trust factors and coefficients related to the network application are established to obtain direct and indirect trust values by calculating weighted averages of trust factors. Meanwhile, the fuzzy set method is applied to form the basic input vector of evidence. On this basis, the evidence difference between the indirect and direct trust values is calculated and fed into a revised D-S evidence combination rule to synthesize an integrated trust value for each node. The simulation results show that NBBTE can effectively identify malicious nodes and reflects the characteristic that trust is 'hard to acquire and easy to lose'. Furthermore, it is obvious that the proposed scheme has an outstanding advantage in terms of illustrating the real contribution of different nodes to trust evaluation.
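
    The algorithm above synthesizes direct and indirect trust values with a revised D-S combination rule; that revision is not reproduced here. The following sketch shows only the classical, unmodified Dempster rule over a two-element frame of discernment (trusted/untrusted), with hypothetical mass values, as the generic building block that such schemes modify.

```python
# Sketch of the classical (unmodified) Dempster combination rule from D-S
# evidence theory. The NBBTE algorithm uses a revised rule, so this is only the
# generic building block; the mass values below are hypothetical.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dictionaries."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Frame of discernment for a node's trustworthiness: {T}rusted, {U}ntrusted.
T, U = frozenset("T"), frozenset("U")
TU = T | U                                        # ignorance (whole frame)
direct   = {T: 0.6, U: 0.1, TU: 0.3}              # hypothetical direct trust evidence
indirect = {T: 0.5, U: 0.2, TU: 0.3}              # hypothetical indirect (neighbour) evidence
print(dempster_combine(direct, indirect))
```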

  16. Can theory be embedded in visual interventions to promote self-management? A proposed model and worked example.

    Science.gov (United States)

    Williams, B; Anderson, A S; Barton, K; McGhee, J

    2012-12-01

    Nurses are increasingly involved in a range of strategies to encourage patient behaviours that improve self-management. If nurses are to be involved in, or indeed lead, the development of such interventions, then processes are required that enhance the likelihood that they will lead to evidence that is both robust and usable in practice. Although behavioural interventions have been predominantly based on written text or the spoken word, increasing numbers now draw on visual media to communicate their message, despite only an emerging evidence base to support them. The use of such media in health interventions is likely to increase due to technological advances enabling easier and cheaper production, and an increasing social preference for visual forms of communication. However, such media are often developed pragmatically and intuitively rather than with theory and evidence informing their content and form. Such a process may be at best inefficient and at worst potentially harmful. This paper performs two functions. Firstly, it discusses and argues why visual-based interventions may be a powerful medium for behaviour change; and secondly, it proposes a model, developed from the MRC Framework for the Development and Evaluation of Complex Interventions, to guide the creation of theory-informed visual interventions. It employs a case study of the development of an intervention to motivate involvement in a lifestyle intervention among people with increased cardiovascular risk. In doing this we argue for a step-wise model which includes: (1) the identification of a theoretical basis and associated concepts; (2) the development of a visual narrative to establish structure; (3) the visual rendering of narrative and concepts; and (4) the assessment of interpretation and impact among the intended patient group. We go on to discuss the theoretical and methodological limitations of the model. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Nursing opinion leadership: a preliminary model derived from philosophic theories of rational belief.

    Science.gov (United States)

    Anderson, Christine A; Whall, Ann L

    2013-10-01

    Opinion leaders are informal leaders who have the ability to influence others' decisions about adopting new products, practices or ideas. In the healthcare setting, the importance of translating new research evidence into practice has led to interest in understanding how opinion leaders could be used to speed this process. Despite continued interest, gaps in understanding opinion leadership remain. Agent-based models are computer models that have proven to be useful for representing dynamic and contextual phenomena such as opinion leadership. The purpose of this paper is to describe the work conducted in preparation for the development of an agent-based model of nursing opinion leadership. The aim of this phase of the model development project was to clarify basic assumptions about opinions, the individual attributes of opinion leaders and characteristics of the context in which they are effective. The process used to clarify these assumptions was the construction of a preliminary nursing opinion leader model, derived from philosophical theories about belief formation. © 2013 John Wiley & Sons Ltd.
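
    For readers unfamiliar with the agent-based modelling style the record above prepares for, the following is an illustrative toy sketch, not the authors' model: agents on a random network adopt a new practice under social influence, with a few designated opinion leaders carrying extra weight. The network, the leader weight and the adoption rule are all hypothetical.

```python
# Illustrative sketch (not the authors' model) of an agent-based model of
# opinion leadership: agents adopt a practice under social influence, and
# designated opinion leaders carry extra weight. All quantities are hypothetical.
import random

random.seed(1)
N, LEADERS, LEADER_WEIGHT = 30, {0, 1, 2}, 3.0
neighbours = {i: random.sample([j for j in range(N) if j != i], 5) for i in range(N)}
adopted = {i: i in LEADERS for i in range(N)}      # leaders start as adopters

def influence(i):
    """Weighted share of adopting neighbours of agent i."""
    weights = [(LEADER_WEIGHT if j in LEADERS else 1.0) for j in neighbours[i]]
    adopting = [w for j, w in zip(neighbours[i], weights) if adopted[j]]
    return sum(adopting) / sum(weights)

for step in range(10):
    for i in range(N):
        if not adopted[i] and random.random() < influence(i):
            adopted[i] = True
    print(f"step {step}: {sum(adopted.values())} adopters")
```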

  18. Evidence for the multiverse in the standard model and beyond

    International Nuclear Information System (INIS)

    Hall, Lawrence J.; Nomura, Yasunori

    2008-01-01

    In any theory it is unnatural if the observed values of parameters lie very close to special values that determine the existence of complex structures necessary for observers. A naturalness probability P is introduced to numerically evaluate the degree of unnaturalness. If P is very small in all known theories, corresponding to a high degree of fine-tuning, then there is an observer naturalness problem. In addition to the well-known case of the cosmological constant, we argue that nuclear stability and electroweak symmetry breaking represent significant observer naturalness problems. The naturalness probability associated with nuclear stability depends on the theory of flavor, but for all known theories is conservatively estimated as P_nuc ≲ (10^-3 - 10^-2), and for simple theories of electroweak symmetry breaking P_EWSB ≲ (10^-2 - 10^-1). This pattern of unnaturalness in three different arenas, cosmology, nuclear physics, and electroweak symmetry breaking, provides evidence for the multiverse, since each problem may be easily solved by environmental selection. In the nuclear case the problem is largely solved even if the multiverse distribution for the relevant parameters is relatively flat. With somewhat strongly varying distributions, it is possible to understand both the close proximity to neutron stability and the values of m_e and m_d - m_u in terms of the electromagnetic mass difference between the proton and neutron, δ_EM ≅ 1 ± 0.5 MeV. It is reasonable that multiverse distributions are strong functions of Lagrangian parameters, since they depend not only on the landscape of vacua, but also on the population mechanism, 'integrating out' other parameters, and on a density of observers factor. In any theory with mass scale M that is the origin of electroweak symmetry breaking, strongly varying multiverse distributions typically lead either to a little hierarchy v/M ≅ (10^-2 - 10^-1), or to a large hierarchy v²/M² suppressed by an extra loop factor, as well as by the

  19. Membrane models and generalized Z2 gauge theories

    International Nuclear Information System (INIS)

    Lowe, M.J.; Wallace, D.J.

    1980-01-01

    We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in 1/n expansion. The potential relationships of these models to generalized Z_2 gauge theories are indicated. (orig.)

  20. Relative judgment theory and the mediation of facial recognition: Implications for theories of eyewitness identification.

    Science.gov (United States)

    McAdoo, Ryan M; Gronlund, Scott D

    2016-01-01

    Many in the eyewitness identification community believe that sequential lineups are superior to simultaneous lineups because simultaneous lineups encourage inappropriate choosing due to promoting comparisons among choices (a relative judgment strategy), but sequential lineups reduce this propensity by inducing comparisons of lineup members directly to memory rather than to each other (an absolute judgment strategy). Different versions of the relative judgment theory have implicated both discrete-state and continuous mediation of eyewitness decisions. The theory has never been formally specified, but the dual-process models of Yonelinas (J Exp Psychol Learn Mem Cogn 20:1341-1354, 1994) provide one possible specification, thereby allowing us to evaluate how eyewitness decisions are mediated. We utilized a ranking task (Kellen and Klauer, J Exp Psychol Learn Mem Cogn 40:1795-1804, 2014) and found evidence for continuous mediation when facial stimuli match from study to test (Experiment 1) and when they mismatch (Experiment 2). This evidence, which is contrary to a version of relative judgment theory that has gained a lot of traction in the legal community, compels reassessment of the role that guessing plays in eyewitness identification. Future research should continue to test formal explanations in order to advance theory, to expedite the development of new procedures that can enhance the reliability of eyewitness evidence, and to facilitate the exploration of task factors and emergent strategies that might influence when recognition is continuously or discretely mediated.

  1. Integrability of a family of quantum field theories related to sigma models

    Energy Technology Data Exchange (ETDEWEB)

    Ridout, David [Australian National Univ., Canberra, ACT (Australia). Dept. of Theoretical Physics; DESY, Hamburg (Germany). Theory Group; Teschner, Joerg [DESY, Hamburg (Germany). Theory Group

    2011-03-15

    A method is introduced for constructing lattice discretizations of large classes of integrable quantum field theories. The method proceeds in two steps: The quantum algebraic structure underlying the integrability of the model is determined from the algebra of the interaction terms in the light-cone representation. The representation theory of the relevant quantum algebra is then used to construct the basic ingredients of the quantum inverse scattering method, the lattice Lax matrices and R-matrices. This method is illustrated with four examples: The Sinh-Gordon model, the affine sl(3) Toda model, a model called the fermionic sl(2|1) Toda theory, and the N=2 supersymmetric Sine-Gordon model. These models are all related to sigma models in various ways. The N=2 supersymmetric Sine-Gordon model, in particular, describes the Pohlmeyer reduction of string theory on AdS₂ x S², and is dual to a supersymmetric non-linear sigma model with a sausage-shaped target space. (orig.)

  2. Models and theories of prescribing decisions: A review and suggested a new model

    Science.gov (United States)

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the 'persuasion theory - elaboration likelihood model', the 'stimulus-response marketing model', the 'agency theory', the 'theory of planned behaviour', and 'social power theory', in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  3. Models and theories of prescribing decisions: A review and suggested a new model

    Directory of Open Access Journals (Sweden)

    Ali Murshid M

    2017-06-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the 'persuasion theory - elaboration likelihood model', the 'stimulus-response marketing model', the 'agency theory', the 'theory of planned behaviour', and 'social power theory', in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  4. New evidence on the asymmetry in gasoline price: volatility versus margin?

    International Nuclear Information System (INIS)

    Abosedra, S.; Radchenko, S.

    2006-01-01

    This paper examines recent evidence on the role that gasoline margins and volatility play in the asymmetric response of gasoline prices to changes in oil prices at different stages of the distribution process. In a regression model with margins, we find that margins are statistically significant in explaining asymmetry between crude oil and spot gasoline prices, between spot gasoline prices and wholesale gasoline prices, and between wholesale gasoline prices and retail prices. In a regression model with input volatility, we find evidence that volatility is responsible for asymmetry between wholesale gasoline prices and retail gasoline prices. When both gasoline margins and gasoline volatility are included in the regression, we find evidence supporting both margins (the search theory) and volatility (the oligopolistic coordination theory) as explanations of asymmetry. (author)

  5. A new non-specificity measure in evidence theory based on belief intervals

    Institute of Scientific and Technical Information of China (English)

    Yang Yi; Han Deqiang; Jean Dezert

    2016-01-01

    In the theory of belief functions, the measure of uncertainty is an important concept, which is used for representing types of uncertainty incorporated in bodies of evidence, such as discord and non-specificity. For the non-specificity part, some traditional measures take as reference the Hartley measure in classical set theory; other traditional measures use a simple, heuristic function that jointly uses mass assignments and the cardinality of focal elements. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision. Therefore, it has a more intuitive physical meaning. It can be proved that our new measure can be rewritten in a general form for non-specificity. Our new measure is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
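
    The ingredients of such a measure can be illustrated with a small body of evidence: belief and plausibility bounds give a belief interval for each singleton, and the interval length expresses imprecision. The sketch below also computes the classical non-specificity sum over focal elements of m(A)·log2|A| for contrast; how the paper aggregates the interval lengths into its new measure is not reproduced, and the mass values are hypothetical.

```python
# Sketch: belief/plausibility intervals from a body of evidence, plus the
# classical non-specificity measure sum_A m(A)*log2|A| mentioned in the record.
# The paper's aggregation of belief-interval lengths into its new measure is
# not reproduced; only the ingredients are illustrated with made-up masses.
from math import log2

def belief(m, A):
    return sum(v for B, v in m.items() if B <= A)          # focal elements inside A

def plausibility(m, A):
    return sum(v for B, v in m.items() if B & A)            # focal elements overlapping A

def classical_nonspecificity(m):
    return sum(v * log2(len(B)) for B, v in m.items())

frame = frozenset({"a", "b", "c"})
m = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.3, frame: 0.3}  # hypothetical masses

for x in sorted(frame):
    A = frozenset({x})
    bel, pl = belief(m, A), plausibility(m, A)
    print(f"{{{x}}}: [Bel, Pl] = [{bel:.2f}, {pl:.2f}], interval length = {pl - bel:.2f}")
print("classical non-specificity:", classical_nonspecificity(m))
```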

  6. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  7. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  8. Dualities in M-theory and Born-Infeld Theory

    International Nuclear Information System (INIS)

    Brace, Daniel M.

    2001-01-01

    We discuss two examples of duality. The first arises in the context of toroidal compactification of the discrete light cone quantization of M-theory. In the presence of nontrivial moduli coming from the M-theory three form, it has been conjectured that the system is described by supersymmetric Yang-Mills gauge theory on a noncommutative torus. We are able to provide evidence for this conjecture, by showing that the dualities of this M-theory compactification, which correspond to T-duality in Type IIA string theory, are also dualities of the noncommutative supersymmetric Yang-Mills description. One can also consider this as evidence for the accuracy of the Matrix Theory description of M-theory in this background. The second type of duality is the self-duality of theories with U(1) gauge fields. After discussing the general theory of duality invariance for theories with complex gauge fields, we are able to find a generalization of the well known U(1) Born-Infeld theory that contains any number of gauge fields and which is invariant under the maximal duality group. We then find a supersymmetric extension of our results, and also show that our results can be extended to find Born-Infeld type actions in any even dimensional spacetime

  9. Targeting the Real Exchange Rate; Theory and Evidence

    OpenAIRE

    Carlos A. Végh Gramont; Guillermo Calvo; Carmen Reinhart

    1994-01-01

    This paper presents a theoretical and empirical analysis of policies aimed at setting a more depreciated level of the real exchange rate. An intertemporal optimizing model suggests that, in the absence of changes in fiscal policy, a more depreciated level of the real exchange rate can only be attained temporarily. This can be achieved by means of higher inflation and/or higher real interest rates, depending on the degree of capital mobility. Evidence for Brazil, Chile, and Colombia supports the mo...

  10. Export Growth and Factor Market Competition: Theory and Some Evidence

    NARCIS (Netherlands)

    J. Emami Namini (Julian); G. Facchini (Giovanni); R.A. Lopez (Ricrado)

    2011-01-01

    Empirical evidence suggests that sectoral export growth decreases exporters' survival probability, whereas this is not true for non-exporters. Models with firm heterogeneity in total factor productivity (TFP) predict the opposite. To solve this puzzle, we develop a two-factor framework

  11. Cycle frequency in standard Rock-Paper-Scissors games: Evidence from experimental economics

    Science.gov (United States)

    Xu, Bin; Zhou, Hai-Jun; Wang, Zhijian

    2013-10-01

    The Rock-Paper-Scissors (RPS) game is a widely used model system in game theory. Evolutionary game theory predicts the existence of persistent cycles in the evolutionary trajectories of the RPS game, but experimental evidence has remained rather weak. In this work, we performed laboratory experiments on the RPS game and analyzed the social-state evolutionary trajectories of twelve populations of N=6 players. We found strong evidence supporting the existence of persistent cycles. The mean cycling frequency was measured to be 0.029±0.009 periods per experimental round. Our experimental observations can be quantitatively explained by a simple non-equilibrium model, namely the discrete-time logit dynamical process with a noise parameter. Our work therefore favors evolutionary game theory over classical game theory for describing the dynamical behavior of the RPS game.
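
    The explanatory model named above, a discrete-time logit dynamical process with a noise parameter, can be sketched at the level of a single revising player per round. The payoff matrix, noise value and update scheme below are illustrative (the population size N=6 matches the experiment), and none of it is the calibrated model from the paper.

```python
# Minimal sketch of a discrete-time logit update for a Rock-Paper-Scissors
# population. Payoffs, noise parameter and update scheme are illustrative;
# this is not the calibrated model from the paper.
import random
from math import exp

PAYOFF = {("R", "S"): 1, ("S", "P"): 1, ("P", "R"): 1,
          ("S", "R"): -1, ("P", "S"): -1, ("R", "P"): -1,
          ("R", "R"): 0, ("P", "P"): 0, ("S", "S"): 0}
ACTIONS = ["R", "P", "S"]

def logit_choice(others, noise=0.5):
    """Pick an action with probability proportional to exp(expected payoff / noise)."""
    expected = {a: sum(PAYOFF[(a, o)] for o in others) / len(others) for a in ACTIONS}
    weights = [exp(expected[a] / noise) for a in ACTIONS]
    return random.choices(ACTIONS, weights=weights)[0]

population = ["R", "R", "P", "P", "S", "S"]       # N = 6 players, as in the experiment
for t in range(20):
    i = random.randrange(len(population))          # one player revises per step
    others = population[:i] + population[i + 1:]
    population[i] = logit_choice(others)
    print(t, {a: population.count(a) for a in ACTIONS})
```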

  12. Superfield theory and supermatrix model

    International Nuclear Information System (INIS)

    Park, Jeong-Hyuck

    2003-01-01

    We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of the wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, to be consistent with T-duality. (author)

  13. σ-models and string theories

    International Nuclear Information System (INIS)

    Randjbar-Daemi, S.

    1987-01-01

    The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However, for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cutoff in an appendix

  14. An aggregate method to calibrate the reference point of cumulative prospect theory-based route choice model for urban transit network

    Science.gov (United States)

    Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia

    2015-12-01

    The transit route choice model is a key technology of public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can be used to describe travelers' decision-making process under the condition of uncertainty of transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of the CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. Compared with the traditional method, the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behaviors, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance to rational transit planning and management, and to some extent remedies the defect that obtaining the reference point has been based solely on qualitative analysis.
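
    For orientation, the standard cumulative prospect theory ingredients that such a route choice model evaluates relative to a reference point can be sketched as follows. The value and probability-weighting parameters are the conventional Tversky-Kahneman estimates, the travel-time prospect is hypothetical, and, for brevity, decision weights are applied outcome-by-outcome rather than through the full rank-dependent (cumulative) transformation; none of this reproduces the paper's calibration method.

```python
# Sketch of standard cumulative-prospect-theory ingredients for a route judged
# against a reference travel time. Parameters are the conventional
# Tversky-Kahneman estimates, not the paper's calibration; decision weights are
# applied outcome-by-outcome for brevity (not the full cumulative transform).

def value(d, alpha=0.88, beta=0.88, lam=2.25):
    """Value of a deviation d from the reference point (d > 0 is a gain,
    e.g. travel time saved in minutes)."""
    return d ** alpha if d >= 0 else -lam * (-d) ** beta

def weight(p, gamma=0.61):
    """Inverse-S-shaped probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Hypothetical route: two possible in-vehicle times (minutes) with probabilities,
# judged against a reference travel time of 30 minutes.
outcomes = [(25, 0.7), (40, 0.3)]
reference_time = 30
cpt_value = sum(weight(p) * value(reference_time - t) for t, p in outcomes)
print(f"CPT value of the route: {cpt_value:.2f}")
```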

  15. The motivation to care: application and extension of motivation theory to professional nursing work.

    Science.gov (United States)

    Moody, Roseanne C; Pesut, Daniel J

    2006-01-01

    The purpose of this research is to describe a model of nurses' work motivation relevant to the human caring stance of professional nursing work. The model was derived from selected theories of behavioral motivation and work motivation. Evidence-based theory addressing nurses' work motivation and nurses' motivational states and traits in relation to characteristics of organizational culture and patient health outcomes is suggested in an effort to make a distinct contribution to health services research. An integrated review of selected theories of motivation is presented, including conceptual analyses, theory-building techniques, and the evidence supporting the theoretical propositions and linkages among variables intrinsic to nurses' work motivation. The model of the Motivation to Care for Professional Nursing Work is a framework intended for empirical testing and theory building. The model proposes specific leadership and management strategies to support a culture of motivational caring and competence in health care organizations. Attention to motivation theory and research provides insights and suggests relationships among nurses' motivation to care, motivational states and traits, individual differences that influence nurses' work motivation, and the special effects of nurses' work motivation on patient care outcomes. Suggestions for nursing administrative direction and research are proposed.

  16. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  17. Micro worlds versus boundary objects in group model building; evidence from the literature on problem definition and model conceptulization

    Energy Technology Data Exchange (ETDEWEB)

    Zagonel, Aldo A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Engineering & Analysis; Andersen, David F. [University in Albany, NY (United States). The Rockefeller College of Public Affairs & Policy

    2007-03-01

    Based upon participant observation in group model building and content analysis of the system dynamics literature, we postulate that modeling efforts have a dual nature. On one hand, the modeling process aims to create a useful representation of a real-world system. This must be done, however, while aligning the clients’ mental models around a shared view of the system. There is significant overlap and confusion between these two goals and how they play out on a practical level. This research clarifies these distinctions by establishing an ideal-type dichotomy. To highlight the differences, we created two straw men: “micro world” characterizes a model that represents reality and “boundary object” represents a socially negotiated model. Using this framework, the literature was examined, revealing evidence for several competing views on problem definition and model conceptualization. The results are summarized in the text of this article, substantiated with strikingly polarized citations, often from the same authors. We also introduce hypotheses for the duality across the remaining phases of the modeling process. Finally, understanding and appreciation of the differences between these ideal types can promote constructive debate on their balance in system dynamics theory and practice.

  18. Aligning method with theory: a comparison of two approaches to modeling the social determinants of health.

    Science.gov (United States)

    O'Campo, Patricia; Urquia, Marcelo

    2012-12-01

    There is increasing interest in the study of the social determinants of maternal and child health. While there has been growth in the theory and empirical evidence about social determinants, less attention has been paid to the kind of modeling that should be used to understand the impact of social exposures on well-being. We analyzed data from the nationwide 2006 Canadian Maternity Experiences Survey to compare the pervasive disease-specific model to a model that captures the generalized health impact (GHI) of social exposures, namely low socioeconomic position. The GHI model uses a composite of adverse conditions that stem from low socioeconomic position: adverse birth outcomes, postpartum depression, severe abuse, stressful life events, and hospitalization during pregnancy. Adjusted prevalence ratios and 95% confidence intervals from disease-specific models for low income (social determinants of health.

  19. Quantum theory in complex Hilbert space

    International Nuclear Information System (INIS)

    Sharma, C.S.

    1988-01-01

    The theory of complexification of a real Hilbert space as developed by the author is scrutinized with the aim of explaining why quantum theory should be done in a complex Hilbert space in preference to a real Hilbert space. It is suggested that, in order to describe periodic motions in stationary states of a quantum system, the mathematical object modelling a state of the system should have enough points in it to be able to describe explicit time dependence of a periodic motion without affecting the probability distributions of observables. Heuristic evidence for such an assumption comes from Dirac's theory of interaction between radiation and matter. If the assumption is adopted as a requirement on the mathematical model for a quantum system, then a real Hilbert space is ruled out in favour of a complex Hilbert space as a possible model for such a system

  20. Prior Knowledge and the Learning of Science. A Review of Ausubel's Theory of This Process

    Science.gov (United States)

    West, L. H. T.; Fensham, P. J.

    1974-01-01

    Examines Ausubel's theory of learning as a model of the role of prior knowledge in how learning occurs. Research evidence for Ausubel's theory is presented and discussed. Implications of Ausubel's theory for teaching are summarized. (PEB)

  1. Testing Theories of Recognition Memory by Predicting Performance Across Paradigms

    Science.gov (United States)

    Smith, David G.; Duncan, Matthew J. J.

    2004-01-01

    Signal-detection theory (SDT) accounts of recognition judgments depend on the assumption that recognition decisions result from a single familiarity-based process. However, fits of a hybrid SDT model, called dual-process theory (DPT), have provided evidence for the existence of a second, recollection-based process. In 2 experiments, the authors…

  2. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.)

  3. Disrupted cortical connectivity theory as an explanatory model for autism spectrum disorders.

    Science.gov (United States)

    Kana, Rajesh K; Libero, Lauren E; Moore, Marie S

    2011-12-01

    Recent findings of neurological functioning in autism spectrum disorder (ASD) point to altered brain connectivity as a key feature of its pathophysiology. The cortical underconnectivity theory of ASD (Just et al., 2004) provides an integrated framework for addressing these new findings. This theory suggests that weaker functional connections among brain areas in those with ASD hamper their ability to accomplish complex cognitive and social tasks successfully. We will discuss this theory, but will modify the term underconnectivity to 'disrupted cortical connectivity' to capture patterns of both under- and over-connectivity in the brain. In this paper, we will review the existing literature on ASD to marshal supporting evidence for hypotheses formulated on the disrupted cortical connectivity theory. These hypotheses are: 1) underconnectivity in ASD is manifested mainly in long-distance cortical as well as subcortical connections rather than in short-distance cortical connections; 2) underconnectivity in ASD is manifested only in complex cognitive and social functions and not in low-level sensory and perceptual tasks; 3) functional underconnectivity in ASD may be the result of underlying anatomical abnormalities, such as problems in the integrity of white matter; 4) the ASD brain adapts to underconnectivity through compensatory strategies such as overconnectivity mainly in frontal and in posterior brain areas. This may be manifested as deficits in tasks that require frontal-parietal integration. While overconnectivity can be tested by examining the cortical minicolumn organization, long-distance underconnectivity can be tested by cognitively demanding tasks; and 5) functional underconnectivity in brain areas in ASD will be seen not only during complex tasks but also during task-free resting states. We will also discuss some empirical predictions that can be tested in future studies, such as: 1) how disrupted connectivity relates to cognitive impairments in skills such

  4. Evidence of "Implemented Anticipation" in Mathematising by Beginning Modellers

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2014-01-01

    Data from open modelling sessions for year 10 and 11 students at an extracurricular modelling event and from a year 9 class participating in a programme of structured modelling of real situations were analysed for evidence of Niss's theoretical construct, "implemented anticipation," during mathematisation. Evidence was found for all…

  5. Chaos Theory as a Model for Managing Issues and Crises.

    Science.gov (United States)

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  6. Discrete state moduli of string theory from c=1 matrix model

    CERN Document Server

    Dhar, A; Wadia, S R; Dhar, Avinash; Mandal, Gautam; Wadia, Spenta R

    1995-01-01

    We propose a new formulation of the space-time interpretation of the c=1 matrix model. Our formulation uses the well-known leg-pole factor that relates the matrix model amplitudes to those of the 2-dimensional string theory, but includes fluctuations around the Fermi vacuum on both sides of the inverted harmonic oscillator potential of the double-scaled model, even when the fluctuations are small and confined entirely within the asymptotes in the phase plane. We argue that including fluctuations on both sides of the potential is essential for a consistent interpretation of the leg-pole transformed theory as a theory of space-time gravity. We reproduce the known results for the string theory tree level scattering amplitudes for flat space and linear dilaton background as a special case. We show that the generic case corresponds to more general space-time backgrounds. In particular, we identify the parameter corresponding to the background metric perturbation in string theory (black hole mass) in terms of the ...

  7. Nonperturbative type IIB model building in the F-theory framework

    Energy Technology Data Exchange (ETDEWEB)

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  8. Nonperturbative type IIB model building in the F-theory framework

    International Nuclear Information System (INIS)

    Jurke, Benjamin Helmut Friedrich

    2011-01-01

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  9. Cohomological gauge theory, quiver matrix models and Donaldson-Thomas theory

    NARCIS (Netherlands)

    Cirafici, M.; Sinkovics, A.; Szabo, R.J.

    2009-01-01

    We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques

  10. Applications of generalizability theory and their relations to classical test theory and structural equation modeling.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-03-01

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
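
    As a small worked illustration of the kind of analysis described, the sketch below estimates variance components for a single-facet persons-by-items design from a two-way ANOVA without replication and combines them into a relative generalizability (G) coefficient. The score matrix is made up, and the sketch does not cover the multifaceted designs, dependability indices or software comparisons discussed in the article.

```python
# Sketch of a single-facet (persons x items) generalizability analysis:
# variance components are estimated from a two-way ANOVA without replication
# and combined into a relative G coefficient. The score matrix is made up.
import numpy as np

scores = np.array([[7, 6, 8, 7],      # rows: persons, columns: items
                   [5, 4, 6, 5],
                   [9, 8, 9, 8],
                   [4, 5, 5, 4],
                   [6, 6, 7, 6]], dtype=float)
n_p, n_i = scores.shape

grand = scores.mean()
ms_p = n_i * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)     # persons mean square
residual = scores - scores.mean(axis=1, keepdims=True) - scores.mean(axis=0) + grand
ms_res = np.sum(residual ** 2) / ((n_p - 1) * (n_i - 1))                # p x i interaction + error

var_p = max((ms_p - ms_res) / n_i, 0.0)        # person (universe-score) variance
var_res = ms_res                               # relative error variance per item
g_coef = var_p / (var_p + var_res / n_i)       # relative G coefficient for n_i items
print(f"G coefficient: {g_coef:.3f}")
```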

  11. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13-dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton, this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  12. Effective potential in Lorentz-breaking field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)

    2017-12-15

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  13. Effective potential in Lorentz-breaking field theory models

    International Nuclear Information System (INIS)

    Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.

    2017-01-01

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  14. Noncommutative gauge theory and symmetry breaking in matrix models

    International Nuclear Information System (INIS)

    Grosse, Harald; Steinacker, Harold; Lizzi, Fedele

    2010-01-01

    We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c x SU(2)_L x U(1)_Q [resp. SU(3)_c x U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.

  15. Off-critical statistical models: factorized scattering theories and bootstrap program

    International Nuclear Information System (INIS)

    Mussardo, G.

    1992-01-01

    We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M_{2,2n+3}, the non-unitary model M_{3,5} and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be exploited via the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called conformal space truncated approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form-factor approach

  16. The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.

    Science.gov (United States)

    Loving, Cathleen

    The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…

  17. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  18. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    Science.gov (United States)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
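
    The second-stage, peaks-over-threshold part of such a model can be sketched as follows: losses above a threshold are fitted with a generalized Pareto distribution and the standard quantile formula gives the VaR. The wavelet-based threshold selection that is the paper's contribution is not reproduced; a plain empirical quantile stands in for it, and the return series is simulated.

```python
# Sketch of the peaks-over-threshold step of an EVT value-at-risk estimate:
# losses above a threshold are fitted with a generalized Pareto distribution.
# The paper's wavelet-based threshold selection is not reproduced; a plain
# empirical quantile is used instead, and the returns are simulated.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
losses = -rng.standard_t(df=4, size=2000) * 0.01      # simulated daily returns -> losses

u = np.quantile(losses, 0.95)                          # threshold (stand-in for the wavelet one)
exceedances = losses[losses > u] - u
xi, _, sigma = genpareto.fit(exceedances, floc=0)      # shape xi, scale sigma

def var_pot(q, losses, u, xi, sigma):
    """Peaks-over-threshold VaR at confidence level q."""
    n, n_u = len(losses), np.sum(losses > u)
    return u + (sigma / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)

print("99% VaR:", var_pot(0.99, losses, u, xi, sigma))
```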

  19. The Self-Perception Theory vs. a Dynamic Learning Model

    OpenAIRE

    Swank, Otto H.

    2006-01-01

    Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...

  20. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.

    2011-01-01

    In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model (FVDM). The linear stability condition of the new model is obtained by using the linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of negative velocity that occurs at a small sensitivity coefficient λ in the full velocity difference model by adjusting the coefficient of the optimal velocity difference, which shows that collisions can disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The effects of the optimal velocity difference can avoid the disadvantage of negative velocity.
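
    To make the class of model concrete, the full velocity difference dynamics that the new model extends can be simulated with a standard Bando-type optimal velocity function. The sketch below is illustrative only: the additional optimal-velocity-difference term of the cited Letter is merely indicated in a comment, and parameter values are not taken from the paper.

```python
import numpy as np

def optimal_velocity(headway, v_max=2.0, h_c=4.0):
    """Bando-type optimal velocity function (a common choice)."""
    return 0.5 * v_max * (np.tanh(headway - h_c) + np.tanh(h_c))

def step(x, v, dt=0.1, a=1.0, lam=0.5, road_length=400.0):
    """One Euler step of a full-velocity-difference car-following model:
    dv_n/dt = a * [V(headway_n) - v_n] + lam * (v_{n+1} - v_n).
    The OVDM of the cited Letter adds a further term built from the difference
    of optimal velocities; that refinement is omitted here."""
    headway = (np.roll(x, -1) - x) % road_length   # distance to the car ahead (ring road)
    dv = np.roll(v, -1) - v                        # velocity difference to the car ahead
    acc = a * (optimal_velocity(headway) - v) + lam * dv
    v_new = np.clip(v + acc * dt, 0.0, None)       # crude numerical guard against negative speed
    x_new = (x + v_new * dt) % road_length
    return x_new, v_new

# Illustrative run: 100 cars on a ring road with a small initial disturbance
n_cars, road_length = 100, 400.0
x = np.linspace(0.0, road_length, n_cars, endpoint=False)
x[0] += 0.5
v = np.full(n_cars, float(optimal_velocity(road_length / n_cars)))
for _ in range(2000):
    x, v = step(x, v, road_length=road_length)
print(v.min(), v.max())
```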

  1. The erotetic theory of delusional thinking.

    Science.gov (United States)

    Parrott, Matthew; Koralus, Philipp

    2015-01-01

    In this paper, we argue for a novel account of one cognitive factor implicated in delusional cognition. According to the erotetic theory of delusion we present, the central cognitive factor in delusion is impaired endogenous question raising. After presenting the erotetic theory, we draw on it to model three distinct patterns of reasoning exhibited by delusional and schizophrenic patients, and contrast our explanations with Bayesian alternatives. We argue that the erotetic theory has considerable advantages over Bayesian models. Specifically, we show that it offers a superior explanation of three phenomena: the onset and persistence of the Capgras delusion; recent data indicating that schizophrenic subjects manifest superior reasoning with conditionals in certain contexts; and evidence that schizophrenic and delusional subjects have a tendency to "jump to conclusions." Moreover, since the cognitive mechanisms we appeal to are independently motivated, we avoid having to posit distinct epistemic states that are intrinsically irrational in order to fit our model to the variety of data. In contrast to Bayesian models, the erotetic theory offers a simple, unified explanation of a range of empirical data. We therefore conclude that it offers a more plausible framework for explaining delusional cognition.

  2. Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.

    Science.gov (United States)

    Bender, Miriam

    2018-01-01

    Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomena. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that they do the work of defining the boundary conditions - the domain(s) - of a theory. Further analysis has shown the ways models can be constructed and function independent of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing and health/care knowledge and practice. © 2017 John Wiley & Sons Ltd.

  3. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  4. Short-run Exchange-Rate Dynamics: Theory and Evidence

    DEFF Research Database (Denmark)

    Carlson, John A.; Dahl, Christian Møller; Osler, Carol L.

    Recent research has revealed a wealth of information about the microeconomics of currency markets and thus the determination of exchange rates at short horizons. This information is valuable to us as scientists since, like evidence of macroeconomic regularities, it can provide critical guidance...... of currency markets, it accurately reflects the constraints and objectives faced by the major participants, and it fits key stylized facts concerning returns and order flow. With respect to macroeconomics, the model is consistent with most of the major puzzles that have emerged under floating rates....

  5. Bureaucratic Minimal Squawk Behavior: Theory and Evidence from Regulatory Agencies

    OpenAIRE

    Clare Leaver

    2009-01-01

    This paper argues that bureaucrats are susceptible to `minimal squawk` behavior. I develop a simple model in which a desire to avoid criticism can prompt, otherwise public-spirited, bureaucrats to behave inefficiently. Decisions are taken to keep interest groups quiet and mistakes out of the public eye. The policy implications of this behavior are at odds with the received view that agencies should be structured to minimise the threat of `capture`. I test between theories of bureaucratic beha...

  6. The motor theory of speech perception revisited.

    Science.gov (United States)

    Massaro, Dominic W; Chen, Trevor H

    2008-04-01

    Galantucci, Fowler, and Turvey (2006) have claimed that perceiving speech is perceiving gestures and that the motor system is recruited for perceiving speech. We make the counterargument that perceiving speech is not perceiving gestures, that the motor system is not recruited for perceiving speech, and that speech perception can be adequately described by a prototypical pattern recognition model, the fuzzy logical model of perception (FLMP). Empirical evidence taken as support for gesture and motor theory is reconsidered in more detail and in the framework of the FLMP. Additional theoretical and logical arguments are made to challenge gesture and motor theory.

  7. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyzes the shortcomings of the classic capital market theories based on the efficient market hypothesis (EMH) and discloses the essentially complex nature of the capital market. Considering the capital market a complicated, interactive and adaptable dynamic system, with complexity science as the method for researching the operation law of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper foretells the developing direction of complexity science in capital market research.

  8. Collective learning modeling based on the kinetic theory of active particles

    Science.gov (United States)

    Burini, D.; De Lillo, S.; Gibelli, L.

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

  9. Behavioral and social sciences theories and models: are they used in unintentional injury prevention research?

    Science.gov (United States)

    Trifiletti, L B; Gielen, A C; Sleet, D A; Hopkins, K

    2005-06-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury prevention research. The authors conducted a systematic review to evaluate the published literature from 1980 to 2001 on behavioral and social science theory applications to unintentional injury prevention and control. Electronic database searches in PubMed and PsycINFO identified articles that combined behavioral and social sciences theories and models and injury causes. The authors identified some articles that examined behavioral and social science theories and models and unintentional injury topics, but found that several important theories have never been applied to unintentional injury prevention. Among the articles identified, the PRECEDE-PROCEED Model was cited most frequently, followed by the Theory of Reasoned Action/Theory of Planned Behavior and the Health Belief Model. When behavioral and social sciences theories and models were applied to unintentional injury topics, they were most frequently used to guide program design and implementation or to develop evaluation measures; few examples of theory testing were found. Results suggest that the use of behavioral and social sciences theories and models in unintentional injury prevention research is only marginally represented in the mainstream, peer-reviewed literature. Both the field of injury prevention and the behavioral and social sciences could benefit from greater collaborative research to enhance behavioral approaches to injury control.

  10. Non-integrable quantum field theories as perturbations of certain integrable models

    International Nuclear Information System (INIS)

    Delfino, G.; Simonetti, P.

    1996-03-01

    We approach the study of non-integrable models of two-dimensional quantum field theory as perturbations of the integrable ones. By exploiting the knowledge of the exact S-matrix and form factors of the integrable field theories we obtain the first order corrections to the mass ratios, the vacuum energy density and the S-matrix of the non-integrable theories. As interesting applications of the formalism, we study the scaling region of the Ising model in an external magnetic field at T ∼ T_c and the scaling region around the minimal model M_{2,7}. For these models, a remarkable agreement is observed between the theoretical predictions and the data extracted by a numerical diagonalization of their Hamiltonian. (author). 41 refs, 9 figs, 1 tab

  11. Classical nucleation theory in the phase-field crystal model.

    Science.gov (United States)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
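
    For reference, the classical nucleation theory expressions that such simulations are usually compared against are the standard textbook forms below (a reminder, not quantities quoted from the record); here γ is the interfacial free energy per unit area and Δg the bulk free-energy gain per unit volume of the new phase.

```latex
r^{*} = \frac{2\gamma}{\Delta g}, \qquad
\Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,\Delta g^{2}}, \qquad
J = J_{0}\, \exp\!\left(-\frac{\Delta G^{*}}{k_{\mathrm{B}} T}\right),
```
    with r* the critical nucleus radius, ΔG* the nucleation barrier, and J the homogeneous nucleation rate.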

  13. The fixed point structure of lattice field theories

    International Nuclear Information System (INIS)

    Baier, R.; Reusch, H.J.; Lang, C.B.

    1989-01-01

    Monte-Carlo renormalization group methods allow one to analyze lattice-regularized quantum field theories. The properties of the quantized field theory in the continuum may be recovered at a critical point of the lattice model. This requires a study of the phase diagram and the renormalization flow structure of the coupling constants. As an example, the authors discuss the results of a recent MCRG investigation of the SU(2) adjoint Higgs model, where they find evidence for the existence of a tricritical point at finite values of the inverse gauge coupling β.

  14. Using a matrix-analytical approach to synthesizing evidence solved incompatibility problem in the hierarchy of evidence.

    Science.gov (United States)

    Walach, Harald; Loef, Martin

    2015-11-01

    The hierarchy of evidence presupposes linearity and additivity of effects, as well as commutativity of knowledge structures. It thereby implicitly assumes a classical theoretical model. This is an argumentative article that uses theoretical analysis based on pertinent literature and known facts to examine the standard view of methodology. We show that the assumptions of the hierarchical model are wrong. The knowledge structures gained by various types of studies are not sequentially indifferent, that is, do not commute. External validity and internal validity are at least partially incompatible concepts. Therefore, one needs a different theoretical structure, typical of quantum-type theories, to model this situation. The consequence of this situation is that the implicit assumptions of the hierarchical model are wrong, if generalized to the concept of evidence in total. The problem can be solved by using a matrix-analytical approach to synthesizing evidence. Here, research methods that produce different types of evidence that complement each other are synthesized to yield the full knowledge. We show by an example how this might work. We conclude that the hierarchical model should be complemented by a broader reasoning in methodology. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Soliton excitations in a class of nonlinear field theory models

    International Nuclear Information System (INIS)

    Makhan'kov, V.G.; Fedyanin, V.K.

    1985-01-01

    Results of an investigation of nonlinear field-theory models defined by a Lagrangian are described. The class includes models both with a stable zero vacuum (epsilon = 1) and with a condensate (epsilon = -1, i.e. broken symmetry). Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied. PLS form factors are calculated. The statistical mechanics of solitons is constructed and their dynamic structure factors are calculated.

  16. Magnetic flux tube models in superstring theory

    CERN Document Server

    Russo, Jorge G

    1996-01-01

    Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the `Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0=\sqrt{2\alpha'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR=2n, n=integer). At the special points qR=2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...

  17. Models of Regge behaviour in an asymptotically free theory

    International Nuclear Information System (INIS)

    Polkinghorne, J.C.

    1976-01-01

    Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that φ³ theory in six dimensions has an essential singularity at l=-1. The extension to gauge theories is discussed. (Auth.)

  18. A model of theory-practice relations in mathematics teacher education

    DEFF Research Database (Denmark)

    Østergaard, Kaj

    2016-01-01

    The paper presents and discusses an ATD-based (Chevallard, 2012) model of theory-practice relations in mathematics teacher education. The notions of didactic transposition and praxeology are combined and concretized in order to form a comprehensive model for analysing the theory-practice problematique. It is illustrated how the model can be used both as a descriptive tool to analyse interactions between and interviews with student teachers and teachers, and as a normative tool to design and redesign learning environments in teacher education, in this case a lesson study context.

  19. Towards a Semantic E-Learning Theory by Using a Modelling Approach

    Science.gov (United States)

    Yli-Luoma, Pertti V. J.; Naeve, Ambjorn

    2006-01-01

    In the present study, a semantic perspective on e-learning theory is advanced and a modelling approach is used. This modelling approach towards the new learning theory is based on the four SECI phases of knowledge conversion: Socialisation, Externalisation, Combination and Internalisation, introduced by Nonaka in 1994, and involving two levels of…

  20. The Role of Adolescent Development in Social Networking Site Use: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Drew P. Cingel

    2014-03-01

    Using survey data collected from 260 children, adolescents, and young adults between the ages of 9 and 26, this paper offers evidence for a relationship between social networking site use and Imaginary Audience, a developmental variable in which adolescents believe others are thinking about them at all times. Specifically, after controlling for a number of variables, results indicate a significant, positive relationship between social networking site use and Imaginary Audience ideation. Additionally, results indicate a positive relationship between Imaginary Audience ideation and Facebook customization practices. Together, these findings provide evidence, based on Vygotskian developmental theory, for a general consideration of the role that currently available tools, in this case social networking sites, can have on development. Thus, findings implicate both the role of development on social networking site use, as well as the role of social networking site use on development. Overall, these findings have important implications for the study of media and human development, which are discussed in detail.

  1. Visual perception and interception of falling objects: a review of evidence for an internal model of gravity.

    Science.gov (United States)

    Zago, Myrka; Lacquaniti, Francesco

    2005-09-01

    Prevailing views on how we time the interception of a moving object assume that the visual inputs are informationally sufficient to estimate the time-to-contact from the object's kinematics. However, there are limitations in the visual system that raise questions about the general validity of these theories. Most notably, vision is poorly sensitive to arbitrary accelerations. How then does the brain deal with the motion of objects accelerated by Earth's gravity? Here we review evidence in favor of the view that the brain makes the best estimate about target motion based on visually measured kinematics and an a priori guess about the causes of motion. According to this theory, a predictive model is used to extrapolate time-to-contact from the expected kinetics in the Earth's gravitational field.
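
    To make the computational point concrete, a standard kinematic illustration (not an equation quoted from the review): for a target at distance d approaching along the direction of gravity with speed v, a purely first-order visual estimate of time-to-contact ignores acceleration, whereas an internal model of gravity g corrects it.

```latex
\text{first-order estimate:}\quad t_{c} \approx \frac{d}{v},
\qquad
\text{with gravity included:}\quad d = v\,t_{c} + \tfrac{1}{2} g\,t_{c}^{2}
\;\Rightarrow\;
t_{c} = \frac{-v + \sqrt{v^{2} + 2 g d}}{g}.
```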

  2. Using a Theory-Guided Learning Collaborative Model to Improve Implementation of EBPs in a State Children's Mental Health System: A Pilot Study.

    Science.gov (United States)

    Nadeem, Erum; Weiss, Dara; Olin, S Serene; Hoagwood, Kimberly E; Horwitz, Sarah M

    2016-11-01

    Learning collaboratives (LCs) are used widely to promote implementation of evidence-based practices. However, there has been limited research on the effectiveness of LCs and models vary widely in their structure, focus and components. The goal of the present study was to develop and field test a theory-based LC model to augment a state-led, evidence-based training program for clinicians providing mental health services to children. Analysis of implementation outcomes contrasted LC sites to matched comparison sites that participated in the clinical training program alone. Results suggested that clinicians from sites participating in the LC were more highly engaged in the state-led clinical training program and were more likely to complete program requirements.

  3. "Keeping on track"-Hospital nurses' struggles with maintaining workflow while seeking to integrate evidence-based practice into their daily work: A grounded theory study.

    Science.gov (United States)

    Renolen, Åste; Høye, Sevald; Hjälmhult, Esther; Danbolt, Lars Johan; Kirkevold, Marit

    2018-01-01

    Evidence-based practice is considered a foundation for the provision of quality care and one way to integrate scientific knowledge into clinical problem-solving. Despite the extensive amount of research that has been conducted to evaluate evidence-based practice implementation and research utilization, these practices have not been sufficiently incorporated into nursing practice. Thus, additional research regarding the challenges clinical nurses face when integrating evidence-based practice into their daily work and the manner in which these challenges are approached is needed. The aim of this study was to generate a theory about the general patterns of behaviour that are discovered when clinical nurses attempt to integrate evidence-based practice into their daily work. We used Glaser's classical grounded theory methodology to generate a substantive theory. The study was conducted in two different medical wards in a large Norwegian hospital. In one ward, nurses and nursing assistants were developing and implementing new evidence-based procedures, and in the other ward, evidence-based huddle boards for risk assessment were being implemented. A total of 54 registered nurses and 9 assistant nurses were observed during their patient care and daily activities. Of these individuals, thirteen registered nurses and five assistant nurses participated in focus groups. These participants were selected through theoretical sampling. Data were collected during 90 h of observation and 4 focus groups conducted from 2014 to 2015. Each focus group session included four to five participants and lasted between 55 and 65 min. Data collection and analysis were performed concurrently, and the data were analysed using the constant comparative method. "Keeping on track" emerged as an explanatory theory for the processes through which the nurses handled their main concern: the risk of losing the workflow. The following three strategies were used by nurses when attempting to integrate evidence

  4. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar... by the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum, where the fiber Hamiltonians at fixed total momentum are expressed in terms of the total momentum and the Segal field operator. The fiber Hamiltonians

  5. 2PI effective action for the SYK model and tensor field theories

    Science.gov (United States)

    Benedetti, Dario; Gurau, Razvan

    2018-05-01

    We discuss the two-particle irreducible (2PI) effective action for the SYK model and for tensor field theories. For the SYK model the 2PI effective action reproduces the bilocal reformulation of the model without using replicas. In general tensor field theories the 2PI formalism is the only way to obtain a bilocal reformulation of the theory, and as such is a precious instrument for the identification of soft modes and for possible holographic interpretations. We compute the 2PI action for several models, and push it up to fourth order in the 1/N expansion for the model proposed by Witten in [1], uncovering a one-loop structure in terms of an auxiliary bilocal action.

  6. Great Expectations: Is there Evidence for Predictive Coding in Auditory Cortex?

    Science.gov (United States)

    Heilbron, Micha; Chait, Maria

    2017-08-04

    Predictive coding is possibly one of the most influential, comprehensive, and controversial theories of neural function. While proponents praise its explanatory potential, critics object that key tenets of the theory are untested or even untestable. The present article critically examines existing evidence for predictive coding in the auditory modality. Specifically, we identify five key assumptions of the theory and evaluate each in the light of animal, human and modeling studies of auditory pattern processing. For the first two assumptions - that neural responses are shaped by expectations and that these expectations are hierarchically organized - animal and human studies provide compelling evidence. The anticipatory, predictive nature of these expectations also enjoys empirical support, especially from studies on unexpected stimulus omission. However, for the existence of separate error and prediction neurons, a key assumption of the theory, evidence is lacking. More work exists on the proposed oscillatory signatures of predictive coding, and on the relation between attention and precision. However, results on these latter two assumptions are mixed or contradictory. Looking to the future, more collaboration between human and animal studies, aided by model-based analyses, will be needed to test specific assumptions and implementations of predictive coding - and, as such, help determine whether this popular grand theory can fulfill its expectations. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  7. Testing multi-alternative decision models with non-stationary evidence.

    Science.gov (United States)

    Tsetsos, Konstantinos; Usher, Marius; McClelland, James L

    2011-01-01

    Recent research has investigated the process of integrating perceptual evidence toward a decision, converging on a number of sequential sampling choice models, such as variants of race and diffusion models and the non-linear leaky competing accumulator (LCA) model. Here we study extensions of these models to multi-alternative choice, considering how well they can account for data from a psychophysical experiment in which the evidence supporting each of the alternatives changes dynamically during the trial, in a way that creates temporal correlations. We find that participants exhibit a tendency to choose an alternative whose evidence profile is temporally anti-correlated with (or dissimilar from) that of other alternatives. This advantage of the anti-correlated alternative is well accounted for in the LCA, and provides constraints that challenge several other models of multi-alternative choice.
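
    A minimal sketch of the leaky competing accumulator dynamics referred to here may help; it follows the standard form due to Usher and McClelland, with illustrative parameter values and a simple threshold stopping rule, and it does not reproduce the time-varying, correlated evidence streams used in the experiment.

```python
import numpy as np

def lca_trial(inputs, leak=0.2, inhibition=0.2, noise=0.3,
              dt=0.01, threshold=1.0, max_steps=10_000, rng=None):
    """Leaky competing accumulator for multi-alternative choice:
    dx_i = dt * (I_i - leak * x_i - inhibition * sum_{j != i} x_j) + noise * sqrt(dt) * N(0, 1),
    with activations truncated at zero (the model's non-linearity)."""
    rng = rng if rng is not None else np.random.default_rng()
    inputs = np.asarray(inputs, dtype=float)
    x = np.zeros(len(inputs))
    for step in range(max_steps):
        lateral = inhibition * (x.sum() - x)           # inhibition from the other accumulators
        dx = dt * (inputs - leak * x - lateral)
        dx += noise * np.sqrt(dt) * rng.standard_normal(len(x))
        x = np.maximum(x + dx, 0.0)
        if x.max() >= threshold:
            return int(np.argmax(x)), (step + 1) * dt  # chosen alternative and decision time
    return int(np.argmax(x)), max_steps * dt

choice, rt = lca_trial([1.0, 0.9, 0.8])
print(choice, rt)
```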

  8. Using circuit theory to model connectivity in ecology, evolution, and conservation.

    Science.gov (United States)

    McRae, Brad H; Dickson, Brett G; Keitt, Timothy H; Shah, Viral B

    2008-10-01

    Connectivity among populations and habitats is important for a wide range of ecological processes. Understanding, preserving, and restoring connectivity in complex landscapes requires connectivity models and metrics that are reliable, efficient, and process based. We introduce a new class of ecological connectivity models based in electrical circuit theory. Although they have been applied in other disciplines, circuit-theoretic connectivity models are new to ecology. They offer distinct advantages over common analytic connectivity models, including a theoretical basis in random walk theory and an ability to evaluate contributions of multiple dispersal pathways. Resistance, current, and voltage calculated across graphs or raster grids can be related to ecological processes (such as individual movement and gene flow) that occur across large population networks or landscapes. Efficient algorithms can quickly solve networks with millions of nodes, or landscapes with millions of raster cells. Here we review basic circuit theory, discuss relationships between circuit and random walk theories, and describe applications in ecology, evolution, and conservation. We provide examples of how circuit models can be used to predict movement patterns and fates of random walkers in complex landscapes and to identify important habitat patches and movement corridors for conservation planning.
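
    The core computation behind circuit-theoretic connectivity can be sketched with a graph Laplacian: effective resistance between two habitat nodes follows from the Laplacian pseudoinverse (a standard identity). The toy landscape below is made up for illustration and is not from the paper.

```python
import numpy as np

def effective_resistance(conductances, i, j):
    """Effective resistance between nodes i and j of a resistor network.
    `conductances` is a symmetric matrix of pairwise conductances
    (1 / resistance); zero means no direct connection."""
    G = np.asarray(conductances, dtype=float)
    L = np.diag(G.sum(axis=1)) - G        # weighted graph Laplacian
    L_pinv = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse
    return L_pinv[i, i] + L_pinv[j, j] - 2.0 * L_pinv[i, j]

# Toy landscape: 4 patches, two parallel paths of two unit-resistance links
# between patch 0 and patch 3, so the effective resistance should be 1.0
G = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
print(effective_resistance(G, 0, 3))
```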

  9. Massive IIA string theory and Matrix theory compactification

    International Nuclear Information System (INIS)

    Lowe, David A.; Nastase, Horatiu; Ramgoolam, Sanjaye

    2003-01-01

    We propose a Matrix theory approach to Romans' massive Type IIA supergravity. It is obtained by applying the procedure of Matrix theory compactifications to Hull's proposal of the massive Type IIA string theory as M-theory on a twisted torus. The resulting Matrix theory is a super-Yang-Mills theory on large N three-branes with a space-dependent noncommutativity parameter, which is also independently derived by a T-duality approach. We give evidence showing that the energies of a class of physical excitations of the super-Yang-Mills theory show the correct symmetry expected from massive Type IIA string theory in a lightcone quantization

  10. A Membrane Model from Implicit Elasticity Theory

    Science.gov (United States)

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079

  11. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory - such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio - can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
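
    For readers who want the most basic of the quantities named above spelled out, the standard definitions of Shannon entropy and mutual information (not specific to this article) are:

```latex
H(X) = -\sum_{x} p(x)\, \log_{2} p(x), \qquad
I(X;Y) = \sum_{x,y} p(x,y)\, \log_{2}\frac{p(x,y)}{p(x)\, p(y)} = H(X) - H(X \mid Y).
```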

  12. Prospect Theory for Online Financial Trading

    Science.gov (United States)

    Liu, Yang-Yu; Nacher, Jose C.; Ochiai, Tomoshiro; Martino, Mauro; Altshuler, Yaniv

    2014-03-01

    Prospect theory is widely viewed as the best available descriptive model of how people evaluate risk in experimental settings. According to prospect theory, people make decisions based on the potential value of losses and gains rather than the final outcome. People are risk-averse with respect to gains and risk-seeking with respect to losses, a phenomenon called "loss aversion". Although prospect theory has been well studied in behavioral economics at the theoretical level, empirical research remains scarce, and most of it has been undertaken with micro-panel data. Here we analyze the trading activities of over 1.5 million members of an online financial trading community over 28 months, aiming to explore the large-scale empirical aspect of prospect theory. By analyzing and comparing the behaviour of "winners" and "losers", i.e., traders with positive or negative final net profit, we find clear evidence of the loss aversion phenomenon, a core element of prospect theory. This work provides unprecedented large-scale empirical evidence for prospect theory. It has immediate implications for financial trading, e.g., developing new trading strategies by minimizing the effect of loss aversion. It also provides an opportunity to augment online social trading, where users are allowed to watch and follow the trading activity of others, by predicting potential winners based on their historical trading behaviour.
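
    The loss-aversion asymmetry the authors test can be stated with the standard Kahneman-Tversky value function; the form and the commonly cited parameter values below are textbook material, not estimates from this study.

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0,\\
-\lambda\, (-x)^{\beta}, & x < 0,
\end{cases}
\qquad \text{with, typically, } \alpha \approx \beta \approx 0.88, \; \lambda \approx 2.25 .
```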

  13. A Modified Method for Evaluating Sustainable Transport Solutions Based on AHP and Dempster–Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2018-04-01

    With the challenges facing the transport environment, a large amount of attention is paid to sustainable mobility worldwide, which raises the problem of evaluating sustainable transport solutions. In this paper, a modified method based on the analytic hierarchy process (AHP) and Dempster–Shafer evidence theory (D-S theory) is proposed for evaluating the impact of transport measures on city sustainability. AHP is adapted to determine the weights of the sustainability criteria while D-S theory is used for data fusion of the sustainability assessment. A Transport Sustainability Index (TSI) is presented as a primary measure to determine whether transport solutions have a positive impact on city sustainability. A case study of car-sharing is illustrated to show the efficiency of our proposed method. Our modified method has two desirable properties. One is that the basic probability assignment (BPA) is generated with a new modification framework of evaluation levels, which can flexibly manage uncertain information. The other is that the modified method has excellent performance in sensitivity analysis.
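
    The fusion step at the heart of such a method is Dempster's rule of combination. The sketch below combines two basic probability assignments over a frame of discernment; the focal elements and masses are made up for illustration, and the paper's AHP weighting and BPA-modification framework are not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.
    Each BPA maps frozenset focal elements to masses summing to 1."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                 # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

# Illustrative masses over the frame {good, fair, poor}
m1 = {frozenset({"good"}): 0.6, frozenset({"good", "fair"}): 0.3, frozenset({"good", "fair", "poor"}): 0.1}
m2 = {frozenset({"fair"}): 0.5, frozenset({"good"}): 0.4, frozenset({"good", "fair", "poor"}): 0.1}
print(dempster_combine(m1, m2))
```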

  14. Collective learning modeling based on the kinetic theory of active particles.

    Science.gov (United States)

    Burini, D; De Lillo, S; Gibelli, L

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Small numbers in supersymmetric theories of nature

    International Nuclear Information System (INIS)

    Graesser, Michael L.

    1999-01-01

    The Standard Model of particle interactions is a successful theory for describing the interactions of quarks, leptons and gauge bosons at microscopic distance scales. Despite these successes, the theory contains many unsatisfactory features. The origin of particle masses is a central mystery that has eluded experimental elucidation. In the Standard Model the known particles obtain their mass from the condensate of the so-called Higgs particle. Quantum corrections to the Higgs mass require an unnatural fine tuning in the Higgs mass of one part in 10^{-32} to obtain the correct mass scale of electroweak physics. In addition, the origin of the vast hierarchy between the mass scales of the electroweak and quantum gravity physics is not explained in the current theory. Supersymmetric extensions to the Standard Model are not plagued by this fine tuning issue and may therefore be relevant in Nature. In the minimal supersymmetric Standard Model there is also a natural explanation for electroweak symmetry breaking. Supersymmetric Grand Unified Theories also correctly predict a parameter of the Standard Model. This provides non-trivial indirect evidence for these theories. The most general supersymmetric extension to the Standard Model, however, is excluded by many physical processes, such as rare flavor changing processes, and the non-observation of the instability of the proton. These processes provide important information about the possible structure of such a theory. In particular, certain parameters in this theory must be rather small. A physics explanation for why this is the case would be desirable. It is striking that the gauge couplings of the Standard Model unify if there is supersymmetry close to the weak scale. This suggests that at high energies Nature is described by a supersymmetric Grand Unified Theory. But the mass scale of unification must be introduced into the theory since it does not coincide with the probable mass scale of strong quantum gravity. The subject

  16. A Practise-based Theory of SEIDET Smart Community Centre Model

    CSIR Research Space (South Africa)

    Phahlamohlaka, J

    2015-11-01

    … as it is designed using international studies and theories. This paper presents the design of the smart community centre model. The design is described using Practice Theory concepts towards an empirical study that will be conducted using the General...

  17. Flipped classroom model for learning evidence-based medicine

    Directory of Open Access Journals (Sweden)

    Rucker SY

    2017-08-01

    Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute an improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice. Keywords: evidence-based medicine, flipped classroom, residency education

  18. A Novel Evidence Theory and Fuzzy Preference Approach-Based Multi-Sensor Data Fusion Technique for Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2017-10-01

    The multi-sensor data fusion technique plays a significant role in fault diagnosis and in a variety of related applications, and the Dempster–Shafer evidence theory is employed to improve system performance; however, it may generate a counter-intuitive result when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach on the basis of the distance of evidence, belief entropy and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the conflict degree among the pieces of evidence; thus, the support degree can be obtained to represent the reliability of the evidence. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on the quantitative uncertainty measured above, the fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted by taking advantage of the relative credibility preference of the evidence that can be utilized to generate an appropriate weight with respect to each piece of evidence. Finally, the modified weights of the evidence are adopted to adjust the bodies of the evidence in advance of utilizing Dempster's combination rule. A numerical example and a practical application in fault diagnosis are used as illustrations to demonstrate that the proposal is reasonable and efficient in the management of conflict and fault diagnosis.
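
    The conflict measure that such an approach starts from can be illustrated with one widely used evidence distance, due to Jousselme et al.; the sketch below computes it for two made-up bodies of evidence. The belief-entropy and fuzzy-preference weighting steps of the paper are not reproduced, and the hypothesis names and masses are illustrative.

```python
import numpy as np

def jousselme_distance(m1, m2):
    """Distance between two bodies of evidence (Jousselme et al.):
    d = sqrt(0.5 * (m1 - m2)^T D (m1 - m2)), with D[A, B] = |A ∩ B| / |A ∪ B|.
    Each BPA maps non-empty frozenset focal elements to masses."""
    focals = sorted(set(m1) | set(m2), key=sorted)
    v1 = np.array([m1.get(f, 0.0) for f in focals])
    v2 = np.array([m2.get(f, 0.0) for f in focals])
    D = np.array([[len(a & b) / len(a | b) for b in focals] for a in focals])
    diff = v1 - v2
    return float(np.sqrt(0.5 * diff @ D @ diff))

# Two largely agreeing sensors over the fault hypotheses {F1, F2, F3}
m_a = {frozenset({"F1"}): 0.7, frozenset({"F1", "F2"}): 0.2, frozenset({"F1", "F2", "F3"}): 0.1}
m_b = {frozenset({"F1"}): 0.6, frozenset({"F2"}): 0.3, frozenset({"F1", "F2", "F3"}): 0.1}
print(jousselme_distance(m_a, m_b))
```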

  19. A system-theory-based model for monthly river runoff forecasting: model calibration and optimization

    Directory of Open Access Journals (Sweden)

    Wu Jianhua

    2014-03-01

    River runoff is not only a crucial part of the global water cycle, but it is also an important source for hydropower and an essential element of water balance. This study presents a system-theory-based model for river runoff forecasting, taking the Hailiutu River as a case study. The forecasting model, designed for the Hailiutu watershed, was calibrated and verified by long-term precipitation observation data and groundwater exploitation data from the study area. Additionally, frequency analysis, taken as an optimization technique, was applied to improve prediction accuracy. Following model optimization, the overall relative prediction errors are below 10%. The system-theory-based prediction model is applicable to river runoff forecasting, and following optimization by frequency analysis, the prediction error is acceptable.

  20. Perturbation theory instead of large scale shell model calculations

    International Nuclear Information System (INIS)

    Feldmeier, H.; Mankos, P.

    1977-01-01

    Results of large-scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory, which provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation theory treatment in an SU(3) basis including 2ħω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.)

  1. Effective-field-theory model for the fractional quantum Hall effect

    International Nuclear Information System (INIS)

    Zhang, S.C.; Hansson, T.H.; Kivelson, S.

    1989-01-01

    Starting directly from the microscopic Hamiltonian, we derive a field-theory model for the fractional quantum Hall effect. By considering an approximate coarse-grained version of the same model, we construct a Landau-Ginzburg theory similar to that of Girvin. The partition function of the model exhibits cusps as a function of density and the Hall conductance is quantized at filling factors ν = 1/(2k-1) with k an arbitrary integer. At these fractions the ground state is incompressible, and the quasiparticles and quasiholes have fractional charge and obey fractional statistics. Finally, we show that the collective density fluctuations are massive.

  2. Models for Theory-Based M.A. and Ph.D. Programs.

    Science.gov (United States)

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  3. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  4. Informed Systems: Enabling Collaborative Evidence Based Organizational Learning

    Directory of Open Access Journals (Sweden)

    Mary M. Somerville

    2015-12-01

    Objective – In response to unrelenting disruptions in academic publishing and higher education ecosystems, the Informed Systems approach supports evidence based professional activities to make decisions and take actions. This conceptual paper presents two core models, Informed Systems Leadership Model and Collaborative Evidence-Based Information Process Model, whereby co-workers learn to make informed decisions by identifying the decisions to be made and the information required for those decisions. This is accomplished through collaborative design and iterative evaluation of workplace systems, relationships, and practices. Over time, increasingly effective and efficient structures and processes for using information to learn further organizational renewal and advance nimble responsiveness amidst dynamically changing circumstances. Methods – The integrated Informed Systems approach to fostering persistent workplace inquiry has its genesis in three theories that together activate and enable robust information usage and organizational learning. The information- and learning-intensive theories of Peter Checkland in England, which advance systems design, stimulate participants’ appreciation during the design process of the potential for using information to learn. Within a co-designed environment, intentional social practices continue workplace learning, described by Christine Bruce in Australia as informed learning enacted through information experiences. In addition, in Japan, Ikujiro Nonaka’s theories foster information exchange processes and knowledge creation activities within and across organizational units. In combination, these theories promote the kind of learning made possible through evolving and transferable capacity to use information to learn through design and usage of collaborative communication systems with associated professional practices. Informed Systems therein draws from three antecedent theories to create an original

  5. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the author's proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  6. Theory to practice: the humanbecoming leading-following model.

    Science.gov (United States)

    Ursel, Karen L

    2015-01-01

    Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory to practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.

  7. Agent-based models for higher-order theory of mind

    NARCIS (Netherlands)

    de Weerd, Harmen; Verbrugge, Rineke; Verheij, Bart; Kamiński, Bogumił; Koloch, Grzegorz

    2014-01-01

    Agent-based models are a powerful tool for explaining the emergence of social phenomena in a society. In such models, individual agents typically have little cognitive ability. In this paper, we model agents with the cognitive ability to make use of theory of mind. People use this ability to reason

  8. A critical assessment of theories/models used in health communication for HIV/AIDS.

    Science.gov (United States)

    Airhihenbuwa, C O; Obregon, R

    2000-01-01

    Most theories and models used to develop human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) communication are based on social psychology that emphasizes individualism. Researchers including communication and health scholars are now questioning the presumed global relevance of these models and thus the need to develop innovative theories and models that take into account regional contexts. In this paper, we discuss the commonly used theories and models in HIV/AIDS communication. Furthermore, we argue that the flaws in the application of the commonly used "classical" models in health communication are because of contextual differences in locations where these models are applied. That is to say that these theories and models are being applied in contexts for which they were not designed. For example, the differences in health behaviors are often the function of culture. Therefore, culture should be viewed for its strength and not always as a barrier. The metaphorical coupling of "culture" and "barrier" needs to be exposed, deconstructed, and reconstructed so that new, positive, cultural linkages can be forged. The HIV/AIDS pandemic has served as a flashpoint to either highlight the importance or deny the relevance of theories and models while at the same time addressing the importance of culture in the development and implementation of communication programs.

  9. Scaling theory of depinning in the Sneppen model

    International Nuclear Information System (INIS)

    Maslov, S.; Paczuski, M.

    1994-01-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a ''gap'' equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥(d) and ν_⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor
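
    As a generic orientation only (an illustrative anisotropic-scaling ansatz consistent with the abstract, not the paper's actual derivation), the two exponents can be read as governing correlation-length divergences as the self-organizing gap G approaches the depinning threshold f_c, with the interface width W then following from their ratio:

        \xi_\parallel \sim (f_c - G)^{-\nu_\parallel(d)}, \qquad
        \xi_\perp \sim (f_c - G)^{-\nu_\perp(d)}, \qquad
        W \sim \xi_\parallel^{\chi}, \quad \chi = \nu_\perp/\nu_\parallel .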

  10. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  11. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions

    Directory of Open Access Journals (Sweden)

    Gifford W

    2017-03-01

    Full Text Available Wendy Gifford,1 Ian D Graham,2,3 Mark G Ehrhart,4 Barbara L Davies,5,6 Gregory A Aarons7 1School of Nursing, Faculty of Health Sciences, University of Ottawa, ON, Canada; 2Centre for Practice-Changing Research, Ottawa Hospital Research Institute, 3School of Epidemiology, Public Health and Preventive Medicine, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada; 4Department of Psychology, San Diego State University, San Diego, CA, USA; 5Nursing Best Practice Research Center, University of Ottawa, Ottawa, ON, Canada; 6Department of Psychiatry, University of California, San Diego, La Jolla, CA, USA; 7Child and Adolescent Services Research Center, University of California, San Diego, CA, USA Purpose: Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Methods: Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. Results: All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. Conclusion: The O

  12. A practitioner's guide to persuasion: an overview of 15 selected persuasion theories, models and frameworks.

    Science.gov (United States)

    Cameron, Kenzie A

    2009-03-01

    To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.

  13. The heuristic-analytic theory of reasoning: extension and evaluation.

    Science.gov (United States)

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  14. Prospects for advanced RF theory and modeling

    International Nuclear Information System (INIS)

    Batchelor, D. B.

    1999-01-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics

  15. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    International Nuclear Information System (INIS)

    Saraswati, Teguh Endah; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory. (paper)

  16. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    Science.gov (United States)

    Endah Saraswati, Teguh; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory.

  17. Phase Structure Of Fuzzy Field Theories And Multitrace Matrix Models

    International Nuclear Information System (INIS)

    Tekel, J.

    2015-01-01

    We review the interplay of fuzzy field theories and matrix models, with an emphasis on the phase structure of fuzzy scalar field theories. We give a self-contained introduction to these topics and give the details concerning the saddle point approach for the usual single-trace and multitrace matrix models. We then review the attempts to explain the phase structure of the fuzzy field theory using a corresponding random matrix ensemble, showing the strengths and weaknesses of this approach. We conclude with a list of challenges one needs to overcome and the most interesting open problems one can try to solve. (author)

  18. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  19. Improving Accuracy of Dempster-Shafer Theory Based Anomaly Detection Systems

    Directory of Open Access Journals (Sweden)

    Ling Zou

    2014-07-01

    Full Text Available While the Dempster-Shafer theory of evidence has been widely used in anomaly detection, it has some issues. The Dempster-Shafer theory of evidence trusts all pieces of evidence equally, which does not hold in distributed-sensor anomaly detection systems (ADS). Moreover, pieces of evidence are sometimes dependent on each other, which can lead to false alerts. We propose improvements that incorporate two algorithms. A feature selection algorithm employs Gaussian Graphical Models to discover correlations between candidate features. A group of suitable ADS is then selected to perform detection, and the detection results are sent to the fusion engine. In the weight estimation algorithm, information gain is applied to set a weight for every feature. A weighted Dempster-Shafer combination of the detection results then achieves better accuracy. We evaluate our detection prototype through a set of experiments conducted with the standard benchmark Wisconsin Breast Cancer Dataset and real Internet traffic. Evaluations on the Wisconsin Breast Cancer Dataset show that our prototype can find the correlation among nine features and improve the detection rate without affecting the false positive rate. Evaluations on Internet traffic show that the weight estimation algorithm can improve the detection performance significantly.
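
    For orientation, a minimal sketch of weighted evidence fusion in the spirit described above; the reliability weighting here uses classical Shafer discounting, and the paper's information-gain weights, Gaussian Graphical Model feature selection, and datasets are not reproduced:

        def discount(mass, alpha, frame):
            """Shafer discounting: scale masses by reliability weight alpha and
            move the remaining belief to the whole frame (ignorance)."""
            out = {A: alpha * m for A, m in mass.items()}
            out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
            return out

        def combine(m1, m2):
            """Dempster's rule of combination for mass functions given as
            {frozenset: mass} dictionaries; conflicting mass is renormalized away."""
            joint, conflict = {}, 0.0
            for A, a in m1.items():
                for B, b in m2.items():
                    C = A & B
                    if C:
                        joint[C] = joint.get(C, 0.0) + a * b
                    else:
                        conflict += a * b
            if conflict >= 1.0:
                raise ValueError("total conflict; evidence cannot be combined")
            return {C: v / (1.0 - conflict) for C, v in joint.items()}

        frame = frozenset({"normal", "attack"})
        m_detector1 = {frozenset({"attack"}): 0.7, frame: 0.3}   # hypothetical detector outputs
        m_detector2 = {frozenset({"normal"}): 0.4, frame: 0.6}
        fused = combine(discount(m_detector1, 0.9, frame),
                        discount(m_detector2, 0.6, frame))
        print(fused)

    Down-weighting a less trusted detector shifts part of its mass to the whole frame before Dempster's rule renormalizes the conflicting mass.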

  20. Johnson-Laird's mental models theory and its principles: an application with cell mental models of high school students

    OpenAIRE

    Mª Luz Rodríguez Palmero; Javier Marrero Acosta; Marco Antonio Moreira

    2001-01-01

    Following a discussion of Johnson-Laird's mental models theory, we report a study regarding high school students' mental representations of the cell, understood as mental models. Research findings suggest the appropriateness of such a theory as a framework to interpret students' representations.

  1. Causal Agency Theory: Reconceptualizing a Functional Model of Self-Determination

    Science.gov (United States)

    Shogren, Karrie A.; Wehmeyer, Michael L.; Palmer, Susan B.; Forber-Pratt, Anjali J.; Little, Todd J.; Lopez, Shane

    2015-01-01

    This paper introduces Causal Agency Theory, an extension of the functional model of self-determination. Causal Agency Theory addresses the need for interventions and assessments pertaining to self-determination for all students and incorporates the significant advances in understanding of disability and in the field of positive psychology since the…

  2. A general-model-space diagrammatic perturbation theory

    International Nuclear Information System (INIS)

    Hose, G.; Kaldor, U.

    1980-01-01

    A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He₂ excited Σ_g^+ states) is demonstrated. (Auth.)

  3. Plane symmetric cosmological micro model in modified theory of Einstein’s general relativity

    Directory of Open Access Journals (Sweden)

    Panigrahi U.K.

    2003-01-01

    Full Text Available In this paper, we have investigated an anisotropic homogeneous plane symmetric cosmological micro-model in the presence of a massless scalar field in a modified theory of Einstein's general relativity. Some interesting physical and geometrical aspects of the model, together with the singularity in the model, are discussed. Further, it is shown that this theory is valid and leads to Einstein's theory as the coupling parameter λ → 0 at the micro (i.e., quantum) level in general.

  4. Location Decisions of U.S. Polluting Plants. Theory, Empirical Evidence, and Consequences

    International Nuclear Information System (INIS)

    Shadbegian, R.; Wolverton, A.

    2010-01-01

    Economists have long been interested in explaining the spatial distribution of economic activity, focusing on what factors motivate profit-maximizing firms when they choose to open a new plant or expand an existing facility. We begin our paper with a general discussion of the theory of plant location, including the role of taxes and agglomeration economies. However, our paper focuses on the theory, evidence, and implications of the role of environmental regulations in plant location decisions. On its face, environmental regulation would not necessarily be expected to alter location decisions, since we would expect Federal regulation to affect all locations in the United States essentially equally. It turns out, however, that this is not always the case as some geographic areas are subject to greater stringency. Another source of variation is differences across states in the way they implement and enforce compliance with Federal regulation. In light of these spatial differences in the costs of complying with environmental regulations, we discuss three main questions in this survey: Do environmental regulations affect the location decisions of polluting plants? Do states compete for polluting plants through differences in environmental regulation? And, do firms locate polluting plants disproportionately near poor and minority neighborhoods?.

  5. Location Decisions of U.S. Polluting Plants. Theory, Empirical Evidence, and Consequences

    Energy Technology Data Exchange (ETDEWEB)

    Shadbegian, R.; Wolverton, A.

    2010-06-15

    Economists have long been interested in explaining the spatial distribution of economic activity, focusing on what factors motivate profit-maximizing firms when they choose to open a new plant or expand an existing facility. We begin our paper with a general discussion of the theory of plant location, including the role of taxes and agglomeration economies. However, our paper focuses on the theory, evidence, and implications of the role of environmental regulations in plant location decisions. On its face, environmental regulation would not necessarily be expected to alter location decisions, since we would expect Federal regulation to affect all locations in the United States essentially equally. It turns out, however, that this is not always the case as some geographic areas are subject to greater stringency. Another source of variation is differences across states in the way they implement and enforce compliance with Federal regulation. In light of these spatial differences in the costs of complying with environmental regulations, we discuss three main questions in this survey: Do environmental regulations affect the location decisions of polluting plants? Do states compete for polluting plants through differences in environmental regulation? And, do firms locate polluting plants disproportionately near poor and minority neighborhoods?.

  6. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers. Due to a lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated considering the Nash Equilibrium by solving the route choice game. Simulations and experimental analysis show that the proposed model can describe commuters' routine route choice decisions exactly and that the provided route is reliable.
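
    As a generic illustration of the equilibrium step only (the two-route payoff table below is hypothetical; the paper's construction of payoffs from prediction precision and familiarity is not reproduced), a pure-strategy Nash equilibrium of a small bimatrix game can be found by enumeration:

        def pure_nash(payoff_a, payoff_b):
            """Return all pure-strategy Nash equilibria (i, j) of a bimatrix game,
            where payoff_a[i][j] / payoff_b[i][j] are the two players' payoffs."""
            rows, cols = len(payoff_a), len(payoff_a[0])
            equilibria = []
            for i in range(rows):
                for j in range(cols):
                    best_a = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
                    best_b = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
                    if best_a and best_b:
                        equilibria.append((i, j))
            return equilibria

        # Hypothetical payoffs for two candidate routes under two information settings
        A = [[3, 1], [2, 2]]
        B = [[2, 2], [1, 3]]
        print(pure_nash(A, B))  # -> [(0, 0), (1, 1)]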

  7. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  8. Thirty Years of Prospect Theory in Economics: A Review and Assessment

    OpenAIRE

    Nicholas C. Barberis

    2013-01-01

    In 1979, Daniel Kahneman and Amos Tversky published a paper in Econometrica titled "Prospect Theory: An Analysis of Decision under Risk." The paper presented a new model of risk attitudes called "prospect theory," which elegantly captured the experimental evidence on risk taking, including the documented violations of expected utility. More than 30 years later, prospect theory is still widely viewed as the best available description of how people evaluate risk in experimental settings. Howev...

  9. Evidence-based hypnotherapy for depression.

    Science.gov (United States)

    Alladin, Assen

    2010-04-01

    Cognitive hypnotherapy (CH) is a comprehensive evidence-based hypnotherapy for clinical depression. This article describes the major components of CH, which integrate hypnosis with cognitive-behavior therapy as the latter provides an effective host theory for the assimilation of empirically supported treatment techniques derived from various theoretical models of psychotherapy and psychopathology. CH meets criteria for an assimilative model of psychotherapy, which is considered to be an efficacious model of psychotherapy integration. The major components of CH for depression are described in sufficient detail to allow replication, verification, and validation of the techniques delineated. CH for depression provides a template that clinicians and investigators can utilize to study the additive effects of hypnosis in the management of other psychological or medical disorders. Evidence-based hypnotherapy and research are encouraged; such a movement is necessary if clinical hypnosis is to integrate into mainstream psychotherapy.

  10. Quantum integrable models of field theory

    International Nuclear Information System (INIS)

    Faddeev, L.D.

    1979-01-01

    Fundamental features of the classical method of the inverse problem have been formulated in a form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are examined briefly. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be placed, to an extent, within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives for applying the inverse problem method are shown

  11. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.

  12. A conceptual framework for organismal biology: linking theories, models, and data.

    Science.gov (United States)

    Zamer, William E; Scheiner, Samuel M

    2014-11-01

    Implicit or subconscious theory is especially common in the biological sciences. Yet, theory plays a variety of roles in scientific inquiry. First and foremost, it determines what does and does not count as a valid or interesting question or line of inquiry. Second, theory determines the background assumptions within which inquiries are pursued. Third, theory provides linkages among disciplines. For these reasons, it is important and useful to develop explicit theories for biology. A general theory of organisms is developed, which includes 10 fundamental principles that apply to all organisms, and 6 that apply to multicellular organisms only. The value of a general theory comes from its utility to help guide the development of more specific theories and models. That process is demonstrated by examining two domains: ecoimmunology and development. For the former, a constitutive theory of ecoimmunology is presented, and used to develop a specific model that explains energetic trade-offs that may result from an immunological response of a host to a pathogen. For the latter, some of the issues involved in trying to devise a constitutive theory that covers all of development are explored, and a more narrow theory of phenotypic novelty is presented. By its very nature, little of a theory of organisms will be new. Rather, the theory presented here is a formal expression of nearly two centuries of conceptual advances and practice in research. Any theory is dynamic and subject to debate and change. Such debate will occur as part of the present, initial formulation, as the ideas presented here are refined. The very process of debating the form of the theory acts to clarify thinking. The overarching goal is to stimulate debate about the role of theory in the study of organisms, and thereby advance our understanding of them. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology 2014. This work is written by US Government employees

  13. Group theory for unified model building

    International Nuclear Information System (INIS)

    Slansky, R.

    1981-01-01

    The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w × U(1)_w × SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E_6, SO_10, SU_6, F_4, SO_9, SO_5, SO_8, SO_7, SU_4, E_7, E_8, SU_8, SO_14, SO_18, SO_22, and, for completeness, SU_3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)

  14. Large N field theories, string theory and gravity

    Energy Technology Data Exchange (ETDEWEB)

    Maldacena, J [Lyman Laboratory of Physics, Harvard University, Cambridge (United States)

    2002-05-15

    We describe the holographic correspondence between field theories and string/M theory, focusing on the relation between compactifications of string/M theory on Anti-de Sitter spaces and conformal field theories. We review the background for this correspondence and discuss its motivations and the evidence for its correctness. We describe the main results that have been derived from the correspondence in the regime that the field theory is approximated by classical or semiclassical gravity. We focus on the case of the N = 4 supersymmetric gauge theory in four dimensions. These lecture notes are based on the Review written by O. Aharony, S. Gubser, J. Maldacena, H. Ooguri and Y. Oz. (author)

  15. δ expansion for local gauge theories. I. A one-dimensional model

    International Nuclear Information System (INIS)

    Bender, C.M.; Cooper, F.; Milton, K.A.; Moshe, M.; Pinsky, S.S.; Simmons, L.M. Jr.

    1992-01-01

    The principles of the δ perturbation theory were first proposed in the context of self-interacting scalar quantum field theory. There it was shown how to expand a (φ²)^(1+δ) theory as a series in powers of δ and how to recover nonperturbative information about a φ⁴ field theory from the δ expansion at δ=1. The purpose of this series of papers is to extend the notions of δ perturbation theory from boson theories to theories having a local gauge symmetry. In the case of quantum electrodynamics one introduces the parameter δ by generalizing the minimal coupling terms to ψ̄(∂ − ieA)^δ ψ and expanding in powers of δ. This interaction preserves local gauge invariance for all δ. While there are enormous benefits in using the δ expansion (obtaining nonperturbative results), gauge theories present new technical difficulties not encountered in self-interacting boson theories because the expression (∂ − ieA)^δ contains a derivative operator. In the first paper of this series a one-dimensional model whose interaction term has the form ψ̄[d/dt − igφ(t)]^δ ψ is considered. The virtue of this model is that it provides a laboratory in which to study fractional powers of derivative operators without the added complexity of γ matrices. In the next paper of this series we consider two-dimensional electrodynamics and show how to calculate the anomaly in the δ expansion

  16. Intervention mapping: a process for developing theory- and evidence-based health education programs.

    Science.gov (United States)

    Bartholomew, L K; Parcel, G S; Kok, G

    1998-10-01

    The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.

  17. Adapting evidence-based interventions using a common theory, practices, and principles.

    Science.gov (United States)

    Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

    2014-01-01

    Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.

  18. Matrix models from localization of five-dimensional supersymmetric noncommutative U(1) gauge theory

    International Nuclear Information System (INIS)

    Lee, Bum-Hoon; Ro, Daeho; Yang, Hyun Seok

    2017-01-01

    We study localization of five-dimensional supersymmetric U(1) gauge theory on S³ × ℝ²_θ, where ℝ²_θ is a noncommutative (NC) plane. The theory can be isomorphically mapped to three-dimensional supersymmetric U(N→∞) gauge theory on S³ using the matrix representation on a separable Hilbert space on which NC fields linearly act. Therefore the NC space ℝ²_θ allows for a flexible path to derive matrix models via localization from a higher-dimensional supersymmetric NC U(1) gauge theory. The result shows a rich duality between NC U(1) gauge theories and large N matrix models in various dimensions.

  19. Flipped classroom model for learning evidence-based medicine.

    Science.gov (United States)

    Rucker, Sydney Y; Ozdogan, Zulfukar; Al Achkar, Morhaf

    2017-01-01

    Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice.

  20. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals to some extent. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed for local conditions with the aim of reducing heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After applying the forewarning model and verifying it against actual and simulated data on water pollution risk from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, with reasonable results and a simple structure, and that it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
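
    A minimal sketch of the prior-to-posterior update and the maximum-probability rule described above (the warning levels, prior, likelihoods, and indicator values below are hypothetical placeholders, not the study's calibrated values):

        def posterior(prior, likelihood, observation):
            """Discrete Bayes update: prior over warning levels, likelihoods of
            the observed indicator value under each level."""
            unnorm = {level: prior[level] * likelihood[level][observation]
                      for level in prior}
            total = sum(unnorm.values())
            return {level: p / total for level, p in unnorm.items()}

        prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}           # hypothetical
        likelihood = {                                                    # hypothetical
            "light":    {"high_pollutant": 0.1, "low_pollutant": 0.9},
            "moderate": {"high_pollutant": 0.4, "low_pollutant": 0.6},
            "severe":   {"high_pollutant": 0.8, "low_pollutant": 0.2},
        }
        post = posterior(prior, likelihood, "high_pollutant")
        warning = max(post, key=post.get)   # maximum-probability rule
        print(post, warning)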

  1. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…
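
    One common way to write such a model (an illustrative two-parameter formulation; the article's exact specification may differ) is, for person j in group g answering item i that measures first-order trait p:

        P(Y_{ijp}=1 \mid \theta_{jp}) = \frac{\exp[a_{ip}(\theta_{jp}-b_{ip})]}{1+\exp[a_{ip}(\theta_{jp}-b_{ip})]},
        \qquad \theta_{jp} = \lambda_p\,\theta_j + \varepsilon_{jp},
        \qquad \theta_j = \beta_{g(j)} + \zeta_j,

    where the first-order traits θ_{jp} load on a higher-order trait θ_j, and the person-level trait is decomposed into a group mean β_{g(j)} plus an individual deviation, capturing the multilevel sampling.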

  2. Conformal field theories, Coulomb gas picture and integrable models

    International Nuclear Information System (INIS)

    Zuber, J.B.

    1988-01-01

    The aim of the study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, and modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear, and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: all the various restrictions have not been identified, nor the modular invariants completely classified

  3. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent; it needs to consider interaction history, recommendations, user behaviors, and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness, and recommendation credibility) to get a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective and that the free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
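
    A minimal sketch of aggregating the three named aspects into a trust degree, with a toy punishment step for free-riders (the linear weighting, the weight values, and the penalty are illustrative assumptions, not the paper's formulas):

        def trust_degree(service_reliability, feedback_effectiveness,
                         recommendation_credibility, weights=(0.4, 0.3, 0.3)):
            """Aggregate the three trust aspects named in the abstract into one score;
            the linear form and the weights are illustrative only."""
            components = (service_reliability, feedback_effectiveness,
                          recommendation_credibility)
            return sum(w * c for w, c in zip(weights, components))

        def punish(trust, defected, penalty=0.2):
            """Toy punishment step for free-riders: reduce trust after a defection."""
            return max(0.0, trust - penalty) if defected else trust

        t = trust_degree(0.9, 0.7, 0.6)
        print(t, punish(t, defected=True))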

  4. A review of the evidence regarding associations between attachment theory and experimentally induced pain.

    Science.gov (United States)

    Meredith, Pamela Joy

    2013-04-01

    Theoretical and empirical evidence suggests that adult attachment and pain-related variables are predictably and consistently linked, and that understanding these links may guide pain intervention and prevention efforts. In general, insecure attachment has been portrayed as a risk factor, and secure attachment as a protective factor, for people with chronic pain conditions. In an effort to better understand the relationships among attachment and pain variables, these links have been investigated in pain-free samples using induced-pain techniques. The present paper reviews the available research linking adult attachment and laboratory-induced pain. While the diverse nature of the studies precludes definitive conclusions, together these papers offer support for associations between insecure attachment and a more negative pain experience. The evidence presented in this review highlights areas for further empirical attention, as well as providing some guidance for clinicians who may wish to employ preventive approaches and other interventions informed by attachment theory.

  5. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    Science.gov (United States)

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  6. Job demands-resources theory: Taking stock and looking forward.

    Science.gov (United States)

    Bakker, Arnold B; Demerouti, Evangelia

    2017-07-01

    The job demands-resources (JD-R) model was introduced in the international literature 15 years ago (Demerouti, Bakker, Nachreiner, & Schaufeli, 2001). The model has been applied in thousands of organizations and has inspired hundreds of empirical articles, including 1 of the most downloaded articles of the Journal of Occupational Health Psychology (Bakker, Demerouti, & Euwema, 2005). This article provides evidence for the buffering role of various job resources on the impact of various job demands on burnout. In the present article, we look back on the first 10 years of the JD-R model (2001-2010), and discuss how the model matured into JD-R theory (2011-2016). Moreover, we look at the future of the theory and outline which new issues in JD-R theory are worthwhile of investigation. We also discuss practical applications. It is our hope that JD-R theory will continue to inspire researchers and practitioners who want to promote employee well-being and effective organizational functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Modeling and Theories of Pathophysiology and Physiology of the Basal Ganglia–Thalamic–Cortical System: Critical Analysis

    Science.gov (United States)

    Montgomery Jr., Erwin B.

    2016-01-01

    Theories impact the movement disorders clinic, not only affecting the development of new therapies but also determining how current therapies are used. Models are theories that are procedural rather than declarative. Theories and models are important because, as argued by Kant, one cannot know the thing-in-itself (das Ding an sich) and only a model is knowable. Further, biological variability forces a higher level of abstraction relevant for all variants. It is that abstraction that is the raison d’être of theories and models. Theories “connect the dots” to move from correlation to causation. The necessity of theory makes theories either helpful or counterproductive. Theories and models of the pathophysiology and physiology of the basal ganglia–thalamic–cortical system do not spontaneously arise but have a history and consequently are legacies. Over the last 40 years, numerous theories and models of the basal ganglia have been proposed only to be forgotten or dismissed, rarely critiqued. It is not harsh to say that current popular theories positing increased neuronal activities in the Globus Pallidus Interna (GPi), excessive beta oscillations and increased synchronization not only fail to provide an adequate explication but are inconsistent with many observations. It is likely that their shared intellectual and epistemic inheritance plays a part in their shared failures. These issues are critically examined. How one is to derive theories and models, and how one can hope these will be better, is explored as well. PMID:27708569

  8. Using psychological theory and qualitative methods to develop a new evidence-based website about acupuncture for back pain.

    Science.gov (United States)

    Bishop, Felicity L; Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; Ryves, Rachel; White, Peter; Yardley, Lucy

    2016-08-01

    Potential acupuncture patients seek out information about acupuncture from various sources including websites, many of which are unreliable. We aimed to create an informative, scientifically accurate and engaging website to educate patients about acupuncture for back pain and modify their beliefs in a way that might enhance its clinical effects. We used psychological theory and techniques to design an evidence-based website, incorporating multimedia elements. We conducted qualitative "think aloud" audio-recorded interviews to elicit user views of the website. A convenience sample of ten participants (4 male; aged 21-64 years from the local community) looked at the website in the presence of a researcher and spoke their thoughts out loud. Comments were categorised by topic. The website comprises 11 main pages and addresses key topics of interest to potential acupuncture patients, including beneficial and adverse effects, mechanisms of action, safety, practicalities, and patients' experiences of acupuncture. It provides information through text, evidence summaries and audio-clips of four patients' stories and two acupuncturists' descriptions of their practice, and three short films. Evidence from the think aloud study was used to identify opportunities to make the website more informative, engaging, and user-friendly. Using a combination of psychological theory and qualitative interviews enabled us to produce a user-friendly, evidence-based website that is likely to change patients' beliefs about acupuncture for back pain. Before using the website in clinical settings it is necessary to test its effects on key outcomes including patients' beliefs and capacity for making informed choices about acupuncture.

  9. [Neuropsychological models of autism spectrum disorders - behavioral evidence and functional imaging].

    Science.gov (United States)

    Dziobek, Isabel; Bölte, Sven

    2011-03-01

    To review neuropsychological models of theory of mind (ToM), executive functions (EF), and central coherence (CC) as a framework for cognitive abnormalities in autism spectrum disorders (ASD). Behavioral and functional imaging studies are described that assess social-cognitive, emotional, and executive functions as well as locally oriented perception in ASD. Impairments in ToM and EF as well as alterations in CC are frequently replicated phenomena in ASD. Especially problems concerning social perception and ToM have high explanatory value for the clinical symptomatology. Brain activation patterns differ between individuals with and without ASD for ToM, EF, and CC functions. An approach focussing on reduced cortical connectivity seems to be increasingly favored over explanations focussing on single affected brain sites. A better understanding of the complexities of ASD in future research demands the integration of clinical, neuropsychological, functional imaging, and molecular genetics evidence. Weaknesses in ToM and EF as well as strengths in detail-focussed perception should be used for individual intervention planning.

  10. Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons.

    Science.gov (United States)

    Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang

    2017-07-06

    Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
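
    For outside orientation only, the widely quoted Highland-Lynch-Dahl parameterization of the Gaussian scattering width (this formula is not taken from the paper, which relies on the more accurate Molière/Fano/Hanson and Preston-Koehler treatments):

        \theta_0 = \frac{13.6\ \mathrm{MeV}}{\beta c p}\, z \sqrt{\frac{x}{X_0}}
                   \left[ 1 + 0.038 \ln\frac{x}{X_0} \right],

    where p is the momentum, βc the velocity, z the charge number of the incident particle, and x/X_0 the target thickness in radiation lengths.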

  11. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  12. Two problems from the theory of semiotic control models. I. Representations of semiotic models

    Energy Technology Data Exchange (ETDEWEB)

    Osipov, G S

    1981-11-01

    Two problems from the theory of semiotic control models are being stated, in particular the representation of models and the semantic analysis of them. Algebraic representation of semiotic models, covering of representations, their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.

  13. Hydrogen Balmer alpha intensity distributions and line profiles from multiple scattering theory using realistic geocoronal models

    Science.gov (United States)

    Anderson, D. E., Jr.; Meier, R. R.; Hodges, R. R., Jr.; Tinsley, B. A.

    1987-01-01

    The H Balmer alpha nightglow is investigated by using Monte Carlo models of asymmetric geocoronal atomic hydrogen distributions as input to a radiative transfer model of solar Lyman-beta radiation in the thermosphere and atmosphere. It is shown that it is essential to include multiple scattering of Lyman-beta radiation in the interpretation of Balmer alpha airglow data. Observations of diurnal variation in the Balmer alpha airglow showing slightly greater intensities in the morning relative to evening are consistent with theory. No evidence is found for anything other than a single sinusoidal diurnal variation of exobase density. Dramatic changes in effective temperature derived from the observed Balmer alpha line profiles are expected on the basis of changing illumination conditions in the thermosphere and exosphere as different regions of the sky are scanned.

  14. Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks

    International Nuclear Information System (INIS)

    Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.

    2005-01-01

    Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)

  15. Functional techniques in quantum field theory and two-dimensional models

    International Nuclear Information System (INIS)

    Souza, C. Farina de.

    1985-03-01

    Functional methods applied to Quantum Field Theory are studied. It is shown how to construct the Generating Functional using three of the most important methods in the literature, due to Feynman, Symanzik and Schwinger. The Axial Anomaly is discussed in the usual way, and a non-perturbative method due to Fujikawa to obtain this anomaly in the path integral formalism is presented. The ''Roskies-Shaposnik-Fujikawa method'', which makes use of Fujikawa's original idea to solve two-dimensional models, is introduced in the Schwinger model, which, in turn, is applied to obtain the exact solution of the axial model. It is discussed briefly how different regularization procedures can affect the theory in question. (author)

  16. Mean field theory of nuclei and shell model. Present status and future outlook

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    2003-01-01

    Many of the recent topics in nuclear structure concern the problems of unstable nuclei. It has been revealed experimentally that nuclear halos and neutron skins, as well as cluster structures or molecule-like structures, can be present in unstable nuclei, and that magic numbers well established in stable nuclei occasionally disappear while new ones appear. The shell model based on the mean field approximation has been successfully applied to stable nuclei to explain nuclear structure quantitatively as a finite many-body system, and it is considered the standard model at present. Whether unstable nuclei can be understood on the same model basis is a matter related to the fundamental principles of nuclear structure theories. In this lecture, the fundamental concepts and framework of the theory of nuclear structure based on the mean field theory and the shell model are presented to make the problems clear and to suggest directions for future research. First, fundamental properties of nuclei are described under the subtitles: saturation and magic numbers, nuclear force and effective interactions, nuclear matter, and LS splitting. Then the mean field theory is presented under the subtitles: the potential model, the mean field theory, Hartree-Fock approximation for nuclear matter, density dependent force, semiclassical mean field theory, mean field theory and symmetry, Skyrme interaction and density functional, density matrix expansion, finite range interactions, effective masses, and motion of center of mass. The subsequent section is devoted to the shell model with the subtitles: beyond the mean field approximation, core polarization, effective interaction of the shell model, one-particle wave function, nuclear deformation and the shell model, and the cross-shell shell model. Finally, the structure of unstable nuclei is discussed with the subtitles: general remarks on the study of unstable nuclear structure, asymptotic behavior of wave

  17. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords: Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  18. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    The aim of this work is to advance the knowledge and understanding of turbulence theory. Specific problems to be addressed include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion, which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulations.

  19. Physics of human cooperation: experimental evidence and theoretical models

    Science.gov (United States)

    Sánchez, Angel

    2018-02-01

    In recent years, many physicists have used evolutionary game theory combined with a complex systems perspective in an attempt to understand social phenomena and challenges. Prominent among such phenomena is the issue of the emergence and sustainability of cooperation in a networked world of selfish or self-focused individuals. The vast majority of research done by physicists on these questions is theoretical, and is almost always posed in terms of agent-based models. Unfortunately, more often than not such models ignore a number of facts that are well established experimentally, and are thus rendered irrelevant to actual social applications. I here summarize some of the facts that any realistic model should incorporate and take into account, discuss important aspects underlying the relation between theory and experiments, and discuss future directions for research based on the available experimental knowledge.
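    To make the kind of agent-based model mentioned here concrete, the following is a minimal, illustrative sketch (not the author's code) of a prisoner's dilemma played on a ring network with an imitate-the-best update rule; all payoff values and parameters are arbitrary illustrations.

```python
import random

# Illustrative prisoner's dilemma payoffs (arbitrary values, not from the paper).
R, S, T, P = 3.0, 0.0, 5.0, 1.0  # reward, sucker, temptation, punishment

def payoff(a, b):
    """Payoff to a player using action a ('C' or 'D') against a neighbour using b."""
    if a == 'C':
        return R if b == 'C' else S
    return T if b == 'C' else P

def step(strategies, k=1):
    """One synchronous round on a ring: play both neighbours, then imitate the best performer."""
    n = len(strategies)
    scores = [sum(payoff(strategies[i], strategies[(i + d) % n]) for d in (-k, k))
              for i in range(n)]
    new = list(strategies)
    for i in range(n):
        # Compare own score with the two neighbours and copy the strategy of the best.
        best = max([i, (i - k) % n, (i + k) % n], key=lambda j: scores[j])
        new[i] = strategies[best]
    return new

if __name__ == "__main__":
    random.seed(0)
    agents = [random.choice('CD') for _ in range(200)]
    for _ in range(50):
        agents = step(agents)
    print("final fraction of cooperators:", agents.count('C') / len(agents))
```

    The experimental facts the author refers to (e.g. heterogeneous, conditional behaviour of real subjects) are precisely what such a stylized update rule leaves out.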

  20. Growing up and Role Modeling: A Theory in Iranian Nursing Students' Education

    OpenAIRE

    Nouri, Jamileh Mokhtari; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid

    2014-01-01

    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help to make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed by Glaserian Grounded Theory methodology through semi-structured interviews with 7 faculty members, 2 nursing students; the three focus group discussions ...

  1. Consistent constraints on the Standard Model Effective Field Theory

    International Nuclear Information System (INIS)

    Berthier, Laure; Trott, Michael

    2016-01-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low-energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level or beyond, unless the cut-off scale is assumed to be large, Λ ≳ 3 TeV. We incorporate theoretical errors more consistently in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S, T analysis is modified by the theory errors we include, as an illustrative example.
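    The "leading order in the power counting" referred to above is the dimension-six expansion of the SMEFT; the schematic form below is the standard textbook expansion and is included only for orientation, not as the authors' specific operator basis.

```latex
% Dimension-six SMEFT expansion (schematic):
\mathcal{L}_{\rm SMEFT} = \mathcal{L}_{\rm SM}
  + \sum_i \frac{C_i}{\Lambda^{2}}\, \mathcal{O}_i^{(6)}
  + \mathcal{O}\!\left(\Lambda^{-4}\right).
```

    Retaining an explicit theory error for the neglected O(Λ⁻⁴) terms, rather than assuming Λ is large, is what relaxes some of the leading-order bounds.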

  2. Learned Helplessness: Theory and Evidence

    Science.gov (United States)

    Maier, Steven F.; Seligman, Martin E. P.

    1976-01-01

    The authors believe that three phenomena are all instances of "learned helplessness": instances in which an organism has learned that outcomes are uncontrollable by his responses and is seriously debilitated by this knowledge. This article explores the evidence for the phenomena of learned helplessness, and discusses a variety of theoretical…

  3. 2 + 1 quantum gravity as a toy model for the 3 + 1 theory

    International Nuclear Information System (INIS)

    Ashtekar, A.; Husain, V.; Smolin, L.; Samuel, J.; Utah Univ., Salt Lake City, UT

    1989-01-01

    2 + 1 Einstein gravity is used as a toy model for testing a program for non-perturbative canonical quantisation of the 3 + 1 theory. The program can be successfully implemented in the model and leads to a surprisingly rich quantum theory. (author)

  4. Noncommutative Geometry in M-Theory and Conformal Field Theory

    International Nuclear Information System (INIS)

    Morariu, Bogdan

    1999-01-01

    In the first part of the thesis I will investigate, in the Matrix theory framework, the subgroup of dualities of the Discrete Light Cone Quantization of M-theory compactified on tori, which corresponds to T-duality in the auxiliary Type II string theory. After a review of matrix theory compactification leading to noncommutative supersymmetric Yang-Mills gauge theory, I will present solutions for the fundamental and adjoint sections on a two-dimensional twisted quantum torus and generalize to three-dimensional twisted quantum tori. After showing how M-theory T-duality is realized in supersymmetric Yang-Mills gauge theories on dual noncommutative tori, I will relate this to the mathematical concept of Morita equivalence of C*-algebras. As a further generalization, I consider arbitrary Ramond-Ramond backgrounds. I will also discuss the spectrum of the toroidally compactified Matrix theory corresponding to quantized electric fluxes on two and three tori. In the second part of the thesis I will present an application to conformal field theory involving quantum groups, another important example of a noncommutative space. First, I will give an introduction to Poisson-Lie groups and arrive at quantum groups using the Feynman path integral. I will quantize the symplectic leaves of the Poisson-Lie group SU(2)*. In this way we obtain the unitary representations of U_q(SU(2)). I discuss the X-structure of SU(2)* and give a detailed description of its leaves using various parametrizations. Then, I will introduce a new reality structure on the Heisenberg double of Fun_q(SL(N,C)) for q a phase, which can be interpreted as the quantum phase space of a particle on the q-deformed mass-hyperboloid. I also present evidence that the above real form describes zero modes of certain non-compact WZNW-models.

  5. Noncommutative Geometry in M-Theory and Conformal Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Morariu, Bogdan [Univ. of California, Berkeley, CA (United States)

    1999-05-01

    In the first part of the thesis I will investigate, in the Matrix theory framework, the subgroup of dualities of the Discrete Light Cone Quantization of M-theory compactified on tori, which corresponds to T-duality in the auxiliary Type II string theory. After a review of matrix theory compactification leading to noncommutative supersymmetric Yang-Mills gauge theory, I will present solutions for the fundamental and adjoint sections on a two-dimensional twisted quantum torus and generalize to three-dimensional twisted quantum tori. After showing how M-theory T-duality is realized in supersymmetric Yang-Mills gauge theories on dual noncommutative tori, I will relate this to the mathematical concept of Morita equivalence of C*-algebras. As a further generalization, I consider arbitrary Ramond-Ramond backgrounds. I will also discuss the spectrum of the toroidally compactified Matrix theory corresponding to quantized electric fluxes on two and three tori. In the second part of the thesis I will present an application to conformal field theory involving quantum groups, another important example of a noncommutative space. First, I will give an introduction to Poisson-Lie groups and arrive at quantum groups using the Feynman path integral. I will quantize the symplectic leaves of the Poisson-Lie group SU(2)*. In this way we obtain the unitary representations of U_q(SU(2)). I discuss the X-structure of SU(2)* and give a detailed description of its leaves using various parametrizations. Then, I will introduce a new reality structure on the Heisenberg double of Fun_q(SL(N,C)) for q a phase, which can be interpreted as the quantum phase space of a particle on the q-deformed mass-hyperboloid. I also present evidence that the above real form describes zero modes of certain non-compact WZNW-models.

  6. Deformed type 0A matrix model and super-Liouville theory for fermionic black holes

    International Nuclear Information System (INIS)

    Ahn, Changrim; Kim, Chanju; Park, Jaemo; Suyama, Takao; Yamamoto, Masayoshi

    2006-01-01

    We consider a ĉ = 1 model in the fermionic black hole background. For this purpose we consider a model which contains both the N = 1 and the N = 2 super-Liouville interactions. We propose that this model is dual to a recently proposed type 0A matrix quantum mechanics model with vortex deformations. We support our conjecture by showing that non-perturbative corrections to the free energy computed by both the matrix model and the super-Liouville theories agree exactly when the N = 2 interaction is treated as a small perturbation. We also show that a two-point function on the sphere calculated from the deformed type 0A matrix model is consistent with that of the N = 2 super-Liouville theory when the N = 1 interaction becomes small. This duality between the matrix model and super-Liouville theories leads to a conjecture for arbitrary n-point correlation functions of the N = 1 super-Liouville theory on the sphere.

  7. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
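    As an illustration of the kind of sample-generation algorithm the report discusses (a generic sketch, not the report's own code), a zero-mean stationary Gaussian process with a prescribed covariance can be sampled on a grid by factorizing its covariance matrix; the exponential covariance and parameter values below are assumptions chosen purely for illustration.

```python
import numpy as np

def sample_gaussian_process(t, sigma=1.0, ell=1.0, n_samples=3, seed=0):
    """Draw independent samples of a zero-mean stationary Gaussian process with
    exponential covariance C(tau) = sigma^2 * exp(-|tau|/ell) on the grid t,
    using a Cholesky factorization of the covariance matrix."""
    rng = np.random.default_rng(seed)
    tau = np.abs(t[:, None] - t[None, :])
    cov = sigma**2 * np.exp(-tau / ell)
    # A small jitter on the diagonal keeps the factorization numerically stable.
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(t)))
    return L @ rng.standard_normal((len(t), n_samples))

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 200)
    paths = sample_gaussian_process(t)
    print(paths.shape)  # (200, 3): three independent sample paths on the grid
```

    Samples generated this way can then serve as random inputs or boundary conditions to a deterministic simulation code, which is the Monte Carlo workflow the report describes.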

  8. Prospect Theory in the Heterogeneous Agent Model

    Czech Academy of Sciences Publication Activity Database

    Polach, J.; Kukačka, Jiří

    (2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords : Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economic s OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf

  9. Model building with a dynamical volume element in gravity, particle theory and theories of extended object

    International Nuclear Information System (INIS)

    Guendelman, E.

    2004-01-01

    Full Text: The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: 1. 4-D theories describing gravity and matter fields, 2. parametrization invariant theories of extended objects and 3. higher dimensional theories including gravity and matter fields. In case 1, a large number of new effects appear: (i) spontaneous breaking of scale invariance associated with the integration of degrees of freedom related to the measure, (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter, (iii) the cosmic coincidence between dark energy and dark matter is natural, (iv) quintessence scenarios with automatic decoupling of the quintessence scalar from ordinary matter, but not from dark matter, are obtained. In case 2, for theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension, (ii) string models of non-abelian confinement, (iii) the possibility of new Weyl-invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. In case 3, in brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to construct naturally models where only the extra dimensions get curved and the 4-D observable space-time remains flat

  10. Applying theories to better understand socio-political challenges in implementing evidence-based work disability prevention strategies.

    Science.gov (United States)

    Ståhl, Christian; Costa-Black, Katia; Loisel, Patrick

    2018-04-01

    This article explores and applies theories for analyzing socio-political aspects of the implementation of work disability prevention (WDP) strategies. For the analysis, theories from political science are explained and discussed in relation to case examples from three jurisdictions (Sweden, Brazil and Québec). Implementation of WDP strategies may be studied through a conceptual framework that targets: (1) the institutional system in which policy-makers and other stakeholders reside; (2) the ambiguity and conflicts regarding what to do and how to do it; (3) the bounded rationality, path dependency and social systems of different stakeholders; and (4) coalitions formed by different stakeholders and power relations between them. In the case examples, the design of social insurance systems, the access to and infrastructure of healthcare systems, labor market policies, employers' level of responsibility, the regulatory environment, and the general knowledge of WDP issues among stakeholders played different roles in the implementation of policies based on scientific evidence. Future research may involve participatory approaches focusing on building coalitions and communities of practice with policy-makers and stakeholders, in order to build trust, facilitate cooperation, and better promote evidence utilization. Implications for Rehabilitation: Implementation of work disability prevention policies is subject to contextual influences from the socio-political setting and from relationships between stakeholders. Stakeholders involved in implementing strategies are bound to act based on their interests and previous courses of action. To promote research uptake at the policy level, stakeholders and researchers need to engage in collaboration and translational activities. Political stakeholders at the government and community levels need to be more directly involved as partners in the production and utilization of evidence.

  11. Anomaly-free gauges in superstring theory and double supersymmetric sigma-model

    International Nuclear Information System (INIS)

    Demichev, A.P.; Iofa, M.Z.

    1991-01-01

    Superharmonic gauge, which is a nontrivial analog of the harmonic gauge in bosonic string theory, is constructed for fermionic superstrings. In contrast to the conformal gauge, the harmonic gauge in bosonic string theory and the superharmonic gauge in superstring theory are shown to be free from the previously discovered BRST anomaly (in the critical dimension) in higher orders of string perturbation theory and thus provide the setup for consistent quantization of (super)string theory. The superharmonic gauge appears to be closely connected with the supersymmetric σ-model whose target space is also a supermanifold. 28 refs

  12. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Full Text Available Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I will point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, the need for a model which could better deal with the different aspects of theory replacement. I will show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I will also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  13. Exact string theory model of closed timelike curves and cosmological singularities

    International Nuclear Information System (INIS)

    Johnson, Clifford V.; Svendsen, Harald G.

    2004-01-01

    We study an exact model of string theory propagating in a space-time containing regions with closed timelike curves (CTCs) separated from a finite cosmological region bounded by a big bang and a big crunch. The model is a nontrivial embedding of the Taub-NUT geometry into heterotic string theory with a full conformal field theory (CFT) definition, discovered over a decade ago as a heterotic coset model. Having a CFT definition makes this an excellent laboratory for the study of the stringy fate of CTCs, the Taub cosmology, and the Milne/Misner-type chronology horizon which separates them. In an effort to uncover the role of stringy corrections to such geometries, we calculate the complete set of α′ corrections to the geometry. We observe that the key features of Taub-NUT persist in the exact theory, together with the emergence of a region of space with Euclidean signature bounded by timelike curvature singularities. Although such remarks are premature, their persistence in the exact geometry suggests that string theory is able to make physical sense of the Milne/Misner singularities and the CTCs, despite their pathological character in general relativity. This may also support the possibility that CTCs may be viable in some physical situations, and may be a natural ingredient in pre-big bang cosmological scenarios

  14. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and in computer and communication systems. • A chapter on ...
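    To give a flavour of the basic models such a text covers, the steady-state formulas for the single-server M/M/1 queue (standard results, not specific to this book) can be evaluated directly:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue (requires utilization < 1)."""
    rho = arrival_rate / service_rate          # server utilization
    if rho >= 1.0:
        raise ValueError("Queue is unstable: arrival rate must be below service rate.")
    L = rho / (1.0 - rho)                      # mean number of customers in the system
    W = 1.0 / (service_rate - arrival_rate)    # mean time spent in the system
    Lq = rho**2 / (1.0 - rho)                  # mean number waiting in the queue
    Wq = rho / (service_rate - arrival_rate)   # mean waiting time in the queue
    return {"utilization": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Example with assumed rates: 8 arrivals/hour served at 10 customers/hour.
print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
```

    Note that L = λW here is just Little's law, one of the foundational identities treated rigorously in introductions of this kind.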

  15. Models accounting for intention-behavior discordance in the physical activity domain: a user's guide, content overview, and review of current evidence.

    Science.gov (United States)

    Rhodes, Ryan E; Yao, Christopher A

    2015-02-07

    There is growing concern among researchers about the limited effectiveness, and yet continued stagnation, of theories applied to physical activity (PA). One of the most highlighted areas of concern is the established gap between intention and PA, yet models that assume intention is the proximal antecedent of PA remain in considerable use. The objective of this review was to: 1) provide a guide and thematic analysis of the available models that include constructs addressing intention-behavior discordance, and 2) highlight the evidence for these structures in the PA domain. A literature search was conducted among 13 major databases to locate relevant models and PA studies published before August 2014. Sixteen models were identified and nine overall themes for post-intentional constructs were created. Of the 16 models, eight were applied to 36 PA studies. Early evidence supported maintenance self-efficacy, behavioral regulation strategies, affective judgments, perceived control/opportunity, habit, and extraversion as reliable predictors of post-intention PA. Several intention-behavior discordance models exist within the literature, but are not used frequently. Further efforts are needed to test these models, preferably with experimental designs.

  16. Background field method in gauge theories and on linear sigma models

    International Nuclear Information System (INIS)

    van de Ven, A.E.M.

    1986-01-01

    This dissertation constitutes a study of the ultraviolet behavior of gauge theories and two-dimensional nonlinear sigma-models by means of the background field method. After a general introduction in chapter 1, chapter 2 presents algorithms which generate the divergent terms in the effective action at one-loop for arbitrary quantum field theories in flat spacetime of dimension d ≤ 11. It is demonstrated that global N = 1 supersymmetric Yang-Mills theory in six dimensions is one-loop UV-finite. Chapter 3 presents an algorithm which produces the divergent terms in the effective action at two-loops for renormalizable quantum field theories in a curved four-dimensional background spacetime. Chapter 4 presents a study of the two-loop UV-behavior of two-dimensional bosonic and supersymmetric non-linear sigma-models which include a Wess-Zumino-Witten term. It is found that, to this order, supersymmetric models on quasi-Ricci flat spaces are UV-finite and the β-functions for the bosonic model depend only on torsionful curvatures. Chapter 5 summarizes a superspace calculation of the four-loop β-function for two-dimensional N = 1 and N = 2 supersymmetric non-linear sigma-models. It is found that besides the one-loop contribution which vanishes on Ricci-flat spaces, the β-function receives four-loop contributions which do not vanish in the Ricci-flat case. Implications for superstrings are discussed. Chapters 6 and 7 treat the details of these calculations

  17. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...

  18. Perturbation theory around the Wess-Zumino-Witten model

    International Nuclear Information System (INIS)

    Hasseln, H. v.

    1991-05-01

    We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW-model is shown to have the same β-function (at least to order g²) as the fermionic theory with a four-fermion interaction. (orig.) [de]

  19. Is the Quantity Theory of Money Useful in Forecasting U.S. Inflation?

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani; Nyberg, Henri

    We propose a new simple model incorporating the implication of the quantity theory of money that money growth and inflation should move one for one in the long run, and, hence, inflation should be predictable by money growth. The model fits postwar U.S. data well, and beats common univariate benchmark models in forecasting inflation. Moreover, this evidence is quite robust, and predictability is found also in the Great Moderation period. The detected predictability of inflation by money growth lends support to the quantity theory.
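    The long-run one-for-one relation invoked here follows from the quantity equation; the derivation below is the textbook version, shown only to fix notation rather than the authors' specific model.

```latex
% Quantity equation and its growth-rate (log-difference) form:
M_t V_t = P_t Y_t
\;\;\Longrightarrow\;\;
\Delta \ln P_t = \Delta \ln M_t + \Delta \ln V_t - \Delta \ln Y_t .
```

    If velocity and output growth are stable in the long run, inflation moves one for one with money growth, which is the restriction the proposed forecasting model imposes.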

  20. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    Science.gov (United States)

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five Complexity Theory core concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and

  1. From the Neutral Theory to a Comprehensive and Multiscale Theory of Ecological Equivalence.

    Science.gov (United States)

    Munoz, François; Huneman, Philippe

    2016-09-01

    The neutral theory of biodiversity assumes that coexisting organisms are equally able to survive, reproduce, and disperse (ecological equivalence), but predicts that stochastic fluctuations of these abilities drive diversity dynamics. It predicts many biodiversity patterns remarkably well, although substantial evidence for the role of niche variation across organisms seems contradictory. Here, we discuss this apparent paradox by exploring the meaning and implications of ecological equivalence. We address the question of whether neutral theory provides an explanation for biodiversity patterns and acknowledges causal processes. We underline that ecological equivalence, although central to neutral theory, can emerge at local and regional scales from niche-based processes through equalizing and stabilizing mechanisms. Such emerging equivalence corresponds to a weak conception of neutral theory, as opposed to the assumption of strict equivalence at the individual level in the strong conception. We show that this duality is related to diverging views on hypothesis testing and modeling in ecology. In addition, the stochastic dynamics exposed in neutral theory are pervasive in ecological systems and, rather than a null hypothesis, ecological equivalence is best understood as a parsimonious baseline to address biodiversity dynamics at multiple scales.

  2. Embedding inflation into the Standard Model — More evidence for classical scale invariance

    International Nuclear Information System (INIS)

    Kannike, Kristjan; Racioppi, Antonio; Raidal, Martti

    2014-01-01

    If cosmological inflation is due to a slowly rolling single inflaton field taking trans-Planckian values, as suggested by the BICEP2 measurement of primordial tensor modes in the CMB, embedding inflation into the Standard Model challenges the standard paradigm of effective field theories. Together with an apparent absence of Planck-scale contributions to the Higgs mass and to the cosmological constant, BICEP2 provides further experimental evidence for the absence of large M_P-induced operators. We show that classical scale invariance — the paradigm that all fundamental scales in Nature are induced by quantum effects — solves the problem and allows for a remarkably simple scale-free Standard Model extension with an inflaton, without extending the gauge group. Due to trans-Planckian inflaton values and vevs, a dynamically induced Coleman-Weinberg-type inflaton potential of the model can predict the tensor-to-scalar ratio r in a large range, converging around the prediction of chaotic m²ϕ² inflation for a large trans-Planckian value of the inflaton vev. Precise determination of r in future experiments will single out a unique scale-free inflation potential, allowing a test of the proposed field-theoretic framework.

  3. Theories and control models and motor learning: clinical applications in neuro-rehabilitation.

    Science.gov (United States)

    Cano-de-la-Cuerda, R; Molero-Sánchez, A; Carratalá-Tejada, M; Alguacil-Diego, I M; Molina-Rueda, F; Miangolarra-Page, J C; Torricelli, D

    2015-01-01

    In recent decades there has been special interest in theories that could explain the regulation of motor control and their applications. These theories are often based on models of brain function, philosophically reflecting different criteria on how movement is controlled by the brain, each emphasising different neural components of movement. The concept of motor learning, regarded as the set of internal processes associated with practice and experience that produce relatively permanent changes in the ability to perform motor activities through a specific skill, is also relevant in the context of neuroscience. Thus, both motor control and motor learning are seen as key fields of study for health professionals in the field of neuro-rehabilitation. The major theories of motor control are described, which include motor programming theory, systems theory, the theory of dynamic action, and the theory of parallel distributed processing, as well as the factors that influence motor learning and its applications in neuro-rehabilitation. At present there is no consensus on which theory or model best explains the regulation of motor control. Theories of motor learning should be the basis for motor rehabilitation. New research should apply the knowledge generated in the fields of motor control and motor learning to neuro-rehabilitation. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier España. All rights reserved.

  4. Infrared fixed point of SU(2) gauge theory with six flavors

    Science.gov (United States)

    Leino, Viljami; Rummukainen, Kari; Suorsa, Joni; Tuominen, Kimmo; Tähtinen, Sara

    2018-06-01

    We compute the running of the coupling in SU(2) gauge theory with six fermions in the fundamental representation of the gauge group. We find strong evidence that this theory has an infrared stable fixed point at strong coupling and measure also the anomalous dimension of the fermion mass operator at the fixed point. This theory therefore likely lies close to the boundary of the conformal window and will display novel infrared dynamics if coupled with the electroweak sector of the Standard Model.

  5. Twenty years of load theory-Where are we now, and where should we go next?

    Science.gov (United States)

    Murphy, Gillian; Groeger, John A; Greene, Ciara M

    2016-10-01

    Selective attention allows us to ignore what is task-irrelevant and focus on what is task-relevant. The cognitive and neural mechanisms that underlie this process are key topics of investigation in cognitive psychology. One of the more prominent theories of attention is perceptual load theory, which suggests that the efficiency of selective attention is dependent on both perceptual and cognitive load. It is now more than 20 years since the proposal of load theory, and it is a good time to evaluate the evidence in support of this influential model. The present article supplements and extends previous reviews (Lavie, Trends in Cognitive Sciences, 9, 75-82. doi: 10.1016/j.tics.2004.12.004 , 2005, Current Directions in Psychological Science, 19, 143-148. doi: 10.1177/0963721410370295 , 2010) by examining more recent research in what appears to be a rapidly expanding area. The article comprises five parts, examining (1) evidence for the effects of perceptual load on attention, (2) cognitive load, (3) individual differences under load, (4) alternative theories and criticisms, and (5) the future of load theory. We argue that the key next step for load theory will be the application of the model to real-world tasks. The potential benefits of applied attention research are numerous, and there is tentative evidence that applied research would provide strong support for the theory itself, as well as real-world benefits related to activities in which attention is crucial, such as driving and education.

  6. I can do that: the impact of implicit theories on leadership role model effectiveness.

    Science.gov (United States)

    Hoyt, Crystal L; Burnette, Jeni L; Innella, Audrey N

    2012-02-01

    This research investigates the role of implicit theories in influencing the effectiveness of successful role models in the leadership domain. Across two studies, the authors test the prediction that incremental theorists ("leaders are made") compared to entity theorists ("leaders are born") will respond more positively to being presented with a role model before undertaking a leadership task. In Study 1, measuring people's naturally occurring implicit theories of leadership, the authors showed that after being primed with a role model, incremental theorists reported greater leadership confidence and less anxious-depressed affect than entity theorists following the leadership task. In Study 2, the authors demonstrated the causal role of implicit theories by manipulating participants' theory of leadership ability. They replicated the findings from Study 1 and demonstrated that identification with the role model mediated the relationship between implicit theories and both confidence and affect. In addition, incremental theorists outperformed entity theorists on the leadership task.

  7. Theory for the three-dimensional Mercedes-Benz model of water

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-11-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
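    A schematic form of the interaction described in this abstract is written out below for orientation; it combines an isotropic Lennard-Jones core with a Gaussian hydrogen-bond reward, with H(·) standing for a generic tetrahedral orientation factor. This is an assumed illustrative form, and the precise functional form used in the paper may differ.

```latex
% Schematic MB-type pair potential: Lennard-Jones core plus an
% orientation-dependent Gaussian hydrogen-bonding term
U(r_{ij},\Omega_i,\Omega_j) =
  4\epsilon_{LJ}\!\left[\Big(\tfrac{\sigma}{r_{ij}}\Big)^{12}-\Big(\tfrac{\sigma}{r_{ij}}\Big)^{6}\right]
  \;+\; \epsilon_{HB}\, G(r_{ij})\, H(\Omega_i,\Omega_j),
\qquad
G(r)=\exp\!\left[-\frac{(r-r_{HB})^{2}}{2\sigma_{HB}^{2}}\right].
```

    The directional Gaussian term is what gives the model its water-like open, tetrahedral local structure while keeping the interaction simple enough for integral equation and perturbation theories.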

  8. Theory for the three-dimensional Mercedes-Benz model of water.

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A

    2009-11-21

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.

  9. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
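    The driver-driver interaction described here can be illustrated with a toy two-driver "merge vs. yield" game; the payoffs below are arbitrary and purely illustrative, not taken from the paper, and the sketch only shows how pure-strategy equilibria of such a 2x2 game are found.

```python
# Toy symmetric 2x2 game for a lane-change conflict (illustrative payoffs only).
# Actions: 0 = yield, 1 = merge aggressively. Entries are (row payoff, column payoff).
PAYOFF = {
    (0, 0): (2, 2),   # both yield: safe but slow
    (0, 1): (1, 3),   # the other driver merges and gains time
    (1, 0): (3, 1),
    (1, 1): (0, 0),   # both push in: conflict, nobody gains
}

def pure_nash_equilibria(payoff):
    """Return the pure-strategy Nash equilibria of a 2x2 game."""
    equilibria = []
    for a in (0, 1):
        for b in (0, 1):
            row_ok = payoff[(a, b)][0] >= max(payoff[(x, b)][0] for x in (0, 1))
            col_ok = payoff[(a, b)][1] >= max(payoff[(a, y)][1] for y in (0, 1))
            if row_ok and col_ok:
                equilibria.append((a, b))
    return equilibria

print(pure_nash_equilibria(PAYOFF))  # [(0, 1), (1, 0)]: one driver merges, the other yields
```

    With these payoffs the game has the structure of hawk-dove (chicken): coordination on who yields is exactly the kind of driver-driver interaction a game-theoretic lane-changing rule has to resolve.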

  10. Modelling machine ensembles with discrete event dynamical system theory

    Science.gov (United States)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or by implementing a feedback DEDS controller (closed-loop control).
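    The ingredients listed for a local model (states, event alphabet, initial state, partial transition function, event durations) map naturally onto a small data structure; the sketch below is a generic illustration of such a timed discrete-event model, not code from the cited work, and the "arm" example at the end is hypothetical.

```python
class LocalModel:
    """A timed discrete-event model: states, an event alphabet, a partial
    transition function, an initial state, and a duration for each event."""

    def __init__(self, states, events, initial, transitions, durations):
        self.states = set(states)
        self.events = set(events)
        self.state = initial
        self.transitions = transitions   # dict: (state, event) -> next state (partial map)
        self.durations = durations       # dict: event -> time required for the event
        self.clock = 0.0

    def enabled(self):
        """Events whose transition is defined (enabled) in the current state."""
        return [e for e in self.events if (self.state, e) in self.transitions]

    def fire(self, event):
        """Apply an enabled event, advancing the state and the local clock."""
        if (self.state, event) not in self.transitions:
            raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
        self.state = self.transitions[(self.state, event)]
        self.clock += self.durations[event]
        return self.state

# Hypothetical submachine: a manipulator arm that grips and releases a beam.
arm = LocalModel(
    states={"idle", "holding"},
    events={"grip", "release"},
    initial="idle",
    transitions={("idle", "grip"): "holding", ("holding", "release"): "idle"},
    durations={"grip": 2.0, "release": 1.0},
)
arm.fire("grip")
print(arm.state, arm.clock)  # holding 2.0
```

    A global model would then compose several such local models, constraining which events may fire concurrently, which is where the scheduling or feedback supervisory control enters.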

  11. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  12. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  13. Halo modelling in chameleon theories

    International Nuclear Information System (INIS)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations

  14. Risk Route Choice Analysis and the Equilibrium Model under Anticipated Regret Theory

    Directory of Open Access Journals (Sweden)

    pengcheng yuan

    2014-02-01

    Full Text Available The assumption about travellers' route choice behaviour has a major influence on traffic flow equilibrium analysis. Previous studies of travellers' route choice were mainly based on expected utility maximization theory. However, with gradually increasing knowledge about the uncertainty of the transportation system, researchers have realized that expected utility maximization theory has significant limitations, because it requires travellers to be 'absolutely rational'; but in fact, travellers are not truly 'absolutely rational'. Anticipated regret theory proposes an alternative framework to the traditional treatment of risk-taking in route choice behaviour which might be more scientific and reasonable. We have applied anticipated regret theory to the analysis of the risky route choosing process and constructed an anticipated regret utility function. In a simple case with two parallel routes, the route choice results as influenced by the degree of risk aversion, the degree of regret and the degree of environmental risk are analyzed. Moreover, a user equilibrium model based on anticipated regret theory is established. The equivalence and uniqueness of the model are proved, and an efficient algorithm is proposed to solve it. Both the model and the algorithm are demonstrated on a real network. In an experiment, the model results are compared with real data. It was found that the model results can be similar to the real data if a proper regret degree parameter is selected. This illustrates that the model can better explain risky route choosing behaviour. Moreover, it was also found that travellers' regret degree increases when the environment becomes more and more risky.
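    One common way to formalize an anticipated regret utility of the kind constructed in the paper is shown below; this is a generic schematic form, and the exact functional form used by the authors may differ.

```latex
% Anticipated regret utility for choosing route i when the best foregone
% alternative is route j (delta >= 0 weights regret aversion):
U_i = v_i - \delta \,\max\!\big(0,\; v_j - v_i\big).
```

    Travellers then choose the route that maximizes this regret-adjusted utility, and a user equilibrium assigns flows so that no traveller can increase it by unilaterally switching routes.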

  15. Theory of positive disintegration as a model of adolescent development.

    Science.gov (United States)

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost half a century ago, still attracts the attention of psychologists and educators, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described as a process of transition from lower to higher levels of mental life, stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent the behaviour of the early, middle and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.

  16. A work-family conflict/subjective well-being process model: a test of competing theories of longitudinal effects.

    Science.gov (United States)

    Matthews, Russell A; Wayne, Julie Holliday; Ford, Michael T

    2014-11-01

    In the present study, we examine competing predictions of stress reaction models and adaptation theories regarding the longitudinal relationship between work-family conflict and subjective well-being. Based on data from 432 participants over 3 time points with 2 lags of varying lengths (i.e., 1 month, 6 months), our findings suggest that in the short term, consistent with prior theory and research, work-family conflict is associated with poorer subjective well-being. Counter to traditional work-family predictions but consistent with adaptation theories, after accounting for concurrent levels of work-family conflict as well as past levels of subjective well-being, past exposure to work-family conflict was associated with higher levels of subjective well-being over time. Moreover, evidence was found for reverse causation in that greater subjective well-being at 1 point in time was associated with reduced work-family conflict at a subsequent point in time. Finally, the pattern of results did not vary as a function of using different temporal lags. We discuss the theoretical, research, and practical implications of our findings. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  18. Prospect theory for online financial trading.

    Science.gov (United States)

    Liu, Yang-Yu; Nacher, Jose C; Ochiai, Tomoshiro; Martino, Mauro; Altshuler, Yaniv

    2014-01-01

    Prospect theory is widely viewed as the best available descriptive model of how people evaluate risk in experimental settings. According to prospect theory, people are typically risk-averse with respect to gains and risk-seeking with respect to losses, known as the "reflection effect". People are much more sensitive to losses than to gains of the same magnitude, a phenomenon called "loss aversion". Despite the fact that prospect theory has been well developed in behavioral economics at the theoretical level, there exist very few large-scale empirical studies and most of the previous studies have been undertaken with micro-panel data. Here we analyze over 28.5 million trades made by 81.3 thousand traders of an online financial trading community over 28 months, aiming to explore the large-scale empirical aspect of prospect theory. By analyzing and comparing the behavior of winning and losing trades and traders, we find clear evidence of the reflection effect and the loss aversion phenomenon, which are essential in prospect theory. This work hence demonstrates unprecedented large-scale empirical evidence for prospect theory, which has immediate implications for financial trading, e.g., developing new trading strategies by minimizing the impact of the reflection effect and the loss aversion phenomenon. Moreover, we introduce three novel behavioral metrics to differentiate winning and losing traders based on their historical trading behavior. This offers us potential opportunities to augment online social trading, where traders are allowed to watch and follow the trading activities of others, by predicting potential winners based on their historical trading behavior.
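    The reflection effect and loss aversion discussed here are usually encoded in the Kahneman-Tversky value function; the standard parametrization below is shown for reference and is not estimated in this particular study.

```latex
% Prospect-theory value function (Tversky & Kahneman 1992 parametrization):
v(x) =
\begin{cases}
 x^{\alpha}, & x \ge 0,\\[2pt]
 -\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\qquad \alpha \approx \beta \approx 0.88,\;\; \lambda \approx 2.25 .
```

    Concavity over gains and convexity over losses produce the reflection effect, while λ > 1 encodes loss aversion, the two regularities the trading data are tested against.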

  19. Prospect theory for online financial trading.

    Directory of Open Access Journals (Sweden)

    Yang-Yu Liu

    Full Text Available Prospect theory is widely viewed as the best available descriptive model of how people evaluate risk in experimental settings. According to prospect theory, people are typically risk-averse with respect to gains and risk-seeking with respect to losses, known as the "reflection effect". People are much more sensitive to losses than to gains of the same magnitude, a phenomenon called "loss aversion". Despite the fact that prospect theory has been well developed in behavioral economics at the theoretical level, there exist very few large-scale empirical studies and most of the previous studies have been undertaken with micro-panel data. Here we analyze over 28.5 million trades made by 81.3 thousand traders of an online financial trading community over 28 months, aiming to explore the large-scale empirical aspect of prospect theory. By analyzing and comparing the behavior of winning and losing trades and traders, we find clear evidence of the reflection effect and the loss aversion phenomenon, which are essential in prospect theory. This work hence demonstrates unprecedented large-scale empirical evidence for prospect theory, which has immediate implications for financial trading, e.g., developing new trading strategies by minimizing the impact of the reflection effect and the loss aversion phenomenon. Moreover, we introduce three novel behavioral metrics to differentiate winning and losing traders based on their historical trading behavior. This offers us potential opportunities to augment online social trading, where traders are allowed to watch and follow the trading activities of others, by predicting potential winners based on their historical trading behavior.

  20. Mixmaster cosmological model in theories of gravity with a quadratic Lagrangian

    International Nuclear Information System (INIS)

    Barrow, J.D.; Sirousse-Zia, H.

    1989-01-01

    We use the method of matched asymptotic expansions to examine the behavior of the vacuum Bianchi type-IX mixmaster universe in a gravity theory derived from a purely quadratic gravitational Lagrangian. The chaotic behavior characteristic of the general-relativistic mixmaster model disappears and the asymptotic behavior is of the monotonic, nonchaotic form found in the exactly soluble Bianchi type-I models of the quadratic theory. The asymptotic behavior far from the singularity is also found to be of monotonic nonchaotic type

  1. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which players have the possibility of cooperating partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is substantially enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
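    For readers unfamiliar with "one-point solution concepts", the Shapley value of a crisp transferable-utility game is the canonical example; the brute-force sketch below is purely illustrative (and only practical for small player sets), and the three-player game at the end is a hypothetical example.

```python
from itertools import permutations

def shapley_value(players, v):
    """Shapley value of a crisp TU-game.
    players: list of player labels; v: function mapping a frozenset coalition to its worth."""
    n = len(players)
    value = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            # Marginal contribution of p when joining its predecessors in this order.
            value[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: value[p] / len(orders) for p in players}

# Hypothetical 3-player game: any coalition of two or more players is worth 1.
def worth(coalition):
    return 1.0 if len(coalition) >= 2 else 0.0

print(shapley_value(["a", "b", "c"], worth))  # by symmetry, each player gets 1/3
```

    Fuzzy and multi-choice games generalize exactly this construction by letting the worth depend on each player's participation or activity level rather than on membership alone.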

  2. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    Science.gov (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…
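
    As a concrete example of a polytomous IRT model of the kind used by such CAT engines, here is a minimal Python sketch of Samejima's graded response model; the discrimination and threshold values are made up for illustration and are not from the article.

        import math

        def grm_category_probs(theta, a, thresholds):
            """Samejima's graded response model: P(X >= k | theta) is a 2PL curve
            at each ordered threshold b_k; category probabilities are the
            successive differences of these cumulative curves."""
            cum = [1.0] + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds] + [0.0]
            return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

        # Hypothetical 4-category item: discrimination a = 1.2, thresholds -1.0, 0.0, 1.5.
        print(grm_category_probs(theta=0.5, a=1.2, thresholds=[-1.0, 0.0, 1.5]))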

  3. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  4. Comparison of potential models through heavy quark effective theory

    International Nuclear Information System (INIS)

    Amundson, J.F.

    1995-01-01

    I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions

  5. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
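
    A rough sketch of the central ratio described above, assuming Hagen-Poiseuille pipe conductivity (proportional to the fourth power of pipe radius) summed over pipes and divided by leaf area; the pipe counts, radii, viscosity and leaf area below are placeholders, not parameters of the published model.

        import math

        def pipe_conductivity(radii_m, viscosity_pa_s=1.0e-3):
            """Hagen-Poiseuille conductivity summed over pipes (vessels or tracheids)."""
            return sum(math.pi * r ** 4 / (8.0 * viscosity_pa_s) for r in radii_m)

        def leaf_specific_conductivity(radii_m, leaf_area_m2):
            """Refined pipe-theory ratio: pipe conductivity per unit leaf area,
            in place of the classical sapwood-area to leaf-area ratio."""
            return pipe_conductivity(radii_m) / leaf_area_m2

        # Placeholder numbers: narrower late-season pipes lower the ratio for the same leaf area.
        earlywood = [20e-6] * 500   # 500 pipes of 20 micrometre radius
        latewood = [10e-6] * 500    # 500 pipes of 10 micrometre radius
        print(leaf_specific_conductivity(earlywood, leaf_area_m2=2.0))
        print(leaf_specific_conductivity(latewood, leaf_area_m2=2.0))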

  6. The spin-s quantum Heisenberg ferromagnetic models in the physical magnon theory

    International Nuclear Information System (INIS)

    Liu, B.-G.; Pu, F.-C.

    2001-01-01

    The spin-s quantum Heisenberg ferromagnetic model is investigated in the physical magnon theory. The effect of the extra unphysical magnon states on every site is completely removed in the magnon Hamiltonian and during the approximation procedure, so that the condition ⟨(a_i†)^n (a_i)^n⟩ = 0 (n ≥ 2s+1) is rigorously satisfied. The physical multi-magnon occupancy ⟨(a_i†)^n (a_i)^n⟩ (1 ≤ n ≤ 2s) is proportional to T^(3n/2) at low temperature and equals 1/(2s+1) at the Curie temperature. A magnetization that is not only unified but also well behaved from zero temperature to the Curie temperature is obtained within the framework of the magnon theory for the spin-s quantum Heisenberg ferromagnetic model. The ill-behaved magnetizations at high temperature in earlier magnon theories are completely corrected. The relation of magnon (spin-wave) theory to spin-operator decoupling theory is clearly understood.

  7. Quantum analysis of Jackiw and Teitelboim's model for (1+1)D gravity and topological gauge theory

    International Nuclear Information System (INIS)

    Terao, Haruhiko

    1993-01-01

    We study the BRST quantization of the (1+1)-dimensional gravity model proposed by Jackiw and Teitelboim and also the topological gauge model which is equivalent to the gravity model at least classically. The gravity model quantized in the light-cone gauge is found to be a free theory with a nilpotent BRST charge. We show also that there exist twisted N=2 superconformal algebras in the Jackiw-Teitelboim model as well as in the topological gauge model. We discuss the quantum equivalence between the gravity theory and the topological gauge theory. It is shown that these theories are indeed equivalent to each other in the light-cone gauge. (orig.)

  8. DSM-5 alternative personality disorder model traits as maladaptive extreme variants of the five-factor model: An item-response theory analysis.

    Science.gov (United States)

    Suzuki, Takakuni; Samuel, Douglas B; Pahlen, Shandell; Krueger, Robert F

    2015-05-01

    Over the past two decades, evidence has suggested that personality disorders (PDs) can be conceptualized as extreme, maladaptive variants of general personality dimensions, rather than discrete categorical entities. Recognizing this literature, the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) alternative PD model in Section III defines PDs partially through 25 maladaptive traits that fall within 5 domains. Empirical evidence based on the self-report measure of these traits, the Personality Inventory for DSM-5 (PID-5), suggests that these five higher-order domains share a structure and correlate in meaningful ways with the five-factor model (FFM) of general personality. In the current study, item response theory was used to compare the DSM-5 alternative PD model traits to those from a normative FFM inventory (the International Personality Item Pool-NEO [IPIP-NEO]) in terms of their measurement precision along the latent dimensions. Within a combined sample of 3,517 participants, results strongly supported the conclusion that the DSM-5 alternative PD model traits and IPIP-NEO traits are complementary measures of 4 of the 5 FFM domains (with perhaps the exception of openness to experience vs. psychoticism). Importantly, the two measures yield largely overlapping information curves on these four domains. Differences that did emerge suggested that the PID-5 scales generally have higher thresholds and provide more information at the upper levels, whereas the IPIP-NEO generally had an advantage at the lower levels. These results support the general conceptualization that 4 domains of the DSM-5 alternative PD model traits are maladaptive, extreme versions of the FFM. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
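
    A minimal sketch (an illustrative assumption, not taken from the paper) of how such information curves are computed, using the 2PL item information function I(theta) = a^2 * P * (1 - P) and summing over items; a scale whose items have higher thresholds concentrates its information at the upper end of the trait, consistent with the pattern reported above. The item parameters below are invented.

        import math

        def item_information(theta, a, b):
            """Fisher information of a 2PL item: a**2 * P * (1 - P)."""
            p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
            return a ** 2 * p * (1.0 - p)

        def scale_information(theta, items):
            return sum(item_information(theta, a, b) for a, b in items)

        # Invented (a, b) parameters: a scale with high thresholds is informative at
        # high theta (maladaptive range), one with low thresholds at low theta.
        high_threshold_scale = [(1.5, 1.0), (1.3, 1.5), (1.4, 2.0)]
        low_threshold_scale = [(1.5, -1.0), (1.3, -0.5), (1.4, 0.0)]
        for theta in (-2.0, 0.0, 2.0):
            print(theta,
                  round(scale_information(theta, high_threshold_scale), 2),
                  round(scale_information(theta, low_threshold_scale), 2))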

  9. Complexity in quantum field theory and physics beyond the standard model

    International Nuclear Information System (INIS)

    Goldfain, Ervin

    2006-01-01

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model

  10. Complexity in quantum field theory and physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Goldfain, Ervin [OptiSolve Consulting, 4422 Cleveland Road, Syracuse, NY 13215 (United States)

    2006-05-15

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model.

  11. Interacting bosons model and relation with BCS theory

    International Nuclear Information System (INIS)

    Diniz, R.

    1990-01-01

    The Nambu mechanism for BCS theory is extended with the inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of the model are discussed. (author)

  12. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
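
    For orientation, a minimal Monte Carlo sketch in Python of the classical (Cramér-Lundberg) risk process that the thesis extends, estimating the probability of ruin before a finite horizon; the Poisson arrivals, exponential claim sizes and all parameter values are illustrative assumptions, not the thesis's bounds.

        import random

        def ruin_probability(u, c, lam, mean_claim, horizon=100.0, n_paths=20000):
            """Monte Carlo estimate of P(ruin before `horizon`) for the classical
            risk process U(t) = u + c*t - S(t), with claims arriving as a
            Poisson(lam) process and exponentially distributed claim sizes."""
            ruins = 0
            for _ in range(n_paths):
                t, claims = 0.0, 0.0
                while True:
                    t += random.expovariate(lam)                     # next claim arrival
                    if t > horizon:
                        break
                    claims += random.expovariate(1.0 / mean_claim)   # claim size
                    if u + c * t - claims < 0.0:                     # surplus below zero
                        ruins += 1
                        break
            return ruins / n_paths

        # Illustrative parameters: premium rate 10% above the expected claim outflow.
        print(ruin_probability(u=10.0, c=1.1, lam=1.0, mean_claim=1.0))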

  13. Epistemologic inquiries in evidence-based medicine.

    Science.gov (United States)

    Djulbegovic, Benjamin; Guyatt, Gordon H; Ashcroft, Richard E

    2009-04-01

    Since the term "evidence-based medicine" (EBM) first appeared in the scientific literature in 1991, the concept has had considerable influence in many parts of the world. Most professional societies, the public, and funding agencies have accepted EBM with remarkable enthusiasm. The concept of evidence-based practice is now applied in management, education, criminology, and social work. Yet, EBM has attracted controversy: its critics allege that EBM uses a narrow concept of evidence and a naive conception of the relationships between evidence, theory, and practice. They also contend that EBM presents itself as a radical restructuring of medical knowledge that discredits more traditional ways of knowing in medicine, largely in the interests of people with a particular investment in the enterprise of large-scale clinical trials. Because EBM proposes a specific relationship between theory, evidence, and knowledge, its theoretical basis can be understood as an epistemological system. Undertaking epistemological inquiry is important because the adoption of a particular epistemological view defines how science is conducted. In this paper, we challenge this critical view of EBM by examining how EBM fits into broad epistemological debates within the philosophy of science. We consider how EBM relates to some classical debates regarding the nature of science and knowledge. We investigate EBM from the perspective of major epistemological theories (logical positivism/inductivism, deductivism/falsificationism/theory-ladenness of observations, explanationism/holism, instrumentalism, underdetermination of theory by evidence). We first explore the relationship between evidence and knowledge and discuss philosophical support for the main way that evidence is used in medicine: (1) in the philosophical tradition that "rational thinkers respect their evidence," we show that EBM refers to making medical decisions that are consistent with evidence, (2) as a reliable sign, symptom, or mark to

  14. A thermostatted kinetic theory model for event-driven pedestrian dynamics

    Science.gov (United States)

    Bianca, Carlo; Mogno, Caterina

    2018-06-01

    This paper is devoted to the modeling of pedestrian dynamics by means of the thermostatted kinetic theory. Specifically, the microscopic interactions among pedestrians and an external force field are modeled for simulating the evacuation of pedestrians from a metro station. The fundamentals of stochastic game theory and the thermostatted kinetic theory are coupled for the derivation of a specific mathematical model which depicts the time evolution of the distribution of pedestrians at the different exits of a metro station. Perturbation theory is employed in order to establish the stability analysis of the nonequilibrium stationary states in the case of a metro station consisting of two exits. A general sensitivity analysis on the initial conditions, the magnitude of the external force field and the number of exits is presented by means of numerical simulations, which, in particular, show how the asymptotic distribution and the convergence time are affected by the presence of an external force field. The results show how, in evacuation conditions, the interaction dynamics among pedestrians can be negligible with respect to the external force. The important role of the thermostat term in allowing the nonequilibrium stationary state to be reached is emphasized. Research perspectives are outlined at the end of the paper, in particular concerning the derivation of frameworks that take into account the definition of local external actions and the introduction of space and velocity dynamics.

  15. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
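
    For context, a minimal Python sketch of the textbook two-parameter Margules excess-Gibbs-energy model into which FST-derived parameters could be fitted; whether the paper's "modified Margules" form coincides with this version is an assumption here, and the parameter values below are placeholders.

        import math

        def margules_activity_coeffs(x1, a12, a21):
            """Two-parameter Margules model:
               G_E/(R*T)  = x1*x2*(A21*x1 + A12*x2)
               ln gamma_1 = x2**2 * (A12 + 2*(A21 - A12)*x1)
               ln gamma_2 = x1**2 * (A21 + 2*(A12 - A21)*x2)"""
            x2 = 1.0 - x1
            ln_g1 = x2 ** 2 * (a12 + 2.0 * (a21 - a12) * x1)
            ln_g2 = x1 ** 2 * (a21 + 2.0 * (a12 - a21) * x2)
            return math.exp(ln_g1), math.exp(ln_g2)

        # Placeholder parameters, e.g. as they might be fitted from MD/FST output.
        print(margules_activity_coeffs(x1=0.3, a12=0.5, a21=0.8))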

  16. Theory and experiments in model-based space system anomaly management

    Science.gov (United States)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first-principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite, which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.
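
    A generic Python sketch of the core model-based detection and diagnosis step, not the dissertation's actual algorithms: model-predicted telemetry is compared with observations, channels whose residuals exceed a threshold are flagged, and the flagged set is matched against candidate fault signatures. All channel names, thresholds and fault modes below are hypothetical.

        def detect_anomalies(predicted, observed, thresholds):
            """Flag each telemetry channel whose absolute residual exceeds its threshold."""
            return {ch: abs(observed[ch] - predicted[ch])
                    for ch in predicted
                    if abs(observed[ch] - predicted[ch]) > thresholds[ch]}

        def diagnose(anomalous_channels, fault_signatures):
            """Return fault modes whose expected signature is contained in the anomaly set."""
            flagged = set(anomalous_channels)
            return [fault for fault, signature in fault_signatures.items()
                    if signature <= flagged]

        predicted = {"bus_voltage": 8.2, "battery_temp": 21.0, "rx_power": -95.0}
        observed = {"bus_voltage": 7.1, "battery_temp": 21.3, "rx_power": -112.0}
        thresholds = {"bus_voltage": 0.5, "battery_temp": 2.0, "rx_power": 6.0}
        fault_signatures = {"solar_array_failure": {"bus_voltage"},
                            "antenna_mispointing": {"rx_power"}}

        anomalies = detect_anomalies(predicted, observed, thresholds)
        print(anomalies, diagnose(anomalies, fault_signatures))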

  17. A multi-species exchange model for fully fluctuating polymer field theory simulations.

    Science.gov (United States)

    Düchs, Dominik; Delaney, Kris T; Fredrickson, Glenn H

    2014-11-07

    Field-theoretic models have been used extensively to study the phase behavior of inhomogeneous polymer melts and solutions, both in self-consistent mean-field calculations and in numerical simulations of the full theory capturing composition fluctuations. The models commonly used can be grouped into two categories, namely, species models and exchange models. Species models involve integrations of functionals that explicitly depend on fields originating both from species density operators and their conjugate chemical potential fields. In contrast, exchange models retain only linear combinations of the chemical potential fields. In the two-component case, development of exchange models has been instrumental in enabling stable complex Langevin (CL) simulations of the full complex-valued theory. No comparable stable CL approach has yet been established for field theories of the species type. Here, we introduce an extension of the exchange model to an arbitrary number of components, namely, the multi-species exchange (MSE) model, which greatly expands the classes of soft material systems that can be accessed by the complex Langevin simulation technique. We demonstrate the stability and accuracy of the MSE-CL sampling approach using numerical simulations of triblock and tetrablock terpolymer melts, and tetrablock quaterpolymer melts. This method should enable studies of a wide range of fluctuation phenomena in multiblock/multi-species polymer blends and composites.
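
    As a toy illustration of the complex Langevin sampling technique referred to above, applied to a single Gaussian degree of freedom with a complex "action" rather than to the MSE polymer field theory itself; for S(x) = sigma*x^2/2 with Re(sigma) > 0 the exact average is <x^2> = 1/sigma, which provides a check. The step size and step counts are arbitrary choices.

        import random

        def complex_langevin_x2(sigma, dt=0.01, n_steps=500_000, burn_in=50_000):
            """Complex Langevin estimate of <x^2> for S(x) = 0.5*sigma*x**2 with
            complex sigma: x is complexified and driven by the drift -dS/dx = -sigma*x
            plus real Gaussian noise of variance 2*dt per step."""
            x = 0.0 + 0.0j
            acc, count = 0.0 + 0.0j, 0
            for step in range(n_steps):
                x += -sigma * x * dt + (2.0 * dt) ** 0.5 * random.gauss(0.0, 1.0)
                if step >= burn_in:
                    acc += x * x
                    count += 1
            return acc / count

        sigma = 1.0 + 0.5j
        print(complex_langevin_x2(sigma), "exact:", 1.0 / sigma)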

  18. Dual-process models of health-related behaviour and cognition: a review of theory.

    Science.gov (United States)

    Houlihan, S

    2018-03-01

    The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. Review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory), and in this way theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories. Namely, the reviewer considered and reconsidered individual theories and theoretical components in generating the narrative synthesis' main findings. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped on the basis of theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part by their treatment of the individual as an irrational actor, as a social actor, as an actor in a physical environment, or as a self-regulated actor. Synthesising the identified theories into a general dual-process model of health-related behaviour indicated that such behaviour is the result of both propositional and unconscious reasoning driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli) as well as

  19. Two-dimensional sigma models: modelling non-perturbative effects of gauge theories

    International Nuclear Information System (INIS)

    Novikov, V.A.; Shifman, M.A.; Vainshtein, A.I.; Zakharov, V.I.

    1984-01-01

    The review is devoted to a discussion of non-perturbative effects in gauge theories and two-dimensional sigma models. The main emphasis is put on the supersymmetric O(3) sigma model. The instanton-based method for calculating the exact Gell-Mann-Low function and the bifermionic condensate is considered in detail. All aspects of the method under simplifying conditions are discussed. The basic points are: the instanton measure from purely classical analysis; a non-renormalization theorem in self-dual external fields; existence of vacuum condensates and their compatibility with supersymmetry

  20. The biopsychosocial model and its potential for a new theory of homeopathy.

    Science.gov (United States)

    Schmidt, Josef M

    2012-04-01

    Since the nineteenth century the theory of conventional medicine has been developed in close alignment with the mechanistic paradigm of the natural sciences. Only in the twentieth century were occasional attempts made to (re)introduce the 'subject' into medical theory, for example by Thure von Uexküll (1908-2004), who elaborated the so-called biopsychosocial model of the human being, trying to understand the patient as a unit of organic, mental, and social dimensions of life. Although widely neglected by conventional medicine, it is one of the most coherent, significant, and up-to-date models of medicine at present. Torn between strict adherence to Hahnemann's original conceptualization and alienation caused by contemporary scientific criticism, homeopathy today still lacks a generally accepted, consistent, and definitive theory which would explain in scientific terms its strength, peculiarity, and principles without relapsing into biomedical reductionism. The biopsychosocial model of the human being implies great potential for a new theory of homeopathy, as may be demonstrated with some typical examples. Copyright © 2012. Published by Elsevier Ltd.