WorldWideScience

Sample records for modeling evidence theory

  1. Comparison of evidence theory and Bayesian theory for uncertainty modeling

    International Nuclear Information System (INIS)

    Soundappan, Prabhu; Nikolaidis, Efstratios; Haftka, Raphael T.; Grandhi, Ramana; Canfield, Robert

    2004-01-01

    This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision making under uncertainty when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced, and the ways these theories model uncertainties, propagate them through systems and assess the safety of those systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision making under uncertainty in terms of their effectiveness in making decisions.
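
    The belief and plausibility measures discussed above can be sketched in a few lines of Python; the frame of discernment and the mass assignment below are hypothetical illustrations, not the paper's challenge problems.

```python
# Minimal sketch of Dempster-Shafer belief and plausibility; the mass
# assignment over the frame {a, b, c} is hypothetical.
def belief(masses, event):
    """Bel(A): total mass of focal elements contained in the event."""
    return sum(m for focal, m in masses.items() if set(focal) <= set(event))

def plausibility(masses, event):
    """Pl(A): total mass of focal elements intersecting the event."""
    return sum(m for focal, m in masses.items() if set(focal) & set(event))

# Hypothetical basic probability assignment:
m = {frozenset("a"): 0.5, frozenset("ab"): 0.3, frozenset("abc"): 0.2}
event = {"a"}
print(belief(m, event), plausibility(m, event))  # 0.5 1.0
```

    Belief and plausibility bracket the (unknown) probability of the event, which is why the paper recommends computing them alongside Bayesian probabilities: the width of the interval reveals how much imprecision in the evidence matters.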

  2. Compositional models and conditional independence in evidence theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Vejnarová, Jiřina

    2011-01-01

    Vol. 52, No. 3 (2011), pp. 316-334. ISSN 0888-613X. Institutional research plan: CEZ:AV0Z10750506. Keywords: evidence theory; conditional independence; multidimensional models. Subject RIV: BA - General Mathematics. Impact factor: 1.948 (2011). http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf

  3. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    Science.gov (United States)

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-18

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient for combining evidence from different sensors. However, when the evidence is highly conflicting, it may produce counterintuitive results. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
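
    The counterintuitive behavior of Dempster's rule under highly conflicting reports, which motivates the weighted-averaging correction above, can be reproduced with Zadeh's classic example; the two sensor reports below are illustrative, not the paper's data.

```python
# Sketch of Dempster's rule of combination and the classic counterintuitive
# result under highly conflicting evidence (Zadeh's example).
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (f1, v1), (f2, v2) in product(m1.items(), m2.items()):
        inter = f1 & f2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2  # mass that falls on the empty set
    # Normalize by the non-conflicting mass (1 - K).
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
m1 = {A: 0.99, B: 0.01}          # sensor 1: fault A almost certain
m2 = {B: 0.01, C: 0.99}          # sensor 2: fault C almost certain
print(dempster_combine(m1, m2))  # all mass collapses onto fault B
```

    Here all mass ends up on the fault both sensors considered nearly impossible; weighting each report by its dynamic reliability before combination, as the paper proposes, avoids this collapse.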

  4. Selection Bias in Educational Transition Models: Theory and Empirical Evidence

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads

    Most studies using Mare’s (1980, 1981) seminal model of educational transitions find that the effect of family background decreases across transitions. Recently, Cameron and Heckman (1998, 2001) have argued that the “waning coefficients” in the Mare model are driven by selection on unobserved...... the United States, United Kingdom, Denmark, and the Netherlands shows that when we take selection into account the effect of family background variables on educational transitions is largely constant across transitions. We also discuss several difficulties in estimating educational transition models which...

  5. A review of the evidence linking adult attachment theory and chronic pain: presenting a conceptual model.

    Science.gov (United States)

    Meredith, Pamela; Ownsworth, Tamara; Strong, Jenny

    2008-03-01

    It is now well established that pain is a multidimensional phenomenon, affected by a gamut of psychosocial and biological variables. According to diathesis-stress models of chronic pain, some individuals are more vulnerable to developing disability following acute pain because they possess particular psychosocial vulnerabilities which interact with physical pathology to impact negatively upon outcome. Attachment theory, a theory of social and personality development, has been proposed as a comprehensive developmental model of pain, implicating individual adult attachment pattern in the ontogenesis and maintenance of chronic pain. The present paper reviews and critically appraises studies which link adult attachment theory with chronic pain. Together, these papers offer support for the role of insecure attachment as a diathesis (or vulnerability) for problematic adjustment to pain. The Attachment-Diathesis Model of Chronic Pain developed from this body of literature, combines adult attachment theory with the diathesis-stress approach to chronic pain. The evidence presented in this review, and the associated model, advances our understanding of the developmental origins of chronic pain conditions, with potential application in guiding early pain intervention and prevention efforts, as well as tailoring interventions to suit specific patient needs.

  6. Improving Rolling Bearing Fault Diagnosis by DS Evidence Theory Based Fusion Model

    Directory of Open Access Journals (Sweden)

    Xuemei Yao

    2017-01-01

    Rolling bearings play an important role in rotating machinery, and their working condition directly affects equipment efficiency. While dozens of methods have been proposed for real-time bearing fault diagnosis and monitoring, the fault classification accuracy of existing algorithms is still not satisfactory. This work presents a novel algorithm fusion model based on principal component analysis and Dempster-Shafer evidence theory for rolling bearing fault diagnosis. It combines the advantages of the learning vector quantization (LVQ) neural network model and the decision tree model. Experiments under three different bearing rotation speeds and two different crack sizes show that the fusion model has better performance and higher accuracy than either of the base classification models for rolling bearing fault diagnosis, which is achieved via synergic prediction from both types of models.
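
    A D-S fusion of two base classifiers can be sketched as follows: each classifier's output is treated as a mass function, discounted by an assumed reliability, and combined with Dempster's rule. The fault classes, reliabilities, and masses below are hypothetical, not the paper's experimental values.

```python
# Illustrative sketch of fusing two base classifiers (an LVQ network and a
# decision tree in the paper) via reliability discounting and Dempster's rule.
from itertools import product

def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for (f1, v1), (f2, v2) in product(m1.items(), m2.items()):
        inter = f1 & f2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

def discount(m, alpha, frame):
    """Shafer discounting: keep a fraction alpha of each mass and move the
    remainder to the whole frame (total ignorance)."""
    out = {f: alpha * v for f, v in m.items()}
    out[frame] = out.get(frame, 0.0) + 1.0 - alpha
    return out

frame = frozenset({"normal", "inner", "outer"})  # hypothetical fault classes
lvq  = {frozenset({"inner"}): 0.7, frozenset({"outer"}): 0.3}
tree = {frozenset({"inner"}): 0.6, frame: 0.4}
fused = dempster_combine(discount(lvq, 0.9, frame), discount(tree, 0.8, frame))
best = max(fused, key=fused.get)
print(best)  # the fused evidence favors the 'inner' fault class
```

    Discounting lets a less reliable classifier contribute less to the fused decision, which is one way such a fusion model can outperform either base classifier alone.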

  7. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  8. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
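
    A minimal sketch of the sampling-based strategy: each focal element (an interval with a basic probability assignment) is sampled to estimate whether the model output satisfies a condition over all of the element (contributing to belief) or part of it (contributing to plausibility). The model f, the intervals, and the masses below are hypothetical stand-ins, not those of the report.

```python
# Sketch of sampling-based propagation of an evidence-theory input through a
# model y = f(x); focal elements and masses are hypothetical.
import random
random.seed(0)

def f(x):
    return x * x  # stand-in for an expensive computational model

# Focal elements: intervals for x with basic probability assignments.
focal = [((0.0, 1.0), 0.4), ((0.5, 2.0), 0.4), ((1.5, 3.0), 0.2)]

threshold, n = 2.0, 1000
bel = pl = 0.0
for (lo, hi), m in focal:
    ys = [f(random.uniform(lo, hi)) for _ in range(n)]
    if max(ys) <= threshold:  # whole focal element satisfies y <= t: belief
        bel += m
    if min(ys) <= threshold:  # part of the focal element satisfies it: plausibility
        pl += m
print(bel, pl)
```

    The sampled extrema estimate the min/max of f over each focal element; for expensive models this is where the computational cost concentrates, which is the obstacle the presented strategy addresses.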

  9. Linguistic inter-understanding gives evidence in favor of the mental models theory: Induction and comprehension

    Directory of Open Access Journals (Sweden)

    Miguel López-Astorga

    2017-08-01

    Linguistic inter-understanding is a well-known communicative phenomenon that has been studied in detail. It basically consists of the fact that an individual speaking one language is able to understand another person speaking a different language, without deep knowledge of that language or the ability to express himself or herself in it. The phenomenon, which is especially frequent in the case of very similar languages, occurs because of certain inferential processes that can happen in the human mind when people try to interpret information in a distinct language. The main aim of this paper is to show that such processes are very akin to some of those that the mental models theory attributes to the human reasoning ability, and that linguistic inter-understanding can therefore be considered evidence that the latter theory is, at least partially, correct.

  10. Towards theory integration: Threshold model as a link between signal detection theory, fast-and-frugal trees and evidence accumulation theory.

    Science.gov (United States)

    Hozo, Iztok; Djulbegovic, Benjamin; Luan, Shenghua; Tsalatsanis, Athanasios; Gigerenzer, Gerd

    2017-02-01

    Theories of decision making are divided between those aiming to help decision makers in the real, 'large' world and those who study decisions in idealized 'small' world settings. For the most part, these large- and small-world decision theories remain disconnected. We linked the small-world decision theoretic concepts of signal detection theory (SDT) and evidence accumulation theory (EAT) to the threshold model and the large world of heuristic decision making that relies on fast-and-frugal decision trees (FFT). We connected these large- and small-world theories by demonstrating that seemingly different decision-making concepts are actually equivalent. In doing so, we were able (1) to link the threshold model to EAT and FFT, thereby creating decision criteria that take into account both the classification accuracy of FFT and the consequences built in the threshold model; (2) to demonstrate how threshold criteria can be used as a strategy for optimal selection of cues when constructing FFT; and (3) to show that the compensatory strategy expressed in the threshold model can be linked to a non-compensatory FFT approach to decision making. We also showed how construction and performance of FFT depend on having reliable information - the results were highly sensitive to the estimates of benefits and harms of health interventions. We illustrate the practical usefulness of our analysis by describing an FFT we developed for prescribing statins for primary prevention of cardiovascular disease. By linking SDT and EAT to the compensatory threshold model and to non-compensatory heuristic decision making (FFT), we showed how these two decision strategies are ultimately linked within a broader theoretical framework and thereby respond to calls for integrating decision theory paradigms. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
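
    A fast-and-frugal tree is simply a sequence of cues in which every cue but the last offers an immediate exit. The sketch below mimics the structure of a statin-prescribing FFT like the one the paper describes, but the cue names and cut-offs are invented for illustration.

```python
# Hypothetical fast-and-frugal tree (FFT): each non-final cue has one exit
# branch; cues and thresholds are illustrative, not the paper's.
def fft_prescribe(age, ldl, smoker):
    if age >= 70:
        return True       # first cue exits with "treat"
    if ldl < 130:
        return False      # second cue exits with "do not treat"
    return smoker         # final cue decides both ways

print(fft_prescribe(72, 100, False))  # True
print(fft_prescribe(50, 120, True))   # False
```

    Because each cue is consulted at most once and no cue can compensate for another, the tree is non-compensatory; the paper's contribution is to show how threshold criteria from the compensatory small-world theories can guide the choice and ordering of such cues.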

  11. Whither nursing models? The value of nursing theory in the context of evidence-based practice and multidisciplinary health care.

    Science.gov (United States)

    McCrae, Niall

    2012-01-01

    This paper presents a discussion of the role of nursing models and theory in the modern clinical environment. Models of nursing have had limited success in bridging the gap between theory and practice. The discussion draws on literature on nursing models and theory since the 1950s, from health and social care databases. Arguments against nursing theory are challenged. In the current context of multidisciplinary services and the doctrine of evidence-based practice, a unique theoretical standpoint comprising the art and science of nursing is more relevant than ever. A theoretical framework should reflect the eclectic, pragmatic practice of nursing. Nurse educators and practitioners should embrace theory-based practice as well as evidence-based practice. © 2011 The Author. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  12. Evidence-based practice - an incomplete model of the relationship between theory and professional work.

    Science.gov (United States)

    Hancock, Helen C; Easen, Patrick R

    2004-05-01

    Current day realities of diminishing resources, reductions in spending and organizational change within health care systems have resulted in an increased emphasis on a multidisciplinary team approach to quality patient care. The movement of nursing towards more autonomous practice combined with the current trend towards 'evidence-based practice' in health care demands increased accountability in clinical decision making. This paper focuses on one aspect of nurses' clinical decision making within the demands of evidence-based practice and cardiac surgery. In this field recent advances, combined with increasing demands on health care institutions, have promoted early extubation of post-operative cardiac patients. While this remains a medical role in many institutions, an increasing number of intensive care units now consider it as a nursing role. This paper explores the realities of nurses' clinical decision making through a discussion of current practice in the extubation of patients following cardiac surgery. In addition, it considers the implications of current practice for both nurse education and the continued development of clinical nursing practice. The findings indicate that evidence-based practice appears to be an incomplete model of the relationship between theory and professional work.

  13. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  14. Theory- and Evidence- Based Intervention: Practice-Based Evidence--Integrating Positive Psychology into a Clinical Psychological Assessment and Intervention Model and How to Measure Outcome

    Science.gov (United States)

    Nissen, Poul

    2011-01-01

    In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence- based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning, which includes recognition of the influence of community, school, peers, family and the…

  15. Using the Dynamic Model to develop an evidence-based and theory-driven approach to school improvement

    NARCIS (Netherlands)

    Creemers, B.P.M.; Kyriakides, L.

    2010-01-01

    This paper refers to a dynamic perspective of educational effectiveness and improvement stressing the importance of using an evidence-based and theory-driven approach. Specifically, an approach to school improvement based on the dynamic model of educational effectiveness is offered. The recommended

  16. Theories of learning: models of good practice for evidence-based information skills teaching.

    Science.gov (United States)

    Spring, Hannah

    2010-12-01

    This feature considers models of teaching and learning and how these can be used to support evidence based practice. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.

  17. The Attending Nurse Caring Model: integrating theory, evidence and advanced caring-healing therapeutics for transforming professional practice.

    Science.gov (United States)

    Watson, Jean; Foster, Roxie

    2003-05-01

    This paper presents a proposed model: The Attending Nursing Caring Model (ANCM) as an exemplar for advancing and transforming nursing practice within a reflective, theoretical and evidence-based context. Watson's theory of human caring is used as a guide for integrating theory, evidence and advanced therapeutics in the area of children's pain. The ANCM is offered as a programme for renewing the profession and its professional practices of caring-healing arts and science, during an era of decline, shortages, and crises in care, safety, and hospital and health reform. The ANCM elevates contemporary nursing's caring values, relationships, therapeutics and responsibilities to a higher/deeper order of caring science and professionalism, intersecting with other professions, while sustaining the finest of its heritage and traditions of healing.

  18. Theory- and evidence-based Intervention

    DEFF Research Database (Denmark)

    Nissen, Poul

    2011-01-01

    In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence-based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning which includes recognition of the influence of community, school, peers, family and the functional and structural domains of personality at the behavioural, phenomenological, intra-psychic and biophysical level in a dialectical process. One important aspect of the theoretical basis for presentation of this model is that the child...

  19. A Novel Evaluation Model for Hybrid Power System Based on Vague Set and Dempster-Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Dongxiao Niu

    2012-01-01

    Because clean energy and traditional energy have different advantages and disadvantages, it is of great significance to evaluate the comprehensive benefits of hybrid power systems. Based on a thorough analysis of the important characteristics of hybrid power systems, an index system including security, economic benefit, environmental benefit, and social benefit is established in this paper. Owing to its advantages in processing abundant uncertain and fuzzy information, vague set theory is used to determine the decision matrix. The vague decision matrix is converted to a real one by the vague combination rule, and the uncertain degrees of the different indexes are determined by grey incidence analysis; the mass functions of the different comment sets for each index are then obtained. Information is fused in accordance with the Dempster-Shafer (D-S) combination rule, and the evaluation result is obtained by combining vague sets with D-S evidence theory. A simulation of a hybrid power system including thermal power, wind power, and photovoltaic power in China is provided to demonstrate the effectiveness and potential of the proposed design scheme. The uncertainties in decision making can be dramatically decreased compared with existing methods in the literature. The actual implementation results illustrate that the proposed index system and evaluation model based on vague sets and D-S evidence theory are effective and practical for evaluating the comprehensive benefit of hybrid power systems.

  20. Market price of risk specifications for affine models: Theory and evidence

    OpenAIRE

    Filipovic, Damir; Cheridito, P.; Kimmel, R. L.

    2007-01-01

    We extend the standard specification of the market price of risk for affine yield models, and apply it to U.S. Treasury data. Our specification often provides better fit, sometimes with very high statistical significance. The improved fit comes from the time-series rather than cross-sectional features of the yield curve. We derive conditions under which our specification does not admit arbitrage opportunities. The extension has extremely strong statistical significance for affine yield models...

  1. Evaluating the adoption of evidence-based practice using Rogers’s diffusion of innovation theory: a model testing study

    Science.gov (United States)

    Mohammadi, Mohammad Mehdi; Poursaberi, Roghayeh; Salahshoor, Mohammad Reza

    2018-01-01

    Background: Despite the emergence and development of evidence-based practice (EBP) in recent years, its adoption continues to be limited. This study used Rogers’s diffusion of innovation theory to identify the factors that advance EBP adoption, determine the process by which such adoption occurs, and develop an EBP adoption model. Methods: This descriptive correlational study with a model testing design was conducted in 2015. Data were collected from 482 individuals (322 nurses and 160 nursing students) using a demographic information questionnaire, a standard scale for the perception of EBP attributes, an EBP scale, and an individual innovation inventory. The relationships between variables were examined by path analysis. Results: The results showed that EBP adoption had a significant positive relationship with individual innovation (r = 0.578, P < 0.001). EBP adoption was influenced by various factors, such as individual innovation, attitude, knowledge, and the perception of EBP attributes. Among these factors, attitude had the greatest effect on EBP adoption. The findings can serve as a guide for the identification of factors that effectively influence EBP adoption. They can also be used as bases for the design of training programs intended to enhance the adoption of EBP. PMID:29423359

  2. Evaluating the adoption of evidence-based practice using Rogers's diffusion of innovation theory: a model testing study.

    Science.gov (United States)

    Mohammadi, Mohammad Mehdi; Poursaberi, Roghayeh; Salahshoor, Mohammad Reza

    2018-01-01

    Background: Despite the emergence and development of evidence-based practice (EBP) in recent years, its adoption continues to be limited. This study used Rogers's diffusion of innovation theory to identify the factors that advance EBP adoption, determine the process by which such adoption occurs, and develop an EBP adoption model. Methods: This descriptive correlational study with a model testing design was conducted in 2015. Data were collected from 482 individuals (322 nurses and 160 nursing students) using a demographic information questionnaire, a standard scale for the perception of EBP attributes, an EBP scale, and an individual innovation inventory. The relationships between variables were examined by path analysis. Results: The results showed that EBP adoption had a significant positive relationship with individual innovation (r = 0.578, P < 0.001). EBP adoption was influenced by various factors, such as individual innovation, attitude, knowledge, and the perception of EBP attributes. Among these factors, attitude had the greatest effect on EBP adoption. The findings can serve as a guide for the identification of factors that effectively influence EBP adoption. They can also be used as bases for the design of training programs intended to enhance the adoption of EBP.

  3. Evaluating the adoption of evidence-based practice using Rogers’s diffusion of innovation theory: a model testing study

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdi Mohammadi

    2018-01-01

    Background: Despite the emergence and development of evidence-based practice (EBP) in recent years, its adoption continues to be limited. This study used Rogers’s diffusion of innovation theory to identify the factors that advance EBP adoption, determine the process by which such adoption occurs, and develop an EBP adoption model. Methods: This descriptive correlational study with a model testing design was conducted in 2015. Data were collected from 482 individuals (322 nurses and 160 nursing students) using a demographic information questionnaire, a standard scale for the perception of EBP attributes, an EBP scale, and an individual innovation inventory. The relationships between variables were examined by path analysis. Results: The results showed that EBP adoption had a significant positive relationship with individual innovation (r = 0.578, P < 0.001), knowledge (r = 0.657, P < 0.001), attitude (r = 0.623, P < 0.001), and age (r = 0.357, P < 0.001). The findings of path analysis indicated that goodness of fit indices such as the goodness of fit index (GFI = 0.999), comparative fit index (CFI = 0.999), and root mean square error of approximation (RMSEA = 0.036) were in the ideal ranges. Knowledge (total effect = 0.309, P < 0.001), attitude (total effect = 0.372, P = 0.002), and work experience (total effect = 0.321, P = 0.003) had the highest coefficients in the model. Conclusion: The results suggested that EBP adoption was influenced by various factors, such as individual innovation, attitude, knowledge, and the perception of EBP attributes. Among these factors, attitude had the greatest effect on EBP adoption. The findings can serve as a guide for the identification of factors that effectively influence EBP adoption. They can also be used as bases for the design of training programs intended to enhance the adoption of EBP.

  4. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  5. Integrating Norm Activation Model and Theory of Planned Behavior to Understand Sustainable Transport Behavior: Evidence from China

    Directory of Open Access Journals (Sweden)

    Yuwei Liu

    2017-12-01

    With increasing urbanization in China, many cities are facing serious environmental problems due to continuous and substantial increase in automobile transportation. It is becoming imperative to examine effective ways to reduce individual automobile use to facilitate sustainable transportation behavior. Empirical, theory-based research on sustainable transportation in China is limited. In this research, we propose an integrated model based on the norm activation model and the theory of planned behavior by combining normative and rational factors to predict individuals’ intention to reduce car use. Data from a survey of 600 car drivers in China’s three metropolitan areas was used to test the proposed model and hypotheses. Results showed that three variables, perceived norm of car-transport reduction, attitude towards reduction, and perceived behavior control over car-transport reduction, significantly affected the intention to reduce car-transport. Personal norms mediated the relationship between awareness of consequences of car-transport, ascription of responsibility of car-transport, perceived subjective norm for car-transport reduction, and intention to reduce car-transport. The results of this research not only contribute to theory development in the area of sustainable transportation behavior, but also provide a theoretical frame of reference for relevant policy-makers in urban transport management.

  6. Integrating Norm Activation Model and Theory of Planned Behavior to Understand Sustainable Transport Behavior: Evidence from China.

    Science.gov (United States)

    Liu, Yuwei; Sheng, Hong; Mundorf, Norbert; Redding, Colleen; Ye, Yinjiao

    2017-12-18

    With increasing urbanization in China, many cities are facing serious environmental problems due to continuous and substantial increase in automobile transportation. It is becoming imperative to examine effective ways to reduce individual automobile use to facilitate sustainable transportation behavior. Empirical, theory-based research on sustainable transportation in China is limited. In this research, we propose an integrated model based on the norm activation model and the theory of planned behavior by combining normative and rational factors to predict individuals' intention to reduce car use. Data from a survey of 600 car drivers in China's three metropolitan areas was used to test the proposed model and hypotheses. Results showed that three variables, perceived norm of car-transport reduction, attitude towards reduction, and perceived behavior control over car-transport reduction, significantly affected the intention to reduce car-transport. Personal norms mediated the relationship between awareness of consequences of car-transport, ascription of responsibility of car-transport, perceived subjective norm for car-transport reduction, and intention to reduce car-transport. The results of this research not only contribute to theory development in the area of sustainable transportation behavior, but also provide a theoretical frame of reference for relevant policy-makers in urban transport management.

  7. Critical evidence for the prediction error theory in associative learning.

    Science.gov (United States)

    Terao, Kanta; Matsumoto, Yukihisa; Mizunami, Makoto

    2015-03-10

    In associative learning in mammals, it is widely accepted that the discrepancy, or error, between actual and predicted reward determines whether learning occurs. Complete evidence for the prediction error theory, however, has not been obtained in any learning systems: Prediction error theory stems from the finding of a blocking phenomenon, but blocking can also be accounted for by other theories, such as the attentional theory. We demonstrated blocking in classical conditioning in crickets and obtained evidence to reject the attentional theory. To obtain further evidence supporting the prediction error theory and rejecting alternative theories, we constructed a neural model to match the prediction error theory, by modifying our previous model of learning in crickets, and we tested a prediction from the model: the model predicts that pharmacological intervention of octopaminergic transmission during appetitive conditioning impairs learning but not formation of reward prediction itself, and it thus predicts no learning in subsequent training. We observed such an "auto-blocking", which could be accounted for by the prediction error theory but not by other competitive theories to account for blocking. This study unambiguously demonstrates validity of the prediction error theory in associative learning.
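
    The blocking phenomenon at the heart of the prediction error theory can be simulated with a standard Rescorla-Wagner learning rule, in which the prediction error drives all updating; the learning rate and trial counts below are illustrative, not the study's cricket-conditioning parameters.

```python
# Sketch of blocking under a prediction-error (Rescorla-Wagner) learning rule;
# parameters are illustrative.
def train(trials, alpha=0.3, lam=1.0):
    V = {}  # associative strength per cue
    for cues, reward in trials:
        pred = sum(V.get(c, 0.0) for c in cues)
        error = (lam if reward else 0.0) - pred  # prediction error
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Phase 1: cue A alone is rewarded; Phase 2: the compound AB is rewarded.
trials = [(("A",), True)] * 20 + [(("A", "B"), True)] * 20
V = train(trials)
print(V["A"], V["B"])  # A near 1.0; B stays near 0: learning about B is blocked
```

    Because A already predicts the reward by the end of phase 1, the prediction error during compound trials is near zero and B acquires almost no strength; this is the error-driven account of blocking that the study's "auto-blocking" experiment supports over attentional explanations.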

  8. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  9. Facilitated Reflective Performance Feedback: Developing an Evidence- and Theory-Based Model That Builds Relationship, Explores Reactions and Content, and Coaches for Performance Change (R2C2).

    Science.gov (United States)

    Sargeant, Joan; Lockyer, Jocelyn; Mann, Karen; Holmboe, Eric; Silver, Ivan; Armson, Heather; Driessen, Erik; MacLeod, Tanya; Yen, Wendy; Ross, Kathryn; Power, Mary

    2015-12-01

    To develop and conduct feasibility testing of an evidence-based and theory-informed model for facilitating performance feedback for physicians so as to enhance their acceptance and use of the feedback. To develop the feedback model (2011-2013), the authors drew on earlier research which highlights not only the factors that influence giving, receiving, accepting, and using feedback but also the theoretical perspectives which enable the understanding of these influences. The authors undertook an iterative, multistage, qualitative study guided by two recognized research frameworks: the UK Medical Research Council guidelines for studying complex interventions and realist evaluation. Using these frameworks, they conducted the research in four stages: (1) modeling, (2) facilitator preparation, (3) model feasibility testing, and (4) model refinement. They analyzed data, using content and thematic analysis, and used the findings from each stage to inform the subsequent stage. Findings support the facilitated feedback model, its four phases-build relationship, explore reactions, explore content, coach for performance change (R2C2)-and the theoretical perspectives informing them. The findings contribute to understanding elements that enhance recipients' engagement with, acceptance of, and productive use of feedback. Facilitators reported that the model made sense and the phases generally flowed logically. Recipients reported that the feedback process was helpful and that they appreciated the reflection stimulated by the model and the coaching. The theory- and evidence-based reflective R2C2 Facilitated Feedback Model appears stable and helpful for physicians in facilitating their reflection on and use of formal performance assessment feedback.

  10. Some empirical evidence for ecological dissonance theory.

    Science.gov (United States)

    Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E

    2000-04-01

    Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions which were tested in 10 studies. This summary of the studies suggests operationally defined measures of ecological dissonance may correlate with workers' satisfaction with their jobs, involvement with their jobs, alienation from their work, and to a lesser extent, workers' conflict resolution behavior and communication style.

  11. Evidence theory and differential evolution based uncertainty ...

    Indian Academy of Sciences (India)

    Evidence theory and differential evolution based uncertainty quantification for buckling load of semi-rigid jointed frames ... State Key Laboratory for Disaster Reduction in Civil Engineering, Tongji University, Shanghai 200092, China; Research Institute of Structural Engineering and Disaster Reduction, College of Civil ...

  12. Intuitive Expertise: Theories and Empirical Evidence

    Science.gov (United States)

    Harteis, Christian; Billett, Stephen

    2013-01-01

    Intuition has been long seen as an element of effective human performance in demanding tasks (i.e. expertise). But its form, constitutive elements and development remain subject to diverse explanations. This paper discusses these elements and explores theories and empirical evidence about what constitutes intuitive expertise, and offers an account…

  13. Anomalous Evidence, Confidence Change, and Theory Change.

    Science.gov (United States)

    Hemmerich, Joshua A; Van Voorhis, Kellie; Wiley, Jennifer

    2016-08-01

    A novel experimental paradigm that measured theory change and confidence in participants' theories was used in three experiments to test the effects of anomalous evidence. Experiment 1 varied the amount of anomalous evidence to see if "dose size" made incremental changes in confidence toward theory change. Experiment 2 varied whether anomalous evidence was convergent (of multiple types) or replicating (similar finding repeated). Experiment 3 varied whether participants were provided with an alternative theory that explained the anomalous evidence. All experiments showed that participants' confidence changes were commensurate with the amount of anomalous evidence presented, and that larger decreases in confidence predicted theory changes. Convergent evidence and the presentation of an alternative theory led to larger confidence change. Convergent evidence also caused more theory changes. Even when people do not change theories, factors pertinent to the evidence and alternative theories decrease their confidence in their current theory and move them incrementally closer to theory change. Copyright © 2015 Cognitive Science Society, Inc.

  14. Local computations in Dempster-Shafer theory of evidence

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2012-01-01

    Roč. 53, č. 8 (2012), s. 1155-1167 ISSN 0888-613X Grant - others:GA ČR(CZ) GAP403/12/2175 Program:GA Institutional support: RVO:67985556 Keywords : Discrete belief functions * Dempster-Shafer theory * conditional independence * decomposable model Subject RIV: IN - Informatics, Computer Science Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf

  15. Demographic evidence for adaptive theories of aging.

    Science.gov (United States)

    Mitteldorf, J J

    2012-07-01

    Pleiotropic theories for the evolutionary origins of senescence have been ascendant for forty years (see, for example, G. Williams (1957) Evolution, 11, 398-411; T. Kirkwood (1977) Nature, 270, 301-304), and it is not surprising that interpreters of demographic data seek to frame their results in this context. But some of that evidence finds a much more natural explanation in terms of adaptive aging. Here we re-interpret the 1997 results of the Centenarian Study in Boston, which found in their sample of centenarian women an excess of late childbearing. The finding was originally interpreted as a selection effect: a metabolic link between late menopause and longevity. But we demonstrate that this interpretation is statistically strained, and that the data in fact indicate a causal link: bearing a child late in life induces a metabolic response that promotes longevity. This conclusion directly contradicts some pleiotropic theories of aging that postulate a "cost of reproduction", and it supports theories of aging as an adaptive genetic program.

  16. Stock portfolio selection using Dempster–Shafer evidence theory

    Directory of Open Access Journals (Sweden)

    Gour Sundar Mitra Thakur

    2018-04-01

    Markowitz’s return–risk model for stock portfolio selection is based on the historical return data of assets. In addition to the effect of historical return, there are many other critical factors which directly or indirectly influence the stock market. We use the fuzzy Delphi method to identify the critical factors initially. Factors having lower correlation coefficients are then retained for further analysis. The critical factors and historical data are used to apply Dempster–Shafer evidence theory to rank the stocks. Then, a portfolio selection model that prefers stocks with higher rank is proposed. Illustration is done using stocks under the Bombay Stock Exchange (BSE). Simulation is done by Ant Colony Optimization. The performance of the outcome is found satisfactory when compared with recent performance of the assets. Keywords: Stock portfolio selection, Ranking, Dempster–Shafer evidence theory, Ant Colony Optimization, Fuzzy Delphi method

  17. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  18. Epithelial separation theory for post-tonsillectomy secondary hemorrhage: evidence in a mouse model and potential heparin-binding epidermal growth factor-like growth factor therapy.

    Science.gov (United States)

    Beswick, Daniel M; Santa Maria, Chloe; Ayoub, Noel F; Capasso, Robson; Santa Maria, Peter Luke

    2018-02-01

    To provide histological evidence to investigate a theory for post-tonsillectomy secondary hemorrhage (PTH) in a mouse model and to evaluate the potential for heparin-binding epidermal growth factor-like growth factor (HB-EGF) treatment on wound healing in this model. A prospective randomized single-blinded cohort study. A uniform tongue wound was created in 84 mice (day 0). Mice were randomized to HB-EGF (treatment, n = 42) or saline (control, n = 42). In treatment mice, HB-EGF 5 µg/ml was administered intramuscularly into the wound daily (days 0-14). In control mice, normal saline was administered daily. Three mice from each group were sacrificed daily through day 14 and the wounds evaluated histologically by blinded reviewers. Key stages of wound healing, including keratinocyte proliferation and migration, wound contraction, epithelial separation, and neoangiogenesis, are defined with implications for post-tonsillectomy wound healing. Epithelial separation (59 vs. 100%, p = 0.003) and wound reopening (8 vs. 48%, p < 0.001) were reduced with HB-EGF. Epithelial thickness (220 vs. 30 µm, p = 0.04) was greater with HB-EGF. Wound closure (days 4-5 vs. day 6, p = 0.01) occurred earlier with HB-EGF. In healing of oral keratinocytes on muscle, epithelial separation secondary to muscle contraction occurs concurrently with neoangiogenesis in the base of the wound, increasing the risk of hemorrhage. This potentially explains why post-tonsillectomy secondary hemorrhage occurs and its timing. HB-EGF-treated wounds showed greater epithelial thickness, less frequent epithelial separation and wound reopening, and earlier wound closure prior to neovascularization, suggesting that HB-EGF may be a potential preventative therapy for PTH. NA-animal studies or basic research.

  19. Return migration: theory and empirical evidence

    OpenAIRE

    Dustmann, C.; Weiss, Y.

    2007-01-01

    In this paper we discuss forms of migration that are non-permanent. We focus on temporary migrations where the decision to return is taken by the immigrant. These migrations are likely to be frequent, and we provide some evidence for the UK. We then develop a simple model which rationalizes the decision of a migrant to return to his home country, despite a persistently higher wage in the host country. We consider three motives for a temporary migration: Differences in relative prices in host-...

  20. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  1. Alternative banking: theory and evidence from Europe

    Directory of Open Access Journals (Sweden)

    Kurt Von Mettenheim

    2012-12-01

    Since financial liberalization in the 1980s, non-profit maximizing, stakeholder-oriented banks have outperformed private banks in Europe. This article draws on empirical research, banking theory and theories of the firm to explain this apparent anomaly for neo-liberal policy and contemporary market-based banking theory. The realization of competitive advantages by alternative banks (savings banks, cooperative banks and development banks) has significant implications for conceptions of bank change, regulation and political economy.

  2. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  3. Book Review: Market Liquidity: Theory, Evidence, and Policy

    DEFF Research Database (Denmark)

    Boscan, Luis

    2014-01-01

    Review of: Market Liquidity: Theory, Evidence, and Policy / by Thierry Foucault, Marco Pagano and Ailsa Röell. Oxford University Press. April 2013.

  4. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory. Many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of making rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction of IF evidence theory. Axiomatic properties of probability distribution are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
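
    For the classical (non-intuitionistic) case, the best-known way to turn a belief function into a probability distribution for decision making is Smets' pignistic transformation, which splits each focal set's mass evenly among its elements. A minimal sketch under that assumption (the paper's IF-specific construction is not reproduced; the mass values below are illustrative):

```python
def pignistic(mass):
    """BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
    betp = {}
    for A, m in mass.items():
        share = m / len(A)  # split the focal set's mass evenly
        for x in A:
            betp[x] = betp.get(x, 0.0) + share
    return betp

# Illustrative mass function: 0.6 of the mass is ambiguous between a and b.
m = {
    frozenset({"a"}): 0.4,
    frozenset({"a", "b"}): 0.6,
}
p = pignistic(m)
# p["a"] = 0.4 + 0.3 = 0.7, p["b"] = 0.3
```

The resulting BetP always sums to 1 over the frame, so it can be used directly for expected-utility style decisions even though the underlying belief function could not.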

  5. Trade and Turnover: Theory and Evidence

    OpenAIRE

    Carl Davidson; Steven Matusz

    2005-01-01

    Is the pattern of trade correlated with cross-sector differences in job turnover? Theoretically, external shocks feed through to changes in domestic employment and cross-sector differences in turnover give rise to compensating wage differentials, which feed through to output prices. Using two different data sets on turnover, we find strong evidence that normalized US net exports by sector are negatively correlated with job destruction and worker separation rates. Weaker evidence suggests a po...

  6. GARCH Option Valuation: Theory and Evidence

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Jacobs, Kris; Ornthanalai, Chayawat

    for empirical implementation are laid out and we also discuss the links between GARCH and stochastic volatility models. In the appendix we provide Matlab computer code for option pricing via Monte Carlo simulation for nonaffine models as well as Fourier inversion for affine models....

  7. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems
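
    The belief and plausibility measures referred to above are the standard set functions of evidence theory. A minimal pure-Python sketch (the frame and mass values are illustrative, not from the paper):

```python
def belief(mass, A):
    """Bel(A): total mass committed to subsets of A (lower bound)."""
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(mass, A):
    """Pl(A): total mass of focal sets that intersect A (upper bound)."""
    return sum(m for B, m in mass.items() if B & A)

# Illustrative body of evidence over the frame {safe, unsafe};
# 30% of the mass stays uncommitted (epistemic ignorance).
m = {
    frozenset({"safe"}): 0.5,
    frozenset({"unsafe"}): 0.2,
    frozenset({"safe", "unsafe"}): 0.3,
}
A = frozenset({"safe"})
interval = (belief(m, A), plausibility(m, A))  # (0.5, 0.8)
```

The gap between Bel and Pl is exactly the imprecision that evidence theory preserves, and it is this interval-valued, discontinuous character that motivates the surrogate models used in the optimization above.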

  8. Children balance theories and evidence in exploration, explanation, and learning

    OpenAIRE

    Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J.P.; Friel, Daniel; Schulz, Laura E.

    2011-01-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who observed identical evidence explored the block differently depending on their beliefs. When the block was balanced at its geometric center (belief-...

  9. Children Balance Theories and Evidence in Exploration, Explanation, and Learning

    Science.gov (United States)

    Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J. P.; Friel, Daniel; Schulz, Laura

    2012-01-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who…

  10. Collateral and the limits of debt capacity: theory and evidence

    NARCIS (Netherlands)

    Giambona, E.; Mello, A.S.; Riddiough, T.

    2012-01-01

    This paper considers how collateral is used to finance a going concern, and demonstrates with theory and evidence that there are effective limits to debt capacity and the kinds of claims that are issued to deploy that debt capacity. The theory shows that firms with (unobservably) better quality

  11. Physics Without Causality — Theory and Evidence

    Science.gov (United States)

    Shoup, Richard

    2006-10-01

    The principle of cause and effect is deeply rooted in human experience, so much so that it is routinely and tacitly assumed throughout science, even by scientists working in areas where time symmetry is theoretically ingrained, as it is in both classical and quantum physics. Experiments are said to cause their results, not the other way around. In this informal paper, we argue that this assumption should be replaced with a more general notion of mutual influence — bi-directional relations or constraints on joint values of two or more variables. From an analysis based on quantum entropy, it is proposed that quantum measurement is a unitary three-interaction, with no collapse, no fundamental randomness, and no barrier to backward influence. Experimental results suggesting retrocausality are seen frequently in well-controlled laboratory experiments in parapsychology and elsewhere, especially where a random element is included. Certain common characteristics of these experiments give the appearance of contradicting well-established physical laws, thus providing an opportunity for deeper understanding and important clues that must be addressed by any explanatory theory. We discuss how retrocausal effects and other anomalous phenomena can be explained without major injury to existing physical theory. A modified quantum formalism can give new insights into the nature of quantum measurement, randomness, entanglement, causality, and time.

  12. The Confluence Model and Theory.

    Science.gov (United States)

    McCall, Robert B.

    1985-01-01

    Explains that from a prediction standpoint the confluence model is not very efficient. Very modest increments in accuracy are associated with family configuration variables once chronological age is covaried. Suggests that the major postulates of the theory be tested directly, within individuals and with longitudinal data. (Author/AS)

  13. Minisuperspace models in histories theory

    International Nuclear Information System (INIS)

    Anastopoulos, Charis; Savvidou, Ntina

    2005-01-01

    We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context

  14. Inequality, redistribution and growth : Theory and evidence

    NARCIS (Netherlands)

    Haile, D.

    2005-01-01

    From a macro-perspective, the thesis provides a political economic model that analyses the joint determination of inequality, corruption, taxation, education and economic growth in a dynamic environment. It demonstrates how redistributive taxation is affected by the distribution of wealth and

  15. Lifecycle marriage matching: Theory and Evidence

    OpenAIRE

    Aloysius Siow; Eugene Choo

    2007-01-01

    The estimated model shows that a concern for accumulating marriage-specific capital is quantitatively significant in generating positive assortative matching in spousal ages at marriage, gender differences in spousal ages at marriage, and a preference for early marriage. Gender variations in population supplies due to gender-specific mortality rates and entry cohort sizes have offsetting quantitative effects.

  16. Children balance theories and evidence in exploration, explanation, and learning.

    Science.gov (United States)

    Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J P; Friel, Daniel; Schulz, Laura

    2012-06-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who observed identical evidence explored the block differently depending on their beliefs. When the block was balanced at its geometric center (belief-violating to a Mass Theorist, but belief-consistent to a Center Theorist), Mass Theory children explored the block more, and Center Theory children showed the standard novelty preference; when the block was balanced at the center of mass, the pattern of results reversed. The No Theory children showed a novelty preference regardless of evidence. In Experiments 2 and 3, we follow-up on these findings, showing that both Mass and Center Theorists selectively and differentially appeal to auxiliary variables (e.g., a magnet) to explain evidence only when their beliefs are violated. We also show that children use the data to revise their predictions in the absence of the explanatory auxiliary variable but not in its presence. Taken together, these results suggest that children's learning is at once conservative and flexible; children integrate evidence, prior beliefs, and competing causal hypotheses in their exploration, explanation, and learning. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  18. Evidence Combination From an Evolutionary Game Theory Perspective.

    Science.gov (United States)

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

    Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as solution stability and convergence, have been mathematically proved as well.
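
    Dempster's rule of combination, and the kind of counter-intuitive result the paper addresses, can be sketched in a few lines (illustrative code; the ECR itself is not reproduced here):

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset focal
    elements) with Dempster's rule, normalizing out the conflict K."""
    combined = {}
    conflict = 0.0
    for B, p in m1.items():
        for C, q in m2.items():
            A = B & C
            if A:
                combined[A] = combined.get(A, 0.0) + p * q
            else:
                conflict += p * q  # mass on empty intersections
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Zadeh's classic paradox: two sources almost fully disagree, yet the
# barely supported hypothesis C receives essentially all combined mass.
m1 = {frozenset("A"): 0.99, frozenset("C"): 0.01}
m2 = {frozenset("B"): 0.99, frozenset("C"): 0.01}
fused = dempster_combine(m1, m2)
# fused[frozenset("C")] is approximately 1.0, despite both sources
# assigning C only 0.01 -- the counter-intuitive behavior at issue.
```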

  19. Measuring Financial Risk using Extreme Value Theory: evidence from Pakistan

    OpenAIRE

    Qayyum, Abdul; Nawaz, Faisal

    2010-01-01

    The purpose of this paper is to illustrate some methods of extreme value theory through an analysis of Pakistani financial data. It also introduces the fundamentals of extreme value theory as well as practical aspects of estimating and assessing financial models for tail-related risk measures.
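
    As one concrete instance of the tail-risk machinery such surveys cover, the Hill estimator recovers the tail index of a heavy-tailed loss distribution from its largest order statistics. A sketch on synthetic Pareto data (the paper's exact estimators and Pakistani return series are not reproduced):

```python
import math
import random

def hill_estimator(losses, k):
    """Hill estimate of the tail index alpha from the k largest
    observations of a positive, heavy-tailed sample."""
    x = sorted(losses, reverse=True)
    if not (0 < k < len(x)):
        raise ValueError("need 0 < k < sample size")
    # Mean log-excess of the top-k order statistics over the (k+1)-th.
    gamma = sum(math.log(x[i] / x[k]) for i in range(k)) / k
    return 1.0 / gamma  # tail index alpha = 1 / gamma

# Sanity check on a Pareto(alpha = 2) sample drawn by inverse CDF.
random.seed(0)
sample = [(1.0 - random.random()) ** (-1.0 / 2.0) for _ in range(5000)]
alpha_hat = hill_estimator(sample, k=200)  # close to 2 for this sample
```

The choice of k trades bias against variance; in practice one inspects a Hill plot of the estimate over a range of k before reading off a tail index.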

  20. Five roles for using theory and evidence in the design and testing of behavior change interventions.

    Science.gov (United States)

    Bartholomew, L Kay; Mullen, Patricia Dolan

    2011-01-01

    The prevailing wisdom in the field of health-related behavior change is that well-designed and effective interventions are guided by theory. Using the framework of intervention mapping, we describe and provide examples of how investigators can effectively select and use theory to design, test, and report interventions. We propose five roles for theory and evidence about theories: a) identification of behavior and determinants of behavior related to a specified health problem (i.e., the logic model of the problem); b) explication of a causal model that includes theoretical constructs for producing change in the behavior of interest (i.e., the logic model of change); c) selection of intervention methods and delivery of practical applications to achieve changes in health behavior; d) evaluation of the resulting intervention including theoretical mediating variables; and e) reporting of the active ingredients of the intervention together with the evaluation results. In problem-driven applied behavioral or social science, researchers use one or multiple theories, empiric evidence, and new research, both to assess a problem and to solve or prevent a problem. Furthermore, the theories for description of the problem may differ from the theories for its solution. In an applied approach, the main focus is on solving problems regarding health behavior change and improvement of health outcomes, and the criteria for success are formulated in terms of the problem rather than the theory. Resulting contributions to theory development may be quite useful, but they are peripheral to the problem-solving process.

  1. Children balance theories and evidence in exploration, explanation, and learning

    NARCIS (Netherlands)

    Bonawitz, E.B.; van Schijndel, T.J.P.; Friel, D.; Schulz, L.

    2012-01-01

    We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children,

  2. A theory of evidence for undeclared nuclear activities

    International Nuclear Information System (INIS)

    King, J.L.

    1995-01-01

    The IAEA has recently explored techniques to augment and improve its existing safeguards information systems as part of Program 93 + 2 in order to address the detection of undeclared activities. Effective utilization of information on undeclared activities requires a formulation of the relationship between the information being gathered and the resulting safeguards assurance. The process of safeguards is represented as the gathering of evidence to provide assurance that no undeclared activities take place. It is shown that the analysis of this process can be represented by a theory grounded in the Dempster-Shafer theory of evidence and the concept of possibility. This paper presents the underlying evidence theory required to support a new information system tool for the analysis of information with respect to undeclared activities. The Dempster-Shafer theory serves as the calculus for the combination of diverse sources of evidence, and when applied to safeguards information, provides a basis for interpreting the result of safeguards indicators and measurements -- safeguards assurance

  3. TIM Series: Theory, Evidence and the Pragmatic Manager

    Directory of Open Access Journals (Sweden)

    Steven Muegge

    2008-08-01

    Full Text Available On July 2, 2008, Steven Muegge from Carleton University delivered a presentation entitled "Theory, Evidence and the Pragmatic Manager". This section provides the key messages from the lecture. The scope of this lecture spanned several topics, including management decision making, forecasting and its limitations, the psychology of expertise, and the management of innovation.

  4. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is substantially enlarged and contains new results and additional sections in the different chapters as well as one new chapter.

  5. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Saleur, H.

    1988-01-01

    Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.

  6. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  7. Halo modelling in chameleon theories

    International Nuclear Information System (INIS)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations

  8. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
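    The sample-generation step described above can be sketched for the simplest stochastic process. The following minimal example is not from the report (the discretization scheme and parameter values are assumptions for illustration): it generates independent sample paths of standard Brownian motion by accumulating Gaussian increments, then uses Monte Carlo estimates to check that the terminal value W(T) has mean ≈ 0 and variance ≈ T.

```python
import math
import random

def brownian_paths(n_paths, n_steps, T=1.0, seed=0):
    """Generate independent sample paths of standard Brownian motion
    on [0, T] by summing independent Gaussian increments."""
    rng = random.Random(seed)
    dt = T / n_steps
    paths = []
    for _ in range(n_paths):
        w, path = 0.0, [0.0]
        for _ in range(n_steps):
            w += rng.gauss(0.0, math.sqrt(dt))  # increment ~ N(0, dt)
            path.append(w)
        paths.append(path)
    return paths

paths = brownian_paths(n_paths=2000, n_steps=100)

# Monte Carlo check of the model: W(T) should be ~ N(0, T)
terminal = [p[-1] for p in paths]
mean = sum(terminal) / len(terminal)
var = sum((x - mean) ** 2 for x in terminal) / len(terminal)
print(round(mean, 2), round(var, 2))
```

Each generated path can then serve as an input or boundary condition to a deterministic simulation code, as the report describes.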

  9. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  10. Optimizing nursing care by integrating theory-driven evidence-based practice.

    Science.gov (United States)

    Pipe, Teri Britt

    2007-01-01

    An emerging challenge for nursing leadership is how to convey the importance of both evidence-based practice (EBP) and theory-driven care in ensuring patient safety and optimizing outcomes. This article describes a specific example of a leadership strategy based on Rosswurm and Larrabee's model for change to EBP, which was effective in aligning the processes of EBP and theory-driven care.

  11. Short-run Exchange-Rate Dynamics: Theory and Evidence

    DEFF Research Database (Denmark)

    Carlson, John A.; Dahl, Christian Møller; Osler, Carol L.

    for designing exchange-rate models. This paper presents an optimizing model of short-run exchange-rate dynamics consistent with both the micro evidence and the macro evidence, the first such model of which we are aware. With respect to microeconomics, the model is consistent with the institutional structure...... of currency markets, it accurately reflects the constraints and objectives faced by the major participants, and it fits key stylized facts concerning returns and order flow. With respect to macroeconomics, the model is consistent with most of the major puzzles that have emerged under floating rates....

  12. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  13. Theory of Self- vs. Externally-Regulated Learning™: Fundamentals, Evidence, and Applicability

    Science.gov (United States)

    de la Fuente-Arias, Jesús

    2017-01-01

    The Theory of Self- vs. Externally-Regulated Learning™ has integrated the variables of SRL theory, the DEDEPRO model, and the 3P model. This new Theory has proposed: (a) in general, the importance of the cyclical model of individual self-regulation (SR) and of external regulation stemming from the context (ER), as two different and complementary variables, both in combination and in interaction; (b) specifically, in the teaching-learning context, the relevance of different types of combinations between levels of self-regulation (SR) and of external regulation (ER) in the prediction of self-regulated learning (SRL), and of cognitive-emotional achievement. This review analyzes the assumptions, conceptual elements, empirical evidence, benefits and limitations of SRL vs. ERL Theory. Finally, professional fields of application and future lines of research are suggested. PMID:29033872

  14. Minimal Model Theory for Log Surfaces

    OpenAIRE

    Fujino, Osamu

    2012-01-01

    We discuss the log minimal model theory for log surfaces. We show that the log minimal model program, the finite generation of log canonical rings, and the log abundance theorem for log surfaces hold true under assumptions weaker than the usual framework of the log minimal model theory.

  15. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state...... such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet...... below the saturation pressure. In addition, a tuned f-theory general model delivers accurate modeling of different kinds of light and heavy oils. Thus, the simplicity and stability of the f-theory general models make them a powerful tool for applications such as reservoir simulations, among others. (C...

  16. Dark energy observational evidence and theoretical models

    CERN Document Server

    Novosyadlyj, B; Shtanov, Yu; Zhuk, A

    2013-01-01

    The book elucidates the current state of the dark energy problem and presents the results of the authors, who work in this area. It describes the observational evidence for the existence of dark energy, the methods and results of constraining its parameters, the modeling of dark energy by scalar fields, space-times with extra spatial dimensions, especially Kaluza-Klein models, and braneworld models with a single extra dimension, as well as the problems of the positive definition of gravitational energy in General Relativity, energy conditions and the consequences of their violation in the presence of dark energy. This monograph is intended for science professionals, educators and graduate students specializing in general relativity, cosmology, field theory and particle physics.

  17. Qigong in Cancer Care: Theory, Evidence-Base, and Practice.

    Science.gov (United States)

    Klein, Penelope

    2017-01-12

    Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to widespread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle, mind/body exercise integral within Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL), fatigue, immune function and cortisol levels, and cognition for individuals with cancer. There is indirect, scientific evidence suggesting that qigong practice may positively influence cancer prevention and survival. No one Qigong exercise regimen has been established as superior. Effective protocols do have common elements: slow mindful exercise, easy to learn, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Widespread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and workforce capacity.

  18. Qigong in Cancer Care: Theory, Evidence-Base, and Practice

    Directory of Open Access Journals (Sweden)

    Penelope Klein

    2017-01-01

    Full Text Available Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to widespread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle, mind/body exercise integral within Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL), fatigue, immune function and cortisol levels, and cognition for individuals with cancer. There is indirect, scientific evidence suggesting that qigong practice may positively influence cancer prevention and survival. No one Qigong exercise regimen has been established as superior. Effective protocols do have common elements: slow mindful exercise, easy to learn, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Widespread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and workforce capacity.

  19. THEORIES AND MODELS OF FISCAL POLICY

    OpenAIRE

    Alina Georgiana SOLOMON

    2013-01-01

    Within the paper “Theories and models of fiscal policy” we made a presentation of the main fiscal theories elaborated by the renowned representatives of the Physiocratic School, of the classic English School and last, but not least, the neoclassic approach of the aforementioned in the vision of Frank Ramsey and Arthur Laffer. The purpose of the paper is to highlight which was the contribution of the fiscal theories and models formulated by scholars and to what extent they contributed to devel...

  20. Studying emotion theories through connectivity analysis: Evidence from generalized psychophysiological interactions and graph theory.

    Science.gov (United States)

    Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu

    2018-05-15

    Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  2. Bureaucratic Minimal Squawk Behavior: Theory and Evidence from Regulatory Agencies

    OpenAIRE

    Clare Leaver

    2009-01-01

    This paper argues that bureaucrats are susceptible to 'minimal squawk' behavior. I develop a simple model in which a desire to avoid criticism can prompt otherwise public-spirited bureaucrats to behave inefficiently. Decisions are taken to keep interest groups quiet and mistakes out of the public eye. The policy implications of this behavior are at odds with the received view that agencies should be structured to minimise the threat of 'capture'. I test between theories of bureaucratic beha...

  3. Testing consumer theory: evidence from a natural field experiment

    OpenAIRE

    Adena, Maja; Huck, Steffen; Rasul, Imran

    2017-01-01

    We present evidence from a natural field experiment designed to shed light on whether individual behavior is consistent with a neoclassical model of utility maximization subject to budget constraints. We do this through the lens of a field experiment on charitable giving. We find that the behavior of at least 80% of individuals, on both the extensive and intensive margins, can be rationalized within a standard neoclassical choice model in which individuals have preferences, defined over own c...

  4. Evidence accumulation as a model for lexical selection.

    Science.gov (United States)

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports), that are largely resultant of an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related or combined with conventional psycholinguistic theory and their simulatory instantiations (generally, neural network models). Then with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory, and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
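    A minimal sketch of the evidence-accumulation idea applied to lexical selection, assuming a simple noisy race between candidate words (the accumulator dynamics, parameter values, and word activations are illustrative assumptions, not the authors' fitted model): each candidate accrues evidence at a rate set by its activation plus Gaussian noise, and the first accumulator to reach threshold determines the selected word and the response time.

```python
import random

def race_selection(activations, threshold=1.0, noise=0.1, dt=0.01,
                   seed=0, max_t=10.0):
    """Simulate lexical selection as a race of noisy evidence accumulators.
    Each candidate word accumulates evidence at a drift rate given by its
    activation; the first to reach threshold wins. Returns (winner, RT)."""
    rng = random.Random(seed)
    evidence = {word: 0.0 for word in activations}
    t = 0.0
    while t < max_t:
        t += dt
        for word, rate in activations.items():
            # Euler step of a diffusion: drift + scaled Gaussian noise
            evidence[word] += rate * dt + rng.gauss(0.0, noise) * dt ** 0.5
            if evidence[word] >= threshold:
                return word, t
    return max(evidence, key=evidence.get), t  # deadline reached: pick leader

# Hypothetical post-recognition activations for naming a picture of a cat
activations = {"cat": 1.8, "dog": 0.7, "cap": 0.5}
winner, rt = race_selection(activations)
print(winner, round(rt, 2))
```

Fitting the drift rates and threshold of such a model to response-time data is what yields the theoretically informative parameters the article discusses.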

  5. Multilevel selection theory and evidence: a critique of Gardner, 2015.

    Science.gov (United States)

    Goodnight, C J

    2015-09-01

    Gardner (2015) recently developed a model of a 'Genetical Theory of Multilevel Selection', which is a thoughtfully developed but flawed model. The model's flaws appear to be symptomatic of common misunderstandings of the multilevel selection (MLS) literature and the recent quantitative genetic literature. I use Gardner's model as a guide for highlighting how the MLS literature can address the misconceptions found in his model, and in the kin selection literature in general. I discuss research on the efficacy of group selection, the role of indirect genetic effects in affecting the response to selection, and the heritability of group-level traits. I also discuss why the Price multilevel partition should not be used to partition MLS, and why contextual analysis and, by association, direct fitness are appropriate for partitioning MLS. Finally, I discuss conceptual issues around questions concerning the level at which fitness is measured and the units of selection, and I present a brief outline of a model of selection in class-structured populations. I argue that the results derived from the MLS research tradition can inform kin selection research and models, and provide insights that will allow researchers to avoid conceptual flaws such as those seen in the Gardner model. © 2015 European Society For Evolutionary Biology.

  6. Neuroanatomical Evidence in Support of the Bilingual Advantage Theory.

    Science.gov (United States)

    Olulade, O A; Jamal, N I; Koo, D S; Perfetti, C A; LaSasso, C; Eden, G F

    2016-07-01

    The "bilingual advantage" theory stipulates that constant selection and suppression between 2 languages results in enhanced executive control (EC). Behavioral studies of EC in bilinguals have employed wide-ranging tasks and report some conflicting results. To avoid concerns about tasks, we employed a different approach, measuring gray matter volume (GMV) in adult bilinguals, reasoning that any EC-associated benefits should manifest as relatively greater frontal GMV. Indeed, Spanish-English-speaking bilinguals exhibited greater bilateral frontal GMV compared with English-speaking monolinguals. Was this observation attributable to the constant selection and inhibition of 2 spoken languages? To answer this question, we drew on bimodal bilinguals of American Sign Language (ASL) and English who, unlike unimodal bilinguals, can simultaneously use both languages and have been shown not to possess the EC advantage. In this group, there was no greater GMV when compared with monolinguals. Together these results provide neuroanatomical evidence in support of the bilingual advantage theory. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  8. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  9. Do Choice Experiments Generate Reliable Willingness to Pay Estimates Theory and Experimental Evidence

    Science.gov (United States)

    2015-01-01

    In this paper we set up a three-stage experimental and theoretical framework to investigate strategic behaviour and design-induced status quo bias in choice experiments. … of subjects making binary choices between alternative snack foods is consistent with an optimizing model of choice with error. Such choice errors … (Katherine Silz Carson)

  10. Targeting the Real Exchange Rate; Theory and Evidence

    OpenAIRE

    Carlos A. Végh Gramont; Guillermo Calvo; Carmen Reinhart

    1994-01-01

    This paper presents a theoretical and empirical analysis of policies aimed at setting a more depreciated level of the real exchange rate. An intertemporal optimizing model suggests that, in the absence of changes in fiscal policy, a more depreciated level of the real exchange can only be attained temporarily. This can be achieved by means of higher inflation and/or higher real interest rates, depending on the degree of capital mobility. Evidence for Brazil, Chile, and Colombia supports the mo...

  11. Sentiment Prediction Based on Dempster-Shafer Theory of Evidence

    Directory of Open Access Journals (Sweden)

    Mohammad Ehsan Basiri

    2014-01-01

    Full Text Available Sentiment prediction techniques are often used to assign numerical scores to free-text format reviews written by people in online review websites. In order to exploit the fine-grained structural information of textual content, a review may be considered as a collection of sentences, each with its own sentiment orientation and score. In this manner, a score aggregation method is needed to combine sentence-level scores into an overall review rating. While recent work has concentrated on designing effective sentence-level prediction methods, there remains the problem of finding efficient algorithms for score aggregation. In this study, we investigate different aggregation methods, as well as the cases in which they perform poorly. According to the analysis of existing methods, we propose a new score aggregation method based on the Dempster-Shafer theory of evidence. In the proposed method, we first detect the polarity of reviews using a machine learning approach and then, consider sentence scores as evidence for the overall review rating. The results from two public social web datasets show the higher performance of our method in comparison with existing score aggregation methods and state-of-the-art machine learning approaches.
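    The aggregation step described above can be sketched on a binary frame {positive, negative}. This is a hedged illustration, not the paper's algorithm: the mapping from a sentence's (polarity score, confidence) pair to a mass function, and from the combined belief to a star rating, are assumptions made for the example.

```python
def combine(m1, m2):
    """Dempster's rule on the binary frame {pos, neg}. A mass function is a
    triple (m_pos, m_neg, m_unc), with m_unc the mass on the whole frame.
    Assumes the sources are not in total conflict (normalizer > 0)."""
    p1, n1, u1 = m1
    p2, n2, u2 = m2
    k = p1 * n2 + n1 * p2                      # conflicting mass
    norm = 1.0 - k
    p = (p1 * p2 + p1 * u2 + u1 * p2) / norm
    n = (n1 * n2 + n1 * u2 + u1 * n2) / norm
    return p, n, (u1 * u2) / norm

def review_rating(sentence_scores, max_stars=5):
    """Fold sentence-level (positivity score in [0,1], confidence) pairs
    into one mass function, then map belief in 'pos' to a star rating."""
    m = (0.0, 0.0, 1.0)  # start from total ignorance
    for score, conf in sentence_scores:
        m = combine(m, (score * conf, (1.0 - score) * conf, 1.0 - conf))
    p, n, u = m
    # Split the residual uncertainty evenly (pignistic-style transform)
    return round(1 + (max_stars - 1) * (p + 0.5 * u))

# Hypothetical sentence-level predictions: (positivity score, confidence)
sentences = [(0.9, 0.8), (0.7, 0.6), (0.2, 0.3)]
print(review_rating(sentences))  # → 4
```

Treating each sentence as a separate body of evidence lets low-confidence sentences contribute mostly uncertainty rather than dragging the average, which is the motivation the article gives for moving beyond simple score averaging.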

  12. Reconstructing bidimensional scalar field theory models

    International Nuclear Information System (INIS)

    Flores, Gabriel H.; Svaiter, N.F.

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials, the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)

  13. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  14. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  15. Conformal theories, integrable models and coadjoint orbits

    International Nuclear Information System (INIS)

    Aratyn, H.; Ferreira, L.A.; Gomes, J.F.; Zimerman, A.H.

    1991-01-01

    We discuss the Kirillov-Kostant method of coadjoint orbits and its applications to the construction of actions invariant under infinite-dimensional Lie groups. The use of these techniques in the study of integrable models is discussed, with the Toda field theories receiving special attention. As an illustration we derive, using these methods, a geometric WZWN action based on the extended two-loop Kac-Moody algebra. We show that under a Hamiltonian reduction procedure, which respects conformal invariance, we obtain a hierarchy of Toda-type field theories, which contain as submodels the Toda Molecule and periodic Toda Lattice theories. (author)

  16. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm

  17. Graphical Model Theory for Wireless Sensor Networks

    Energy Technology Data Exchange (ETDEWEB)

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
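
    As a minimal illustration of the kind of decentralized computation the junction tree algorithm performs, the sketch below passes messages along a three-node chain (the simplest junction tree), where each step needs only a neighbor's message rather than the full joint distribution. The probability tables are invented for illustration:

    ```python
    import numpy as np

    # Chain A -> B -> C, e.g. three sensors whose readings depend on each other.
    pA = np.array([0.6, 0.4])        # prior over binary variable A
    pBgA = np.array([[0.9, 0.1],     # P(B | A): rows indexed by A
                     [0.2, 0.8]])
    pCgB = np.array([[0.7, 0.3],     # P(C | B): rows indexed by B
                     [0.1, 0.9]])

    # Message passing: the clique {A,B} marginalizes out A and sends P(B)
    # to the clique {B,C}, which then computes the marginal P(C) locally.
    msg_AB = pA @ pBgA               # message: P(B) = sum_A P(A) P(B|A)
    pC = msg_AB @ pCgB               # marginal: P(C) = sum_B P(B) P(C|B)
    ```

    The same two-line pattern generalizes to trees of cliques, which is what makes inference distributable across low-power sensor nodes.
    
    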

  18. The Behavioral Life-Cycle Theory Of Consumer Behavior: Survey Evidence

    OpenAIRE

    Fred Graham; Alan G. Isaac

    2000-01-01

    We find that survey evidence on faculty pay-cycle choice strongly contradicts the neoclassical theory of consumer behavior. It is more favorable to the behavioral life-cycle theory of Shefrin and Thaler (1988).

  19. Multiple stages of learning in perceptual categorization: evidence and neurocomputational theory.

    Science.gov (United States)

    Cantwell, George; Crossley, Matthew J; Ashby, F Gregory

    2015-12-01

    Virtually all current theories of category learning assume that humans learn new categories by gradually forming associations directly between stimuli and responses. In information-integration category-learning tasks, this purported process is thought to depend on procedural learning implemented via dopamine-dependent cortical-striatal synaptic plasticity. This article proposes a new, neurobiologically detailed model of procedural category learning that, unlike previous models, does not assume associations are made directly from stimulus to response. Rather, the traditional stimulus-response (S-R) models are replaced with a two-stage learning process. Multiple streams of evidence (behavioral, as well as anatomical and fMRI) are used as inspiration for the new model, which synthesizes evidence of multiple distinct cortical-striatal loops into a neurocomputational theory. An experiment is reported to test a priori predictions of the new model that: (1) recovery from a full reversal should be easier than learning new categories equated for difficulty, and (2) reversal learning in procedural tasks is mediated within the striatum via dopamine-dependent synaptic plasticity. The results confirm the predictions of the new two-stage model and are incompatible with existing S-R models.

  20. Motives and chances of firm diversification: theory and empirical evidence

    International Nuclear Information System (INIS)

    Briglauer, W.

    2001-11-01

    It is beyond controversy that the majority of the largest companies in industrialized countries pursue, to some extent, product diversification strategies. Building on this finding, this work first deals with alternative theoretical and empirical definitions of corporate diversification. The theoretical part then elaborates an industrial-economics framework for categorizing motives of firm diversification. Despite some inevitable degree of arbitrariness, a relatively comprehensive categorization can be presented, and most explanations of product diversification in the economic literature can be classified appropriately within it. Observing diversification activities, one would prima facie infer a positive relationship between product diversification and firm performance, but both theory and empirical evidence yield ambiguous results. The empirical part provides a list of existing studies, classified according to the theoretical categorization, and filters out and discusses some stylised facts. Most notably, related diversification strategies were found to significantly outperform strategies of unrelated diversification. At the end of the empirical section, econometric methods are applied to agricultural and industrial-economics (telecommunication-market) data sets. For the agricultural studies a significantly positive relationship between product diversification and firm performance was found; in contrast, no significant results were obtained for the telecommunication markets. (author)

  1. Security Theorems via Model Theory

    Directory of Open Access Journals (Sweden)

    Joshua Guttman

    2009-11-01

    Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form. They are quantified implications: for all xs . (phi implies for some ys . psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces for all xs . (phi implies for some ys . psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.

  2. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process, governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  3. Supplier-induced demand: reconsidering the theories and new Australian evidence.

    Science.gov (United States)

    Richardson, Jeffrey R J; Peacock, Stuart J

    2006-01-01

    This paper reconsiders the evidence and several of the key arguments associated with the theory of supplier-induced demand (SID). It proposes a new theory to explain how ethical behaviour is consistent with SID. The purpose of a theory of demand, and one criterion for the evaluation of a theory, is the provision of a plausible explanation for the observed variability in service use. We argue that Australian data are not easily explained by orthodox theory. We also argue that, having revisited the theory of SID, the agency relationship between doctors and patients arises not simply because of asymmetrical information but from an asymmetrical ability and willingness to exercise judgement in the face of uncertainty. It is also argued that the incomplete demand shift that must occur following an increase in the doctor supply is readily explained by the dynamics of market adjustment when market information is incomplete and there is non-collusive professional (and ethical) behaviour by doctors. Empirical evidence of SID from six Australian data sets is presented and discussed. It is argued that these are more easily explained by SID than by conventional demand side variables. We conclude that once the uncertainty of medical decision making and the complexity of medical judgements are taken into account, SID is a more plausible theory of patient and doctor behaviour than the orthodox model of demand and supply. More importantly, SID provides a satisfactory explanation of the observed pattern and change in the demand for Australian medical services, which are not easily explained in the absence of SID.

  4. Theory of chaotic orbital variations confirmed by Cretaceous geological evidence

    Science.gov (United States)

    Ma, Chao; Meyers, Stephen R.; Sageman, Bradley B.

    2017-02-01

    Variations in the Earth’s orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.

  5. A Probabilistic Model of Theory Formation

    Science.gov (United States)

    Kemp, Charles; Tenenbaum, Joshua B.; Niyogi, Sourabh; Griffiths, Thomas L.

    2010-01-01

    Concept learning is challenging in part because the meanings of many concepts depend on their relationships to other concepts. Learning these concepts in isolation can be difficult, but we present a model that discovers entire systems of related concepts. These systems can be viewed as simple theories that specify the concepts that exist in a…

  6. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.) [de

  7. Prospect Theory in the Heterogeneous Agent Model

    Czech Academy of Sciences Publication Activity Database

    Polach, J.; Kukačka, Jiří

    (2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords : Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf

  8. Application of model search to lattice theory.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, M.; Wilkinson, K.; Mathematics and Computer Science

    2001-08-01

    We have used the first-order model-searching programs MACE and SEM to study various problems in lattice theory. First, we present a case study in which the two programs are used to examine the differences between the stages along the way from lattice theory to Boolean algebra. Second, we answer several questions posed by Norman Megill and Mladen Pavicic on ortholattices and orthomodular lattices. The questions from Megill and Pavicic arose in their study of quantum logics, which are being investigated in connection with proposed computing devices based on quantum mechanics. Previous questions of a similar nature were answered by McCune and MACE in [2].

  9. Models and theory for precompound angular distributions

    Energy Technology Data Exchange (ETDEWEB)

    Blann, M.; Pohl, B.A.; Remington, B.A. (Lawrence Livermore National Lab., CA (USA)); Scobel, W.; Trabandt, M. (Hamburg Univ. (Germany, F.R.). 1. Inst. fuer Experimentalphysik); Byrd, R.C. (Los Alamos National Lab., NM (USA)); Foster, C.C. (Indiana Univ. Cyclotron Facility, Bloomington, IN (USA)); Bonetti, R.; Chiesa, C. (Milan Univ. (Italy). Ist. di Fisica Generale Applicata); Grimes, S.M. (Ohio Univ

    1990-06-06

    We compare angular distributions calculated by folding nucleon-nucleon scattering kernels, using the theory of Feshbach, Kerman and Koonin, and the systematics of Kalbach, with a wide range of data. The data range from (n,xn) at 14 MeV incident energy to (p,xn) at 160 MeV incident energy. The FKK theory works well with one adjustable parameter, the depth of the nucleon-nucleon interaction potential. The systematics work well when normalized to the hybrid model single differential cross section prediction. The nucleon-nucleon scattering approach seems inadequate. 9 refs., 10 figs.

  10. Racial Threat Theory: Assessing the Evidence, Requesting Redesign

    Directory of Open Access Journals (Sweden)

    Cindy Brooks Dollar

    2014-01-01

    Full Text Available Racial threat theory was developed as a way to explain how population composition influences discriminatory social control practices and has become one of the most acknowledged frameworks for explaining racial disparity in criminal justice outcomes. This paper provides a thorough review of racial threat theory and empirical assessments of the theory and demonstrates that while scholars often cite inconsistent support for the theory, empirical discrepancies may be due to insufficient attention to the conceptual complexity of racial threat. I organize and present the following review around 4 forms of state-sanctioned control mechanisms: police expenditures, arrests, sentencing, and capital punishment. Arguing that the pervasiveness of racialization in state controls warrants continued inquiry, I provide suggestions for future scholarship that will help us develop enhanced understanding of how racial threat may be operating.

  11. Key Elasticities in Job Search Theory : International Evidence

    OpenAIRE

    Addison, John T.; Centeno, Mário; Portugal, Pedro

    2004-01-01

    This paper exploits the informational value of search theory, after Lancaster and Chesher (1983), in conjunction with survey data on the unemployed to calculate key reservation wage and duration elasticities for most EU-15 nations.

  12. Introducing Evidence Through Research "Push": Using Theory and Qualitative Methods.

    Science.gov (United States)

    Morden, Andrew; Ong, Bie Nio; Brooks, Lauren; Jinks, Clare; Porcheret, Mark; Edwards, John J; Dziedzic, Krysia S

    2015-11-01

    A multitude of factors can influence the uptake and implementation of complex interventions in health care. A plethora of theories and frameworks recognize the need to establish relationships, understand organizational dynamics, address context and contingency, and engage key decision makers. Less attention is paid to how theories that emphasize relational contexts can actually be deployed to guide the implementation of an intervention. The purpose of the article is to demonstrate the potential role of qualitative research aligned with theory to inform complex interventions. We detail a study underpinned by theory and qualitative research that (a) ensured key actors made sense of the complex intervention at the earliest stage of adoption and (b) aided initial engagement with the intervention. We conclude that using theoretical approaches aligned with qualitative research can provide insights into the context and dynamics of health care settings that in turn can be used to aid intervention implementation. © The Author(s) 2015.

  13. Vertical Integration of Hospitals and Physicians: Economic Theory and Empirical Evidence on Spending and Quality.

    Science.gov (United States)

    Post, Brady; Buchmueller, Tom; Ryan, Andrew M

    2017-08-01

    Hospital-physician vertical integration is on the rise. While increased efficiencies may be possible, emerging research raises concerns about anticompetitive behavior, spending increases, and uncertain effects on quality. In this review, we bring together several of the key theories of vertical integration that exist in the neoclassical and institutional economics literatures and apply these theories to the hospital-physician relationship. We also conduct a literature review of the effects of vertical integration on prices, spending, and quality in the growing body of evidence (n = 15) to evaluate which of these frameworks have the strongest empirical support. We find some support for vertical foreclosure as a framework for explaining the observed results. We suggest a conceptual model and identify directions for future research. Based on our analysis, we conclude that vertical integration poses a threat to the affordability of health services and merits special attention from policymakers and antitrust authorities.

  14. Predictive modelling of evidence informed teaching

    OpenAIRE

    Zhang, Dell; Brown, C.

    2017-01-01

    In this paper, we analyse the questionnaire survey data collected from 79 English primary schools about the situation of evidence informed teaching, where the evidence could come from research journals or conferences. Specifically, we build a predictive model to see what external factors could help to close the gap between teachers’ belief and behaviour in evidence informed teaching, which is the first of its kind to our knowledge. The major challenge, from the data mining perspective, is th...

  15. The Body Model Theory of Somatosensory Cortex.

    Science.gov (United States)

    Brecht, Michael

    2017-06-07

    I outline a microcircuit theory of somatosensory cortex as a body model serving both for body representation and "body simulation." A modular model of innervated and non-innervated body parts resides in somatosensory cortical layer 4. This body model is continuously updated and compares to an avatar (an animatable puppet) rather than a mere sensory map. Superficial layers provide context and store sensory memories, whereas layer 5 provides motor output and stores motor memories. I predict that layer-6-to-layer-4 inputs initiate body simulations allowing rehearsal and risk assessment of difficult actions, such as jumps. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Effective field theory and the quark model

    International Nuclear Information System (INIS)

    Durand, Loyal; Ha, Phuoc; Jaczko, Gregory

    2001-01-01

    We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections

  17. A Control Theory Model of Smoking.

    Science.gov (United States)

    Bobashev, Georgiy; Holloway, John; Solano, Eric; Gutkin, Boris

    2017-06-01

    We present a heuristic control theory model that describes smoking under restricted and unrestricted access to cigarettes. The model is based on the allostasis theory and uses a formal representation of a multiscale opponent process. The model simulates smoking behavior of an individual and produces both short-term ("loading up" after not smoking for a while) and long-term smoking patterns (e.g., gradual transition from a few cigarettes to one pack a day). By introducing a formal representation of withdrawal- and craving-like processes, the model produces gradual increases over time in withdrawal- and craving-like signals associated with abstinence and shows that after 3 months of abstinence, craving disappears. The model was programmed as a computer application allowing users to select simulation scenarios. The application links images of brain regions that are activated during the binge/intoxication, withdrawal, or craving with corresponding simulated states. The model was calibrated to represent smoking patterns described in peer-reviewed literature; however, it is generic enough to be adapted to other drugs, including cocaine and opioids. Although the model does not mechanistically describe specific neurobiological processes, it can be useful in prevention and treatment practices as an illustration of drug-using behaviors and expected dynamics of withdrawal and craving during abstinence.

  18. Topos models for physics and topos theory

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, Sander, E-mail: s.wolters@math.ru.nl [Radboud Universiteit Nijmegen, Institute for Mathematics, Astrophysics, and Particle Physics (Netherlands)

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  19. Cognitive Attachment Model of Voices: Evidence Base and Future Implications.

    Science.gov (United States)

    Berry, Katherine; Varese, Filippo; Bucci, Sandra

    2017-01-01

    There is a robust association between hearing voices and exposure to traumatic events. Identifying mediating mechanisms for this relationship is key to theories of voice hearing and the development of therapies for distressing voices. This paper outlines the Cognitive Attachment model of Voices (CAV), a theoretical model to understand the relationship between earlier interpersonal trauma and distressing voice hearing. The model builds on attachment theory and well-established cognitive models of voices and argues that attachment and dissociative processes are key psychological mechanisms that explain how trauma influences voice hearing. Following the presentation of the model, the paper will review the current state of evidence regarding the proposed mechanisms of vulnerability to voice hearing and maintenance of voice-related distress. This review will include evidence from studies supporting associations between dissociation and voices, followed by details of our own research supporting the role of dissociation in mediating the relationship between trauma and voices and evidence supporting the role of adult attachment in influencing beliefs and relationships that voice hearers can develop with voices. The paper concludes by outlining the key questions that future research needs to address to fully test the model and the clinical implications that arise from the work.

  20. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  1. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    Science.gov (United States)

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
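
    The subset comparison described above (weighted mean effect sizes of interventions matching vs. not matching the recommendations) can be sketched as follows. The effect sizes and variances are hypothetical, and fixed-effect inverse-variance weighting is assumed; the study itself does not specify these values:

    ```python
    # Each study is a (effect_size, variance) pair; values are invented.
    matching = [(0.45, 0.02), (0.30, 0.05), (0.55, 0.04)]   # match recommendations
    non_matching = [(0.20, 0.03), (0.10, 0.06)]             # do not match

    def weighted_mean_effect(studies):
        """Fixed-effect weighted mean: weight each study by 1/variance."""
        weights = [1.0 / v for _, v in studies]
        return sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)

    # A positive gap would suggest the recommendation-matching subset
    # achieves larger effects, as in the slight difference the study reports.
    gap = weighted_mean_effect(matching) - weighted_mean_effect(non_matching)
    ```
    
    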

  2. Sparse modeling theory, algorithms, and applications

    CERN Document Server

    Rish, Irina

    2014-01-01

    ""A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come.""-Ricardo Vilalta, Department of Computer Science, University of Houston""This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course.""-Francis Bach, INRIA - École Normale Supřieure, Paris

  3. On the Role of Theory and Evidence in Macroeconomics

    DEFF Research Database (Denmark)

    Juselius, Katarina

    This paper, which was prepared for the Inaugural Conference of the Institute for New Economic Thinking at King's College, Cambridge, 8-11 April 2010, questions the preeminence of theory over empirics in economics and argues that empirical econometrics needs to be given a more important and independ...

  4. Taxation of petroleum products: theory and empirical evidence

    International Nuclear Information System (INIS)

    Gupta, S.; Mahler, W.

    1995-01-01

    The domestic taxation of petroleum products is an important source of revenue in most countries. However, tax rates on petroleum products vary widely across countries, which cannot be explained by economic theory alone. This paper surveys the different considerations advanced for taxing petroleum and presents petroleum tax rate data for 120 countries. (author)

  5. Condition Evaluation of Storage Equipment Based on Improved D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Zhang Xiao-yu

    2017-01-01

    Assessment and prediction of storage equipment condition has always been a difficult aspect of PHM technology. Current condition evaluation of equipment lacks defined state grades, and a single test reading cannot reflect changes in the equipment's state. To solve this problem, this paper proposes an evaluation method based on an improved D-S evidence theory. First, the analytic hierarchy process (AHP) is used to establish a hierarchical structure model of the equipment, and the qualified state is divided into four grades. The test data are then compared with the previous test value, the historical test mean, and the standard value, and a triangular fuzzy function is used to calculate the membership degree of each index. This is combined with D-S evidence theory to fuse information from multiple sources and achieve real-time state assessment of such equipment. Finally, the model is applied to a servo mechanism. The result shows that this method performs well in condition evaluation for storage equipment.
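
The triangular-membership step described in this record can be sketched as follows. The four grade supports on a normalized [0, 1] indicator are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of triangular fuzzy membership over four condition grades.
# Grade supports are assumed (half-overlapping, so neighbours sum to 1).

def triangular(x, a, b, c):
    """Triangular membership function: rises from a to the peak b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

GRADES = {
    "I":   (-1/3, 0.0, 1/3),   # best condition, peak at 0
    "II":  (0.0, 1/3, 2/3),
    "III": (1/3, 2/3, 1.0),
    "IV":  (2/3, 1.0, 4/3),    # worst condition, peak at 1
}

def membership_vector(x):
    """Membership degree of a normalized indicator value x in each grade."""
    return {g: triangular(x, a, b, c) for g, (a, b, c) in GRADES.items()}
```

Each information source's membership vector can then be normalized into a basic probability assignment before D-S fusion.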

  6. Assessing landslide susceptibility by applying fuzzy sets, possibility evidence-based theories

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-01-01

    A landslide susceptibility model was developed for the city of Manizales, Colombia; landslides have been the city's main environmental problem. Fuzzy sets and possibility and evidence-based theories were used to construct the model due to the set of circumstances and uncertainty involved in the modelling; the uncertainty particularly concerned the lack of representative data and the need for systematically coordinating subjective information. Susceptibility and the uncertainty were estimated via data processing; the model contained data concerning mass vulnerability and uncertainty. Output data were expressed on a map defined by linguistic categories or uncertain labels as having low, medium, high and very high susceptibility; this was considered appropriate for representing susceptibility. A fuzzy spectrum was developed for classifying susceptibility levels according to perception and expert opinion. The model showed levels of susceptibility in the study area ranging from low to high (medium susceptibility being the most frequent). This article details the systematic data processing, presenting theories and tools for handling uncertainty. The concept of fuzzy parameters is introduced; this is useful in modelling phenomena involving uncertainty, complexity and nonlinear behaviour, showing that susceptibility modelling can be feasible. The paper also shows the great convenience of incorporating uncertainty into modelling and decision-making. However, quantifying susceptibility is not suitable when modelling identified uncertainty, because model output information cannot be reduced to exact numerical quantities when the nature of the variables is particularly uncertain. The latter concept is applicable to risk assessment.

  7. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in insolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation process for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies

  9. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 1: Building a Foundation

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-06-01

    Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. To ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and the constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice. Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of

  12. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of placing preventive fissure sealants.

    Science.gov (United States)

    Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne

    2010-04-08

    Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour.
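
The theory-level analyses in this record report variance explained (R^2) from multiple regression. A minimal sketch of that computation, on randomly generated stand-in data rather than the GDP questionnaire responses, is:

```python
# Hedged sketch: OLS fit of behavioural intention on two theory constructs,
# reporting variance explained (R^2). Data are random stand-ins.
import numpy as np

def r_squared(X, y):
    """Proportion of variance in y explained by an OLS fit on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    residuals = y - X1 @ beta
    return 1.0 - float(residuals @ residuals) / float(((y - y.mean()) ** 2).sum())

rng = np.random.default_rng(0)
habit = rng.normal(size=200)       # stand-in for an OLT construct
attitude = rng.normal(size=200)    # stand-in for a TPB construct
intention = 0.8 * habit + 0.4 * attitude + rng.normal(scale=0.5, size=200)

X = np.column_stack([habit, attitude])
print(f"variance explained: {r_squared(X, intention):.0%}")
```
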

  13. σ-models and string theories

    International Nuclear Information System (INIS)

    Randjbar-Daemi, S.

    1987-01-01

    The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix

  14. Strategic behavior and marriage payments: theory and evidence from Senegal.

    Science.gov (United States)

    Gaspart, Frederic; Platteau, Jean-Philippe

    2010-01-01

    This article proposes an original theory of marriage payments based on insights gained from firsthand information collected in the Senegal River valley. This theory postulates that decisions about the bride-price, which are made by the bride's father, take into account the likely effects of the amount set on the risk of ill-treatment of the wife and the risk of marriage failure. Based on a sequential game with three players (the bride's father, the husband, and the wife) and a matching process, it leads to a number of important predictions that are tested against Senegalese data relating to bride-prices and various characteristics of women. The empirical results confirm that parents behave strategically by keeping bride-prices down so as to reduce the risk of marriage failure for their daughters. Other interesting effects on marriage payments and the probability of separation are also highlighted, stressing the role of the bride's bargaining power in her own family.

  15. Engaging patients in primary care practice transformation: theory, evidence and practice.

    Science.gov (United States)

    Sharma, Anjana E; Grumbach, Kevin

    2017-06-01

    Patient engagement is a fundamental strategy for achieving patient centred care and is receiving increasing attention in primary care reform efforts such as the patient-centred medical home and related care models. Much of the prior published theory and evidence supporting patient engagement has focused on improving engagement in individual care. Much less is understood about engaging patients as partners in practice improvement at the primary care clinic or practice level. We review the historical and policy context for the growing interest in the USA and UK in patient engagement at the primary care practice level, highlight findings from systematic reviews of the research evidence on practice-level patient engagement and discuss practical considerations for implementing patient engagement. We conclude that while there are persuasive ethical and social justice reasons for empowering patient involvement in practice improvement at the clinic level, research conducted to date in primary care provides suggestive but not yet resounding evidence in support of the instrumental triple aim benefit of practice-level patient engagement. We propose a research agenda to better understand the process and outcomes of practice-level patient engagement and its potential advantages to both the practice and the patients and communities served. Better evidence as well as resources to support and incentivize effective and feasible engagement methods are needed to catalyse greater diffusion of practice-level patient engagement in primary care practices. © The Author 2016. Published by Oxford University Press.

  16. Blue Ocean versus Competitive Strategy: Theory and Evidence

    NARCIS (Netherlands)

    A.E. Burke (Andrew); A.J. van Stel (André); A.R. Thurik (Roy)

    2009-01-01

    textabstractBlue ocean strategy seeks to turn strategic management on its head by replacing ‘competitive advantage’ with ‘value innovation’ as the primary goal where firms must create consumer demand and exploit untapped markets. Empirical analysis has been focused on case study evidence and so

  17. Queuing theory models for computer networks

    Science.gov (United States)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models, which can model the average response of a network of computers to a given traffic load, has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them, because the models do not require fine detail about the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
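
A minimal example in the spirit of these spreadsheet models: the mean response time of an M/M/1 queue, the simplest queuing model for a single channel. The message rates below are illustrative, not figures from the report.

```python
# Hedged sketch: M/M/1 mean response time, a plausible single-cell formula
# for the kind of spreadsheet model the record describes.

def mm1_response_time(arrival_rate, service_rate):
    """Mean time in system T = 1 / (mu - lambda); requires lambda < mu."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must stay below service rate")
    return 1.0 / (service_rate - arrival_rate)

# A channel serving 100 messages/s: doubling offered traffic from 30 to 60
# messages/s raises the mean response time from 1/70 s to 1/40 s.
t_light = mm1_response_time(30.0, 100.0)
t_heavy = mm1_response_time(60.0, 100.0)
```

The nonlinearity near saturation (response time diverging as the arrival rate approaches the service rate) is what makes even such coarse models useful for capacity planning.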

  18. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.

  19. Economic contract theory tests models of mutualism.

    Science.gov (United States)

    Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E

    2010-09-07

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.

  20. Improvement of DS Evidence Theory for Multi-Sensor Conflicting Information

    Directory of Open Access Journals (Sweden)

    Fang Ye

    2017-05-01

    A new DS (Dempster-Shafer) combination method is presented in this paper. As data detected by a single sensor are characterized not only by fuzziness but also by partial reliability, the development of multi-sensor information fusion has become indispensable. DS evidence theory is an effective means of information fusion: it can not only deal with the uncertainty and inconsistency of multi-sensor data, but also handle the inevitable ambiguity and instability under noise or possible interference. However, the application of DS evidence theory has limitations when multi-sensor data are conflicting. To address this issue, DS evidence theory is modified in this paper. Adopting the idea of cluster analysis, we first introduce the Lance distance function and the spectral angle cosine function to separately revise the original evidence before combination. Then, based on these modifications of the original evidence, an improved conflict redistribution strategy is proposed to fuse multi-sensor information. Finally, numerical simulation analyses demonstrate that the improved DS evidence theory presented in this paper overcomes the limitations of conventional DS evidence theory and achieves more reliable fusion of conflicting multi-sensor information than existing methods.
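
The conflict problem this record addresses shows up clearly in a two-sensor example of Dempster's classical rule of combination. The mass values below are the textbook highly conflicting case (in the style of Zadeh's counterexample), not data from the paper.

```python
# Dempster's classical rule of combination over frozenset focal elements,
# illustrating the counterintuitive result under high conflict.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions; returns (fused masses, conflict K)."""
    combined = {}
    conflict = 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    norm = 1.0 - conflict
    return {A: v / norm for A, v in combined.items()}, conflict

m1 = {frozenset({"a"}): 0.99, frozenset({"b"}): 0.01}
m2 = {frozenset({"c"}): 0.99, frozenset({"b"}): 0.01}
fused, conflict = dempster_combine(m1, m2)
# Although both sensors assign "b" only mass 0.01, normalization by the tiny
# factor 1 - K hands "b" all the fused mass -- the behaviour that motivates
# evidence revision and conflict redistribution strategies like the one above.
```
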

  1. Linear sigma model for multiflavor gauge theories

    Science.gov (United States)

    Meurice, Y.

    2017-12-01

    We consider a linear sigma model describing 2N_f^2 bosons (σ, a_0, η' and π) as an approximate effective theory for an SU(3) local gauge theory with N_f Dirac fermions in the fundamental representation. The model has a renormalizable U(N_f)_L ⊗ U(N_f)_R invariant part, which has an approximate O(2N_f^2) symmetry, and two additional terms, one describing the effects of an SU(N_f)_V invariant mass term and the other the effects of the axial anomaly. We calculate the spectrum for arbitrary N_f. Using preliminary and published lattice results from the LatKMI collaboration, we found combinations of the masses that vary slowly with the explicit chiral symmetry breaking and N_f. This suggests that the anomaly term plays a leading role in the mass spectrum and that simple formulas such as M_σ^2 ≃ (2/N_f - C_σ) M_{η'}^2 should apply in the chiral limit. Lattice measurements of M_{η'}^2 and of approximate constants such as C_σ could help in locating the boundary of the conformal window. We show that our calculation can be adapted for arbitrary representations of the gauge group and in particular to the minimal model with two sextets, where similar patterns are likely to apply.

  2. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We demonstrated that the model can avoid all the phenomenological constraints while avoiding excessive fine-tuning, and we studied its implications for dark matter physics and collider physics. I proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories, and a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons, and a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We also studied whether we can find evidence of the multiverse.

  3. A matrix model from string field theory

    Directory of Open Access Journals (Sweden)

    Syoji Zeze

    2016-09-01

    We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N, and an increase of the potential height is observed in both cases. The large-N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.

  4. Evidence uncovered: long-term interest rates, monetary policy, and the expectations theory

    OpenAIRE

    Jennifer E. Roush

    2001-01-01

    A large body of literature has failed to find conclusive evidence that the expectations theory of the term structure holds in U.S. data. This paper asks more narrowly whether the theory holds conditional on an exogenous change in monetary policy. We argue that previous work on the expectations theory has failed to sufficiently account for interactions between monetary policy and bond markets in the determination of long and short interest rates. Using methods that directly account for this int...

  5. Brief Report: An Independent Replication and Extension of Psychometric Evidence Supporting the Theory of Mind Inventory

    Science.gov (United States)

    Greenslade, Kathryn J.; Coggins, Truman E.

    2016-01-01

    This study presents an independent replication and extension of psychometric evidence supporting the "Theory of Mind Inventory" ("ToMI"). Parents of 20 children with ASD (ages 4;1 to 6;7, in years;months) and 20 with typical development (ages 3;1 to 6;5) rated their child's theory of mind abilities in everyday situations. Other parent report…

  6. The epigenetic side of human adaptation: hypotheses, evidences and theories.

    Science.gov (United States)

    Giuliani, Cristina; Bacalini, Maria Giulia; Sazzini, Marco; Pirazzini, Chiara; Franceschi, Claudio; Garagnani, Paolo; Luiselli, Donata

    2015-01-01

    Epigenetics represents a still unexplored research field in the understanding of micro- and macro-evolutionary mechanisms, as epigenetic changes create phenotypic diversity within both individuals and populations. The purpose of this review is to dissect the landscape of studies focused on DNA methylation, one of the most described epigenetic mechanisms, emphasizing the aspects that could be relevant in human adaptations. The theories and results considered here were collected from the most recently published papers. The question of DNA methylation inheritance is described, as well as recent evolutionary theories regarding the role of DNA methylation-and epigenetics in a broader sense-in human evolution. The complex relations between (1) DNA methylation and genetic variability and (2) DNA methylation and the environmental stimuli crucial in shaping genetic and phenotypic variability through the human lineage-such as diet, climate and pathogens exposure-are described. Papers about population epigenetics are also illustrated due to their high relevance in this context. Genetic, epigenetic and phenotypic variations of the species, together with cultural ones, are considerably shaped by a vast range of environmental stimuli, thus representing the foundation of all human bio-cultural adaptations.

  7. Applying psychological theories to evidence-based clinical practice: Identifying factors predictive of managing upper respiratory tract infections without antibiotics

    Directory of Open Access Journals (Sweden)

    Glidewell Elizabeth

    2007-08-01

    Full Text Available Abstract Background: Psychological models can be used to understand and predict behaviour in a wide range of settings. However, they have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. The aim of this study was to explore the usefulness of a range of psychological theories to predict health professional behaviour relating to management of upper respiratory tract infections (URTIs) without antibiotics. Methods: Psychological measures were collected by postal questionnaire survey from a random sample of general practitioners (GPs) in Scotland. The outcome measures were clinical behaviour (using antibiotic prescription rates as a proxy indicator), behavioural simulation (scenario-based decisions to manage URTI with or without antibiotics) and behavioural intention (general intention to manage URTI without antibiotics). Explanatory variables were the constructs within the following theories: Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model (SM), and knowledge (a non-theoretical construct). For each outcome measure, multiple regression analysis was used to examine the predictive value of each theoretical model individually. Following this 'theory level' analysis, a 'cross theory' analysis was conducted to investigate the combined predictive value of all significant individual constructs across theories. Results: All theories were tested, but only significant results are presented. When predicting behaviour, at the theory level, OLT explained 6% of the variance and, in a cross theory analysis, OLT 'evidence of habitual behaviour' also explained 6%. When predicting behavioural simulation, at the theory level, the proportion of variance explained was: TPB, 31%; SCT, 26%; II, 6%; OLT, 24%. 
GPs who reported having already decided to change their management to

  8. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of managing upper respiratory tract infections without antibiotics.

    Science.gov (United States)

    Eccles, Martin P; Grimshaw, Jeremy M; Johnston, Marie; Steen, Nick; Pitts, Nigel B; Thomas, Ruth; Glidewell, Elizabeth; Maclennan, Graeme; Bonetti, Debbie; Walker, Anne

    2007-08-03

    Psychological models can be used to understand and predict behaviour in a wide range of settings. However, they have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. The aim of this study was to explore the usefulness of a range of psychological theories to predict health professional behaviour relating to management of upper respiratory tract infections (URTIs) without antibiotics. Psychological measures were collected by postal questionnaire survey from a random sample of general practitioners (GPs) in Scotland. The outcome measures were clinical behaviour (using antibiotic prescription rates as a proxy indicator), behavioural simulation (scenario-based decisions to manage URTI with or without antibiotics) and behavioural intention (general intention to manage URTI without antibiotics). Explanatory variables were the constructs within the following theories: Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model (SM), and knowledge (a non-theoretical construct). For each outcome measure, multiple regression analysis was used to examine the predictive value of each theoretical model individually. Following this 'theory level' analysis, a 'cross theory' analysis was conducted to investigate the combined predictive value of all significant individual constructs across theories. All theories were tested, but only significant results are presented. When predicting behaviour, at the theory level, OLT explained 6% of the variance and, in a cross theory analysis, OLT 'evidence of habitual behaviour' also explained 6%. When predicting behavioural simulation, at the theory level, the proportion of variance explained was: TPB, 31%; SCT, 26%; II, 6%; OLT, 24%. GPs who reported having already decided to change their management to try to avoid the use of antibiotics made

  9. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort - an expensive, and likely unsustainable, public policy.
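The Data Rate Theorem itself is not stated in the abstract; in its simplest textbook form (a standard result, not the paper's generalisation), a linear system x_{t+1} = A x_t + B u_t can be stabilised over a feedback channel carrying R bits per time step only if

```latex
R \;>\; \sum_{i\,:\,\lvert\lambda_i(A)\rvert \ge 1} \log_2 \lvert \lambda_i(A) \rvert ,
```

that is, the controller must receive at least as much information per step as the unstable eigenvalues of A generate; below this rate, no coding or control policy can keep the state bounded.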

  10. Exporting under trade policy uncertainty: Theory and evidence

    OpenAIRE

    Handley, Kyle

    2011-01-01

    Policy commitment and credibility are important for inducing agents to make costly, irreversible investments. Policy uncertainty can delay investment and reduce the response to policy change. I provide theoretical and novel quantitative evidence for these effects by focusing on trade policy, a ubiquitous but often overlooked source of uncertainty, when a firm's cost of export market entry is sunk. While an explicit purpose of the World Trade Organization (WTO) is to secure long term market ac...

  11. Women in the Workplace and Management Practices: Theory and Evidence

    OpenAIRE

    Kato, Takao; Kodama, Naomi

    2017-01-01

    We review recent studies on management practices and their consequences for women in the workplace. First, the High Performance Work System (HPWS) is associated with greater gender diversity in the workplace while there is little evidence that the HPWS reduces the gender pay gap. Second, work-life balance practices with limited face-to-face interactions with coworkers may hamper women’s career advancement. Third, individual incentive linking pay to objective performance may enhance gender div...

  12. Global Sourcing of Heterogeneous Firms: Theory and Evidence

    DEFF Research Database (Denmark)

    Kohler, Wilhelm; Smolka, Marcel

    2015-01-01

    The share of international trade within firm boundaries varies greatly across countries. This column presents new evidence on how the productivity of a firm affects the choice between vertical integration and outsourcing, as well as between foreign and domestic sourcing. The productivity effects...... found in Spanish firm-level data suggest that contractual imperfections distort the sourcing of inputs in the global economy, and that firm boundaries emerge in response to mitigate this distortion....

  13. Entrepreneurship and economic development: Theory, evidence and policy

    OpenAIRE

    Naudé, Wim

    2012-01-01

    This paper provides an overview of the state of the art of the intersection of development economics and entrepreneurship. Given the relative neglect of entrepreneurship by development scholars it deals with (i) recent theoretical insights from the intersection of entrepreneurship and development studies; (ii) the empirical evidence on the relationship between entrepreneurship and development; and (iii) fresh insights for entrepreneurship policy for development that emerges from recent advanc...

  14. Chern-Simons Theory, Matrix Models, and Topological Strings

    International Nuclear Information System (INIS)

    Walcher, J

    2006-01-01

    This book is a find. Marino meets the challenge of filling, in fewer than 200 pages, the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). 
While this presentation is necessarily rather condensed (and the beginner may

  15. An Econometric Validation of Malthusian Theory: Evidence in Nigeria

    Directory of Open Access Journals (Sweden)

    Musa Abdullahi Sakanko

    2018-01-01

    Full Text Available A rising population is an asset, provided the skills of the workforce are used to the maximum extent; if not appropriately channelled, it can be a liability for a nation. A skilled and hardworking population can emerge as a foundation for a country's development. This study examines the validity of Malthusian Theory in Nigeria using time series data from 1960 to 2016, employing the ARDL bounds testing technique. The result shows that in the long run, population growth and food production move proportionately, while in the short run population growth has a depleting effect on food production, thus validating the Malthusian hypothesis for the Nigerian economy in the short run. The researcher recommends that the government intensify family planning and birth-control measures, make western education compulsory, and revitalize the agricultural sector. DOI: 10.150408/sjie.v7i1.6461

  16. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. 
In some cases, these fluctuations converge to one value; in other cases, they diverge in
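The dissertation's moral-behaviour equations are not given in the abstract; the convergence-versus-divergence behaviour it describes can be illustrated with the simplest nonlinear feedback map, the logistic map x_{n+1} = r·x_n·(1 − x_n), where the feedback parameter r (a stand-in for the individual, family and community factors) determines whether iterates settle to one value or fluctuate chaotically:

```python
def iterate_logistic(r, x0=0.5, n=1000, keep=8):
    """Iterate x <- r*x*(1-x) n times; return the last `keep` values."""
    x = x0
    trajectory = []
    for _ in range(n):
        x = r * x * (1 - x)
        trajectory.append(x)
    return trajectory[-keep:]

# r = 2.8: iterates converge to the fixed point 1 - 1/r
settled = iterate_logistic(2.8)
# r = 3.9: iterates never settle; successive values fluctuate chaotically
chaotic = iterate_logistic(3.9)
```

The same equation thus exhibits both regimes the dissertation attributes to moral-behaviour dynamics, purely as a function of the feedback strength.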

  17. Propagation of solar disturbances - Theories and models

    Science.gov (United States)

    Wu, S. T.

    1983-01-01

    Recent theoretical developments and construction of several models for the propagation of solar disturbances from the sun and their continuation throughout heliospheric space are discussed. Emphasis centers on physical mechanisms as well as mathematical techniques (i.e., analytical and numerical methods). This outline will lead to a discussion of the state-of-the-art of theoretically based modeling efforts in this area. It is shown that the fundamental theory for the study of propagation of disturbances in heliospheric space is centered around the self-consistent analysis of wave and mass motion within the context of magnetohydrodynamics in which the small scale structures will be modified by kinetic effects. Finally, brief mention is made of some interesting problems for which attention is needed for advancement of the understanding of the physics of large scale propagation of solar disturbances in heliospheric space.

  18. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  19. Applying psychological theory to evidence-based clinical practice: identifying factors predictive of taking intra-oral radiographs.

    Science.gov (United States)

    Bonetti, Debbie; Pitts, Nigel B; Eccles, Martin; Grimshaw, Jeremy; Johnston, Marie; Steen, Nick; Glidewell, Liz; Thomas, Ruth; Maclennan, Graeme; Clarkson, Jan E; Walker, Anne

    2006-10-01

    This study applies psychological theory to the implementation of evidence-based clinical practice. The first objective was to see if variables from psychological frameworks (developed to understand, predict and influence behaviour) could predict an evidence-based clinical behaviour. The second objective was to develop a scientific rationale to design or choose an implementation intervention. Variables from the Theory of Planned Behaviour, Social Cognitive Theory, Self-Regulation Model, Operant Conditioning, Implementation Intentions and the Precaution Adoption Process were measured, with data collection by postal survey. The primary outcome was the number of intra-oral radiographs taken per course of treatment, collected from a central fee claims database. Participants were 214 Scottish General Dental Practitioners. At the theory level, the Theory of Planned Behaviour explained 13% of the variance in the number of radiographs taken, Social Cognitive Theory explained 7%, Operant Conditioning explained 8%, and Implementation Intentions explained 11%. Self-Regulation and Stage Theory did not predict significant variance in radiographs taken. Perceived behavioural control, action planning and risk perception together explained 16% of the variance in number of radiographs taken. Knowledge did not predict the number of radiographs taken. The results suggest an intervention targeting predictive psychological variables could increase the implementation of this evidence-based practice, while influencing knowledge is unlikely to do so. Measures which predicted number of radiographs taken also predicted intention to take radiographs, and intention accounted for significant variance in behaviour (adjusted R(2)=5%: F(1,166)=10.28, p < .05). This theory-based approach enabled the creation of a methodology that can be replicated for identifying factors predictive of clinical behaviour and for the design and choice of interventions to modify practice as new evidence emerges.
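The two-stage design used in this and the related records (per-theory regressions at the "theory level", then a pooled "cross theory" regression) can be sketched on synthetic data; the construct names, effect sizes, and sample size below are all hypothetical illustrations, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical construct scores for two frameworks
tpb = rng.normal(size=(n, 3))   # e.g. attitude, subjective norm, perceived control
olt = rng.normal(size=(n, 1))   # e.g. evidence of habitual behaviour
behaviour = 0.8 * tpb[:, 0] + 0.5 * olt[:, 0] + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# 'Theory level': each framework's constructs entered alone
r2_tpb = r_squared(tpb, behaviour)
r2_olt = r_squared(olt, behaviour)
# 'Cross theory': constructs pooled across frameworks
r2_cross = r_squared(np.column_stack([tpb, olt]), behaviour)
```

Because the cross-theory model nests each theory-level model, its R² can only equal or exceed theirs, which is why the cross-theory step asks the sharper question of what each construct adds over the others.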

  20. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  1. International Migration with Heterogeneous Agents: Theory and Evidence

    DEFF Research Database (Denmark)

    Schröder, Philipp J.H.; Brücker, Herbert

    Two puzzling facts of international migration are that only a small share of a sending country's population emigrates and that net migration rates tend to cease over time. This paper addresses these issues in a migration model with heterogeneous agents that features temporary migration....... In equilibrium a positive relation exists between the stock of migrants and the income differential, while the net migration flow becomes zero. Consequently, empirical migration models, estimating net migration flows instead of stocks, may be misspecified. This suspicion appears to be confirmed by our empirical...... investigation of cointegration relationships of flow and stock migration models....

  2. How Often Is the Misfit of Item Response Theory Models Practically Significant?

    Science.gov (United States)

    Sinharay, Sandip; Haberman, Shelby J.

    2014-01-01

    Standard 3.9 of the Standards for Educational and Psychological Testing (1999) demands evidence of model fit when item response theory (IRT) models are applied to test data. Hambleton and Han (2005) and Sinharay (2005) recommended the assessment of the practical significance of misfit of IRT models, but…

  3. Statistical polarization in greenhouse gas emissions: Theory and evidence.

    Science.gov (United States)

    Remuzgo, Lorena; Trueba, Carmen

    2017-11-01

    The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in the statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed studying the statistical polarization concept from a multivariate view. In this paper, we apply this approach to study the evolution of such phenomenon in the global distribution of the main GHGs. The empirical analysis has been carried out for the time period 1990-2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed.
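The multivariate Gigliarano-Mosler indices are not reproduced here, but the interregional/intraregional split they build on can be illustrated with the univariate Theil-T decomposition, where total inequality separates exactly into a between-group and a within-group component (the group values below are hypothetical):

```python
import math

def theil(values):
    """Theil-T inequality index: (1/n) * sum((y/mu) * ln(y/mu))."""
    mu = sum(values) / len(values)
    return sum((v / mu) * math.log(v / mu) for v in values) / len(values)

def theil_decomposition(groups):
    """Split total Theil-T into between-group + within-group components.

    The two components sum exactly to theil() of the pooled data."""
    all_vals = [v for g in groups for v in g]
    n = len(all_vals)
    mu = sum(all_vals) / n
    between = within = 0.0
    for g in groups:
        share, gmu = len(g) / n, sum(g) / len(g)
        between += share * (gmu / mu) * math.log(gmu / mu)
        within += share * (gmu / mu) * theil(g)
    return between, within

# Two hypothetical "regions": identical inside, very different between --
# a polarized configuration, so inequality is entirely between-group.
between, within = theil_decomposition([[1.0, 1.0], [3.0, 3.0]])
```

Polarization measures of this family rise as the between-group component grows relative to the within-group one, i.e. as groups become internally homogeneous and mutually distant.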

  4. The theories underpinning rational emotive behaviour therapy: where's the supportive evidence?

    Science.gov (United States)

    MacInnes, Douglas

    2004-08-01

    This paper examines the underlying theoretical philosophy of one of the most widely used cognitive behaviour therapies, rational emotive behaviour therapy. It examines whether two central theoretical principles are supported by research evidence: firstly, that irrational beliefs lead to dysfunctional emotions and inferences and that rational beliefs lead to functional emotions and inferences and, secondly, that demand beliefs are the primary core irrational belief. The established criteria for evaluating the efficacy of the theories are detailed and used to evaluate the strength of evidence supporting these two assumptions. The findings indicate there is limited evidence to support these theories.

  5. Multi-Sensor Building Fire Alarm System with Information Fusion Technology Based on D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Qian Ding

    2014-10-01

    Full Text Available Multi-sensor information fusion technology based on Dempster-Shafer evidence theory is applied in a building fire alarm system to realize early detection and alarming. By using multiple sensors to monitor the parameters of the fire process, such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. The D-S evidence theory is then applied to fuse the information from the multiple sensors with a specific fire model, making the fire alarm more accurate and timely. The proposed method can effectively tolerate failures in the monitoring data, deal robustly with conflicting evidence from multiple sensors, and significantly improve the reliability of fire warning.
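The fusion step described above rests on Dempster's rule of combination: masses of intersecting focal elements are multiplied and the result is renormalised by the non-conflicting mass. A minimal sketch, with hypothetical sensor readings (the mass values below are illustrative, not the paper's data):

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    k = 1.0 - conflict  # normalisation constant
    return {h: v / k for h, v in combined.items()}

FIRE, NO_FIRE = frozenset({"fire"}), frozenset({"no_fire"})
THETA = FIRE | NO_FIRE  # frame of discernment (full uncertainty)

# Hypothetical mass assignments from two sensors
m_smoke = {FIRE: 0.7, NO_FIRE: 0.1, THETA: 0.2}
m_temp = {FIRE: 0.6, NO_FIRE: 0.2, THETA: 0.2}

fused = combine(m_smoke, m_temp)  # belief in FIRE rises to 0.85
```

Because each sensor leaves some mass on the whole frame THETA, the rule lets agreeing evidence reinforce while residual uncertainty shrinks, which is what makes the fused alarm both more accurate and more timely than either sensor alone.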

  6. Little Hans and attachment theory: Bowlby's hypothesis reconsidered in light of new evidence from the Freud Archives.

    Science.gov (United States)

    Wakefield, Jerome C

    2007-01-01

    Bowlby (1973), applying attachment theory to Freud's case of Little Hans, hypothesized that Hans's anxiety was a manifestation of anxious attachment. However, Bowlby's evidence was modest: Hans was threatened by his mother with abandonment, expressed fear of abandonment prior to symptom onset, and had been separated from his mother for a short time a year before. Bowlby's hypothesis is reassessed in light of a systematic review of the case record as well as new evidence from recently derestricted interviews with Hans's father and Hans in the Freud Archives. Bowlby's hypothesis is supported by multiple additional lines of evidence regarding both triggers of separation anxiety preceding the phobia (e.g., a funeral, sibling rivalry, moving, getting his own bedroom) and background factors influencing his working model of attachment (mother's psychopathology, intense marital conflict, multiple suicides in mother's family) that would make him more vulnerable to such anxiety. Bowlby's hypothesis is also placed within the context of subsequent developments in attachment theory.

  7. Managing resistance with multiple pesticide tactics: theory, evidence, and recommendations.

    Science.gov (United States)

    Tabashnik, B E

    1989-10-01

    Sequences, mixtures, rotations, and mosaics are potential strategies for using more than one pesticide to manage pest populations and for slowing the evolution of pesticide resistance. Results from theoretical models suggest that, under certain conditions, mixtures might be especially effective for resistance management. The assumptions of such models, however, are probably not widely applicable. Potential disadvantages associated with mixtures that are usually not considered in modeling studies include disruption of biological control, promotion of resistance in secondary pests, and intense selection for cross-resistance. Results from limited experimental work suggest that pesticide combinations do not consistently suppress resistance development. More thorough evaluation of tactics that seek to optimize benefits of more than one insecticide will require rigorous experiments with the particular pest and pesticide combinations. Because of the difficulty in generalizing results across systems and the potential negative effects of multiple insecticide use, emphasis on minimizing insecticide use is recommended.

  8. International Migration with Heterogeneous Agents: Theory and Evidence

    DEFF Research Database (Denmark)

    Schröder, Philipp J.H.; Brücker, Herbert

    Temporary migration, though empirically relevant, is often ignored in formal models. This paper proposes a migration model with heterogeneous agents and persistent cross country income differentials that features temporary migration. In equilibrium there exists a positive relation between the stock...... of migrants and the income differential, while the net migration flow becomes zero. Consequently, existing empirical migration models, estimating net migration flows, instead of stocks, may be misspecified. This suspicion appears to be confirmed by our investigation of the cointegration relationships...... of German migration stocks and flows since 1967. We find that (i) panel-unit root tests reject the hypothesis that migration flows and the explanatory variables are integrated of the same order, while migration stocks and the explanatory variables are all I(1) variables, and (ii) the hypothesis...

  9. Aligning Theory and Design: The Development of an Online Learning Intervention to Teach Evidence-based Practice for Maximal Reach.

    Science.gov (United States)

    Delagran, Louise; Vihstadt, Corrie; Evans, Roni

    2015-09-01

    Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.

  10. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    pp. 669–673. Anisotropic cosmological models and generalized scalar tensor theory. Subenoy Chakraborty, Batul Chandra Santra and ... Keywords: anisotropic cosmological models; general scalar tensor theory; inflation. PACS Nos: 98.80.Hw; 04.50.+h; 98.80.Cq. 1. Introduction. Brans–Dicke theory [1] (BD ...

  11. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections...

  12. Mobile Money, Trade Credit and Economic Development : Theory and Evidence

    NARCIS (Netherlands)

    Beck, T.H.L.; Pamuk, H.; Uras, R.B.; Ramrattan, R.

    2015-01-01

    Using a novel enterprise survey from Kenya (FinAccess Business), we document a strong positive association between the use of mobile money as a method to pay suppliers and access to trade credit. We develop a dynamic general equilibrium model with heterogeneous entrepreneurs, imperfect credit

  13. Financial Structure and Macroeconomic Volatility : Theory and Evidence

    NARCIS (Netherlands)

    Huizinga, H.P.; Zhu, D.

    2006-01-01

    This paper presents a simple model capturing differences between debt and equity finance to examine how financial structure matters for macroeconomic volatility. Debt finance is relatively cheap in the sense that debt holders need to verify relatively few profitability states, but debt finance may

  14. Social capital: theory, evidence, and implications for oral health.

    Science.gov (United States)

    Rouxel, Patrick L; Heilmann, Anja; Aida, Jun; Tsakos, Georgios; Watt, Richard G

    2015-04-01

    In the last two decades, there has been increasing application of the concept of social capital in various fields of public health, including oral health. However, social capital is a contested concept with debates on its definition, measurement, and application. This study provides an overview of the concept of social capital, highlights the various pathways linking social capital to health, and discusses the potential implication of this concept for health policy. An extensive and diverse international literature has examined the relationship between social capital and a range of general health outcomes across the life course. A more limited but expanding literature has also demonstrated the potential influence of social capital on oral health. Much of the evidence in relation to oral health is limited by methodological shortcomings mainly related to the measurement of social capital, cross-sectional study designs, and inadequate controls for confounding factors. Further research using stronger methodological designs should explore the role of social capital in oral health and assess its potential application in the development of oral health improvement interventions. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2015-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  16. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2017-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  17. II. Model building: an electrical theory of control of growth and development in animals, prompted by studies of exogenous magnetic field effects (paper I), and evidence of DNA current conduction, in vitro.

    Science.gov (United States)

    Elson, Edward

    2009-01-01

    A theory of control of cellular proliferation and differentiation in the early development of metazoan systems, postulating a system of electrical controls "parallel" to the processes of molecular biochemistry, is presented. It is argued that the processes of molecular biochemistry alone cannot explain how a developing organism defies a stochastic universe. The demonstration of current flow (charge transfer) along the long axis of DNA through the base-pairs (the "pi-way") in vitro raises the question of whether nature may employ such current flows for biological purposes. Such currents might be too small to be accessible to direct measurement in vivo but conduction has been measured in vitro, and the methods might well be extended to living systems. This has not been done because there is no reasonable model which could stimulate experimentation. We suggest several related, but detachable or independent, models for the biological utility of charge transfer, whose scope admittedly outruns current concepts of thinking about organization, growth, and development in eukaryotic, metazoan systems. The ideas are related to explanations proposed to explain the effects demonstrated on tumors and normal tissues described in Article I (this issue). Microscopic and mesoscopic potential fields and currents are well known at sub-cellular, cellular, and organ systems levels. Not only are such phenomena associated with internal cellular membranes in bioenergetics and information flow, but remarkable long-range fields over tissue interfaces and organs appear to play a role in embryonic development (Nuccitelli, 1992). The origin of the fields remains unclear and is the subject of active investigation.
We are proposing that similar processes could play a vital role at a "sub-microscopic level," at the level of the chromosomes themselves, and could play a role in organizing and directing fundamental processes of growth and development, in parallel with the more discernible fields and

  18. Aging population and public pensions: Theory and macroeconometric evidence

    Directory of Open Access Journals (Sweden)

    Verbič Miroslav

    2014-01-01

    Rapidly aging population in high-income countries has exerted additional pressure on the sustainability of public pension expenditure. We present a theoretical model of public pension expenditure under endogenous human capital, where the latter facilitates a substantial decrease in equilibrium fertility rate alongside the improvement in life expectancy. We demonstrate how higher life expectancy and human capital endowment facilitate a rise of net replacement rate. We then provide and examine an empirical model of old-age expenditure in a panel of 33 countries for the period 1998-2008. Our results indicate that increases in effective retirement age and total fertility rate would reduce age-related expenditure substantially. While higher net replacement rate would alleviate the risk of old-age poverty, further increases would add considerable pressure on the fiscal sustainability of public pensions.

  19. Gender discrimination and growth: theory and evidence from India

    OpenAIRE

    Berta Esteve-Volart

    2004-01-01

    Gender inequality is an acute and persistent problem, especially in developing countries. This paper argues that gender discrimination is an inefficient practice. We model gender discrimination as the complete exclusion of females from the labor market or as the exclusion of females from managerial positions. The distortions in the allocation of talent between managerial and unskilled positions, and in human capital investment, are analyzed. It is found that both types of discrimination lower...

  20. Dynamic Forecasting Behavior by Analysts: Theory and Evidence

    OpenAIRE

    Ajay Subramanian; Jonathan Clarke

    2004-01-01

    We examine the dynamic forecasting behavior of security analysts in response to their prior performance relative to their peers within a continuous-time/multi-period framework. Our model predicts a U-shaped relationship between the boldness of an analyst's forecast, that is, the deviation of her forecast from the consensus, and her prior relative performance. In other words, analysts who significantly outperform or underperform their peers issue bolder forecasts than intermediate performers....

  1. THE AGGREGATE IMPLICATIONS OF MACHINE REPLACEMENT: THEORY AND EVIDENCE

    OpenAIRE

    John Haltiwanger; Russell Cooper

    1992-01-01

    The authors study an economy in which producers incur resource costs to replace depreciated machines. The process of costly replacement and depreciation creates endogenous fluctuations in productivity, employment, and output of a single producer. The authors explore the spillover effects of machine replacement on other sectors of the economy and provide conditions for synchronized machine replacement by multiple independent producers. The implications of their model are generally consistent w...

  2. The pipe model theory half a century on: a review.

    Science.gov (United States)

    Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick

    2018-01-23

    More than a half century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the moment of its publication: (1) What determines tree form and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass in an individual plant, in a stand or at even larger scales be estimated? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount rather than the whole of it. The facts that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might cause the commonly
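The core PMT proportionality, supported leaf area scaling linearly with the conductive (sapwood) portion of the stem cross-section, can be sketched numerically. A minimal illustration in which the allometric ratio `k` and the sapwood fraction are hypothetical placeholder values, not figures from the review:

```python
import math

def leaf_area_from_sapwood(stem_diameter_cm, sapwood_fraction, k=0.4):
    """Pipe-model estimate: leaf area proportional to conductive sapwood area.

    k is the leaf area (m^2) supported per cm^2 of sapwood; both k and
    sapwood_fraction are illustrative and species- and site-specific in practice.
    """
    stem_area_cm2 = math.pi * (stem_diameter_cm / 2.0) ** 2
    return k * sapwood_fraction * stem_area_cm2

# A 20 cm diameter stem with 30% conductive sapwood:
print(round(leaf_area_from_sapwood(20.0, 0.3), 1))  # -> 37.7
```

Restricting the proportionality to the conductive fraction rather than the whole cross-section reflects the hydraulic reading of the PMT discussed in the review.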

  3. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables, (x_t, w_t), then selection over the second set by their statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.

  4. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  5. Irreducible integrable theories form tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  6. Intervention planning for a digital intervention for self-management of hypertension: a theory-, evidence- and person-based approach.

    Science.gov (United States)

    Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy

    2017-02-23

    This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). 
Causal

  7. The Economic Importance of Financial Literacy: Theory and Evidence.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S

    2014-03-01

    This paper undertakes an assessment of a rapidly growing body of economic research on financial literacy. We start with an overview of theoretical research which casts financial knowledge as a form of investment in human capital. Endogenizing financial knowledge has important implications for welfare as well as policies intended to enhance levels of financial knowledge in the larger population. Next, we draw on recent surveys to establish how much (or how little) people know and identify the least financially savvy population subgroups. This is followed by an examination of the impact of financial literacy on economic decision-making in the United States and elsewhere. While the literature is still young, conclusions may be drawn about the effects and consequences of financial illiteracy and what works to remedy these gaps. A final section offers thoughts on what remains to be learned if researchers are to better inform theoretical and empirical models as well as public policy.

  8. Remembering and Voting: Theory and Evidence from Amnesic Patients.

    Science.gov (United States)

    Coronel, Jason C; Duff, Melissa C; Warren, David E; Federmeier, Kara D; Gonsalves, Brian D; Tranel, Daniel; Cohen, Neal J

    2012-10-01

    One of the most prominent claims to emerge from the field of public opinion is that citizens can vote for candidates whose issue positions best reflect their own beliefs even when they cannot remember previously learned stances associated with the candidates. The current experiment provides a unique and powerful examination of this claim by determining whether individuals with profound amnesia, whose severe memory impairments prevent them from remembering specific issue information associated with any particular candidate, can vote for candidates whose issue positions come closest to their own political views. We report here that amnesic patients, despite not being able to remember any issue information, consistently voted for candidates with favored political positions. Thus, sound voting decisions do not require recall or recognition of previously learned associations between candidates and their issue positions. This result supports a multiple memory systems model of political decision making.

  9. Taking Root: a grounded theory on evidence-based nursing implementation in China.

    Science.gov (United States)

    Cheng, L; Broome, M E; Feng, S; Hu, Y

    2017-08-02

    Evidence-based nursing is widely recognized as the critical foundation for quality care. To develop a middle-range theory on the process of evidence-based nursing implementation in Chinese context. A grounded theory study using unstructured in-depth individual interviews was conducted with 56 participants who were involved in 24 evidence-based nursing implementation projects in Mainland China from September 2015 to September 2016. A middle-range grounded theory of 'Taking Root' was developed. The theory describes the evidence implementation process consisting of four components (driving forces, process, outcome, sustainment/regression), three approaches (top-down, bottom-up and outside-in), four implementation strategies (patient-centred, nurses at the heart of change, reaching agreement, collaboration) and two patterns (transformational and adaptive implementation). Certain perspectives may have not been captured, as the retrospective nature of the interviewing technique did not allow for 'real-time' assessment of the actual implementation process. The transferability of the findings requires further exploration as few participants with negative experiences were recruited. This is the first study that explored evidence-based implementation process, strategies, approaches and patterns in the Chinese nursing practice context to inform international nursing and health policymaking. The theory of Taking Root described various approaches to evidence implementation and how the implementation can be transformational for the nurses and the setting in which they work. Nursing educators, managers and researchers should work together to improve nurses' readiness for evidence implementation. Healthcare systems need to optimize internal mechanisms and external collaborations to promote nursing practice in line with evidence and achieve clinical outcomes and sustainability. © 2017 International Council of Nurses.

  10. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the economic model is related to econometric concepts of exogeneity. The economic equilibrium corresponds to the so-called long-run value (Johansen 2005), the long-run impact matrix, C, captures the comparative statics, and the exogenous variables are the common trends. The adjustment parameters of the CVAR...

  11. Autonomous mathematical models: constructing theories of metabolic control.

    Science.gov (United States)

    Donaghy, Josephine

    2013-01-01

    This paper considers how the relationship between mathematical models and theories in biology may change over time, on the basis of a historical analysis of the development of a mathematical model of metabolism, metabolic control analysis, and its relationship to theories of metabolic control. I argue that one can distinguish two ways of characterising the relationship between models and theories, depending on the stage of model and/or theory development that one is considering: partial independence and autonomy. Partial independence describes a model's relationship with existing theory, thus referring to relationships that have already been established between model and theory during model construction. By contrast, autonomy is a feature of relationships which may become established between model and theory in the future, and is expressed by a model's open ended role in constructing emerging theory. These characteristics have often been conflated by existing philosophical accounts, partly because they can only be identified and analysed when adopting a historical perspective on scientific research. Adopting a clear distinction between partial independence and autonomy improves philosophical insight into the changing relationship between models and theories.

  12. First and second order approximate reliability analysis methods using evidence theory

    International Nuclear Information System (INIS)

    Zhang, Z.; Jiang, C.; Wang, G.G.; Han, X.

    2015-01-01

    The first order approximate reliability method (FARM) and second order approximate reliability method (SARM) are formulated based on evidence theory in this paper. The proposed methods can significantly improve the computational efficiency for evidence-theory-based reliability analysis, while generally provide sufficient precision. First, the most probable focal element (MPFE), an important concept as the most probable point (MPP) in probability-theory-based reliability analysis, is searched using a uniformity approach. Subsequently, FARM approximates the limit-state function around the MPFE using the linear Taylor series, while SARM approximates it using the quadratic Taylor series. With the first and second order approximations, the reliability interval composed of the belief measure and the plausibility measure is efficiently obtained for FARM and SARM, respectively. Two simple problems with explicit expressions and one engineering application of vehicle frontal impact are presented to demonstrate the effectiveness of the proposed methods. - Highlights: • The first order approximate reliability method using evidence theory is proposed. • The second order approximate reliability method using evidence theory is proposed. • The proposed methods can significantly improve the computational efficiency. • The proposed methods can provide sufficient accuracy for general engineering problems
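The reliability interval of belief and plausibility measures that FARM and SARM approximate can be computed exactly for small problems by enumerating focal elements. A minimal sketch, assuming interval-valued joint focal elements and a limit-state function g that is monotonic over each box, so that vertex evaluation bounds it exactly (standing in for the paper's Taylor-series approximations); the masses and g below are illustrative, not taken from the paper:

```python
from itertools import product

def belief_plausibility(g, focal_elements):
    """focal_elements: list of (mass, [(lo, hi), ...]) joint interval boxes.

    Returns (Bel, Pl) for the safe event g(x) >= 0. Vertex enumeration
    bounds g exactly only when g is monotonic over each box.
    """
    bel = pl = 0.0
    for mass, box in focal_elements:
        vals = [g(v) for v in product(*box)]   # g at every vertex of the box
        if min(vals) >= 0:   # whole focal element safe -> contributes to belief
            bel += mass
        if max(vals) >= 0:   # some part of the element safe -> plausibility
            pl += mass
    return bel, pl

# Illustrative limit state g(x, y) = x + y - 1 with two joint focal elements:
g = lambda v: v[0] + v[1] - 1
fes = [(0.6, [(0.4, 0.8), (0.5, 0.9)]),
       (0.4, [(0.6, 1.0), (0.6, 1.0)])]
bel, pl = belief_plausibility(g, fes)
print(bel, pl)  # -> 0.4 1.0
```

The gap between Bel and Pl here (0.4 versus 1.0) is exactly the imprecision that interval evidence leaves unresolved, which the FARM/SARM approximations aim to bound cheaply.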

  13. A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators

    Science.gov (United States)

    Bowen, Danny Ray

    2012-01-01

    The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…

  14. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster–Shafer Evidence Theory

    Science.gov (United States)

    Cao, Ying; Yang, Lin; He, Zichang

    2017-01-01

    Specific emitter identification plays an important role in contemporary military affairs. However, most of the existing specific emitter identification methods haven’t taken into account the processing of uncertain information. Therefore, this paper proposes a time–space domain information fusion method based on Dempster–Shafer evidence theory, which has the ability to deal with uncertain information in the process of specific emitter identification. In this paper, radars will generate a group of evidence respectively based on the information they obtained, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between evidence and a quantum mechanical approach, which is based on the parameters of radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and get a reasonable recognition result. PMID:28846629

  15. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    Science.gov (United States)

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most of the existing specific emitter identification methods haven't taken into account the processing of uncertain information. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which has the ability to deal with uncertain information in the process of specific emitter identification. In this paper, radars will generate a group of evidence respectively based on the information they obtained, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between evidence and a quantum mechanical approach, which is based on the parameters of radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and get a reasonable recognition result.
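The fusion step these two records describe rests on the Dempster-Shafer combination rule. A minimal sketch of the classical rule only; the correlation-coefficient weighting and quantum-mechanical modification proposed in the paper are not reproduced, and the radar mass assignments are illustrative:

```python
def dempster_combine(m1, m2):
    """Classical Dempster's rule: masses are dicts frozenset -> BPA."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass committed to contradictory pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Two radars reporting on emitter types {A, B}:
m1 = {frozenset("A"): 0.7, frozenset("B"): 0.1, frozenset("AB"): 0.2}
m2 = {frozenset("A"): 0.6, frozenset("B"): 0.2, frozenset("AB"): 0.2}
fused = dempster_combine(m1, m2)
print(round(fused[frozenset("A")], 2))  # -> 0.85
```

Both radars lean toward A, so combination sharpens that belief; the known failure modes the abstract alludes to (high conflict, one-vote veto) arise when the conflict term approaches 1.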

  16. Decision-Making Algorithm for Multisensor Fusion Based on Grey Relation and DS Evidence Theory

    Directory of Open Access Journals (Sweden)

    Fang Ye

    2016-01-01

    Decision-making algorithm, as the key technology for uncertain data fusion, is the core to obtain reasonable multisensor information fusion results. DS evidence theory is a typical and widely applicable decision-making method. However, DS evidence theory makes decisions without considering the sensors’ difference, which may lead to illogical results. In this paper, we present a novel decision-making algorithm for uncertain fusion based on grey relation and DS evidence theory. The proposed algorithm comprehensively takes consideration of sensor’s credibility and evidence’s overall discriminability, which can solve the uncertainty problems caused by inconsistence of sensors themselves and complexity of monitoring environment and simultaneously ensure the validity and accuracy of fusion results. The innovative decision-making algorithm firstly obtains the sensor’s credibility through the introduction of grey relation theory and then defines two impact factors as sensor’s credibility and evidence’s overall discriminability according to the focal element analyses and evidence’s distance analysis, respectively; after that, it uses the impact factors to modify the evidences and finally gets more reasonable and effective results through DS combination rule. Simulation results and analyses demonstrate that the proposed algorithm can overcome the trouble caused by large evidence conflict and one-vote veto, which indicates that it can improve the ability of target judgment and enhance precision of uncertain data fusion. Thus the novel decision-making method has a certain application value.
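The "modify the evidences" step described above resembles classical Shafer discounting: each body of evidence is scaled by its source's credibility, with the discounted mass moved to the full frame of discernment. A minimal sketch in which the credibility weight 0.9 is an illustrative stand-in for the grey-relation score the paper derives:

```python
def discount(mass, alpha, frame):
    """Shafer discounting: keep alpha of each focal mass, move the rest to frame."""
    out = {h: alpha * v for h, v in mass.items() if h != frame}
    out[frame] = 1.0 - alpha * (1.0 - mass.get(frame, 0.0))
    return out

# A sensor reporting on targets {T1, T2} with credibility 0.9:
frame = frozenset({"T1", "T2"})
m = {frozenset({"T1"}): 0.8, frozenset({"T2"}): 0.2}
discounted = discount(m, 0.9, frame)
print(round(discounted[frozenset({"T1"})], 2), round(discounted[frame], 2))  # -> 0.72 0.1
```

After discounting, the modified bodies of evidence can be fused with the usual DS combination rule; moving mass to the frame rather than to rival hypotheses is what keeps an unreliable sensor from vetoing the others.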

  17. Using the theory of planned behaviour to predict nurses' intention to integrate research evidence into clinical decision-making.

    Science.gov (United States)

    Côté, Françoise; Gagnon, Johanne; Houme, Philippe Kouffé; Abdeljelil, Anis Ben; Gagnon, Marie-Pierre

    2012-10-01

    Using an extended theory of planned behaviour, this article is a report of a study to identify the factors that influence nurses' intention to integrate research evidence into their clinical decision-making. Health professionals are increasingly asked to adopt evidence-based practice. The integration of research evidence in nurses' clinical decision-making would have an important impact on the quality of care provided for patients. Despite evidence supporting this practice and the availability of high quality research in the field of nursing, the gap between research and practice is still present. A predictive correlational study. A total of 336 nurses working in a university hospital participated in this research. Data were collected in February and March 2008 by means of a questionnaire based on an extension of the theory of planned behaviour. Descriptive statistics of the model variables, Pearson correlations between all the variables and multiple linear regression analysis were performed. Nurses' intention to integrate research findings into clinical decision-making can be predicted by moral norm, normative beliefs, perceived behavioural control and past behaviour. The moral norm is the most important predictor. Overall, the final model explains 70% of the variance in nurses' intention. The present study supports the use of an extended psychosocial theory for identifying the determinants of nurses' intention to integrate research evidence into their clinical decision-making. Interventions that focus on increasing nurses' perceptions that using research is their responsibility for ensuring good patient care and providing a supportive environment could promote an evidence-based nursing practice. © 2012 Blackwell Publishing Ltd.

  18. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension, as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  19. Domain Theory, Its Models and Concepts

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt

    2014-01-01

    Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contribution...

  20. Information Theory: a Multifaceted Model of Information

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2003-06-01

Full Text Available A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new approach, called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and axiological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are given by Shannon's quantity of information, the algorithmic quantity of information, or the volume of information. It is demonstrated that all other known directions of information theory may be treated within the general theory of information as particular cases.

  1. An intervention modelling experiment to change GPs' intentions to implement evidence-based practice: using theory-based interventions to promote GP management of upper respiratory tract infection without prescribing antibiotics #2.

    Science.gov (United States)

    Hrisos, Susan; Eccles, Martin; Johnston, Marie; Francis, Jill; Kaner, Eileen F S; Steen, Nick; Grimshaw, Jeremy

    2008-01-14

    Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). The design was a 2 x 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected these beliefs. 
One intervention also significantly enhanced GPs

  2. An intervention modelling experiment to change GPs' intentions to implement evidence-based practice: using theory-based interventions to promote GP management of upper respiratory tract infection without prescribing antibiotics #2

    Directory of Open Access Journals (Sweden)

    Kaner Eileen FS

    2008-01-01

Full Text Available Abstract Background Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). Methods The design was a 2 × 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Results Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention, GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). Conclusion GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected

  3. An Evolutionary Game Theory Model of Spontaneous Brain Functioning.

    Science.gov (United States)

    Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano

    2017-11-22

Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value for individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making their understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture the brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in the BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of an initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
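A minimal feel for game dynamics on a graph, in the spirit of the EGN framework this record extends, can be given by a toy two-strategy replicator equation. Everything below (adjacency, payoff matrix, initial state, step size) is invented for illustration and is not the authors' model:

```python
import numpy as np

# Toy networked replicator dynamic: each node holds a mixed strategy
# and adjusts it according to payoffs against its neighbours' play.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)    # four "regions" (invented)
B = np.array([[1.0, 0.0],
              [0.8, 0.2]])                   # invented 2x2 payoff matrix

x = np.array([0.9, 0.5, 0.3, 0.1])           # P(strategy 1) per node

def step(x, dt=0.05):
    """One Euler step: dx_i = x_i (1 - x_i) * (payoff gap at node i)."""
    s = np.column_stack([x, 1 - x])          # mixed strategy per node
    m = A @ s                                # aggregate neighbour play
    p = m @ B.T                              # payoff of each pure strategy
    return np.clip(x + dt * x * (1 - x) * (p[:, 0] - p[:, 1]), 0.0, 1.0)

for _ in range(200):
    x = step(x)
print(np.all((x >= 0) & (x <= 1)))
```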

  4. Big bang models in string theory

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes Pleinlaan 2, B-1050 Brussels (Belgium)

    2006-11-07

    These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.

  5. The Current Evidence for Hayek’s Cultural Group Selection Theory

    Directory of Open Access Journals (Sweden)

    Brad Lowell Stone

    2010-12-01

    Full Text Available In this article I summarize Friedrich Hayek’s cultural group selection theory and describe the evidence gathered by current cultural group selection theorists within the behavioral and social sciences supporting Hayek’s main assertions. I conclude with a few comments on Hayek and libertarianism.

  6. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  7. Theories, models and urban realities. From New York to Kathmandu

    OpenAIRE

    Román Rodríguez González

    2004-01-01

At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have undergone obvious changes throughout the last century, in line with the important transformations proposed by these general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves f...

  8. Spatial data modelling and maximum entropy theory

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2005-01-01

    Roč. 51, č. 2 (2005), s. 80-83 ISSN 0139-570X Institutional research plan: CEZ:AV0Z10750506 Keywords : spatial data classification * distribution function * error distribution Subject RIV: BD - Theory of Information

  9. Electroweak theory and the Standard Model

    CERN Multimedia

    CERN. Geneva; Giudice, Gian Francesco

    2004-01-01

    There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.

  10. Statistical Learning Theory: Models, Concepts, and Results

    OpenAIRE

    von Luxburg, Ulrike; Schoelkopf, Bernhard

    2008-01-01

Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.

  11. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  12. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  13. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  14. Glass Durability Modeling, Activated Complex Theory (ACT)

    International Nuclear Information System (INIS)

    CAROL, JANTZEN

    2005-01-01

atomic ratios is shown to represent the structural effects of the glass on the dissolution and the formation of activated complexes in the glass leached layer. This provides two different methods by which a linear glass durability model can be formulated: one based on the quasi-crystalline mineral species in a glass and one based on cation ratios in the glass; both are related to the activated complexes on the surface by the law of mass action. The former would allow a new Thermodynamic Hydration Energy Model to be developed based on the hydration of the quasi-crystalline mineral species if all the pertinent thermodynamic data were available. Since the pertinent thermodynamic data are not available, the quasi-crystalline mineral species and the activated complexes can be related to cation ratios in the glass by the law of mass action. The cation ratio model can thus be used by waste form producers to formulate durable glasses based on fundamental structural and activated-complex theories. Moreover, a glass durability model based on atomic ratios simplifies HLW glass process control in that the measured ratios of only a few waste components and glass formers can be used to predict complex HLW glass performance with a high degree of accuracy, e.g. an R² of approximately 0.97.

  15. Testing static tradeoff theory against pecking order models of capital ...

    African Journals Online (AJOL)

    We test two models with the purpose of finding the best empirical explanation for corporate financing choice of a cross section of 27 Nigerian quoted companies. The models were developed to represent the Static tradeoff Theory and the Pecking order Theory of capital structure with a view to make comparison between ...

  16. A Quantitative Causal Model Theory of Conditional Reasoning

    Science.gov (United States)

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  17. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  18. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

A non-static plane symmetric cosmological model in Wesson's scale invariant theory of gravitation with a time-dependent gauge function is investigated. The false vacuum model of the universe is constructed and some physical properties of the model are discussed.

  19. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  20. Bayesian Model Selection in Geophysics: The evidence

    Science.gov (United States)

    Vrugt, J. A.

    2016-12-01

Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data, D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general-purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses, and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute-force Monte Carlo method and the Laplace-Metropolis method.
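The role of the evidence P(D) in model selection can be illustrated with a one-dimensional toy problem. The data, noise level, and the two rival priors below are invented; the point is only that integrating likelihood times prior over the parameter space penalizes an unnecessarily vague model (the Occam factor):

```python
import numpy as np

# Numerical sketch of the evidence P(D) from Bayes' theorem, obtained
# by direct integration of likelihood x prior over the parameter space.
data = np.array([1.9, 2.1, 2.0, 1.8, 2.2])   # invented observations
sigma = 0.2                                  # assumed Gaussian error

def log_like(mu):
    return (-0.5 * np.sum((data - mu) ** 2) / sigma**2
            - data.size * np.log(sigma * np.sqrt(2 * np.pi)))

def evidence(lo, hi, n=20_000):
    """Evidence under a uniform prior on [lo, hi].

    With prior = 1/(hi - lo), the integral of likelihood x prior
    reduces to the prior-averaged likelihood over the interval."""
    mus = np.linspace(lo, hi, n)
    return np.exp(np.array([log_like(m) for m in mus])).mean()

# Two competing hypotheses differing only in their prior range: the
# vague model pays an Occam penalty through its diluted prior.
pD_tight = evidence(1.0, 3.0)
pD_vague = evidence(-10.0, 10.0)
print(pD_tight > pD_vague)  # the tighter model has higher evidence
```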

  1. Model-Based Learning: A Synthesis of Theory and Research

    Science.gov (United States)

    Seel, Norbert M.

    2017-01-01

    This article provides a review of theoretical approaches to model-based learning and related research. In accordance with the definition of model-based learning as an acquisition and utilization of mental models by learners, the first section centers on mental model theory. In accordance with epistemology of modeling the issues of semantics,…

  2. Operational Risk Assessment of Distribution Network Equipment Based on Rough Set and D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Cunbin Li

    2013-01-01

Full Text Available With the increasing complication, compaction, and automation of distribution network equipment, a small failure can trigger a chain reaction and lead to operational risk in the power distribution system, and even in the whole power system. Therefore, scientific assessment of power distribution equipment operational risk is significant to the security of the power distribution system. In order to draw satisfactory assessment conclusions from both complete and incomplete information and to improve the assessment level, an operational risk assessment model of distribution network equipment based on rough set and D-S evidence theory was built. In this model, rough set theory was used to simplify and optimize the operational risk assessment indexes of distribution network equipment, and D-S evidence theory was adopted to combine the optimal indexes. Finally, the equipment operational risk level was obtained from the basic probability assignment decision. Taking the transformer as an example, this paper compares the assessment result obtained from the proposed method with that from the ordinary Rogers ratio method and discusses the application of the proposed method. The comparison shows that the proposed method is feasible and efficient, and provides a new way to assess distribution network equipment operational risk.
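The evidence-combination step this model relies on is Dempster's rule. The sketch below shows the rule itself; the risk frame and the two mass functions (standing in for two groups of optimized indexes) are invented for illustration:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments,
    given as dicts mapping frozenset (focal element) -> mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                 # mass assigned to conflict
    k = 1.0 - conflict                        # normalization constant
    return {s: v / k for s, v in combined.items()}

L, M, H = frozenset({"low"}), frozenset({"medium"}), frozenset({"high"})
theta = L | M | H                             # frame of discernment

m1 = {L: 0.6, M: 0.3, theta: 0.1}             # invented index group 1
m2 = {L: 0.5, H: 0.3, theta: 0.2}             # invented index group 2

m = combine(m1, m2)
print(max(m, key=m.get) == L)                 # fused evidence favours "low"
```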

  3. Combining morphometric evidence from multiple registration methods using dempster-shafer theory

    Science.gov (United States)

    Rajagopalan, Vidya; Wyatt, Christopher

    2010-03-01

In tensor-based morphometry (TBM), group-wise differences in brain structure are measured using high degree-of-freedom registration and some form of statistical test. However, it is known that TBM results are sensitive to both the registration method and the statistical test used. Given the lack of an objective model of group variation, it is difficult to determine a best registration method for TBM. The use of statistical tests is also problematic given the corrections required for multiple testing and the notorious difficulty of selecting and interpreting significance values. This paper presents an approach that addresses both of these issues by combining multiple registration methods using Dempster-Shafer evidence theory to produce belief maps of categorical changes between groups. This approach is applied to the comparison of brain morphometry in aging, a typical application of TBM, using the determinant of the Jacobian as a measure of volume change. We show that the Dempster-Shafer combination produces a unique and easy-to-interpret belief map of regional changes between and within groups without the complications associated with hypothesis testing.
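Given a fused mass function at a voxel, the belief map reports two standard Dempster-Shafer measures. The frame (growth vs. shrinkage) follows the volume-change setting above, but the mass values below are invented:

```python
def belief(m, hypothesis):
    """Total mass of focal elements wholly inside the hypothesis."""
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Total mass of focal elements consistent with the hypothesis."""
    return sum(v for s, v in m.items() if s & hypothesis)

GROW, SHRINK = frozenset({"growth"}), frozenset({"shrinkage"})
theta = GROW | SHRINK   # frame: categorical volume change at a voxel

# Mass assignment fused from several registration methods (invented):
m = {GROW: 0.1, SHRINK: 0.6, theta: 0.3}

# Belief is a lower bound, plausibility an upper bound, on support.
print(round(belief(m, SHRINK), 2), round(plausibility(m, SHRINK), 2))
```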

  4. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website.

    Science.gov (United States)

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L

    2016-06-10

    According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes

  5. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  6. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    Science.gov (United States)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.
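As a minimal illustration of the Volterra framework the presentation describes, the sketch below convolves a first-order kernel with an impulse input. The kernel and input are invented; in the actual application the kernels are identified from CAP-TSD responses, and higher-order kernels capture the nonlinear part:

```python
import numpy as np

# First-order (linear) discrete Volterra model: the response is the
# convolution of an identified kernel with the input.
n = 50
h1 = 0.8 ** np.arange(n)        # assumed decaying first-order kernel
u = np.zeros(n)
u[0] = 1.0                      # unit impulse input

y = np.convolve(h1, u)[:n]      # first-order Volterra response
print(np.allclose(y, h1))       # impulse response recovers the kernel
```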

  7. Dimensional reduction of Markov state models from renormalization group theory

    Science.gov (United States)

    Orioli, S.; Faccioli, P.

    2016-09-01

Renormalization Group (RG) theory provides the theoretical framework to define rigorous effective theories, i.e., systematic low-resolution approximations of arbitrary microscopic models. Markov state models are shown to be rigorous effective theories for Molecular Dynamics (MD). Based on this fact, we use real space RG to vary the resolution of the stochastic model and define an algorithm for clustering microstates into macrostates. The result is a lower dimensional stochastic model which, by construction, provides the optimal coarse-grained Markovian representation of the system's relaxation kinetics. To illustrate and validate our theory, we analyze a number of test systems of increasing complexity, ranging from synthetic toy models to two realistic applications built from all-atom MD simulations. The computational cost of computing the low-dimensional model remains affordable on a desktop computer even for thousands of microstates.
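The idea of lumping microstates into macrostates can be illustrated with a simple spectral clustering of a toy transition matrix. This is a PCCA-style stand-in, not the paper's RG-based algorithm, and the matrix is invented: two metastable blocks ({0,1} and {2,3}) with weak coupling between them:

```python
import numpy as np

# Invented symmetric, row-stochastic transition matrix with two
# metastable blocks; the slowest non-stationary eigenvector separates
# them, and its sign structure defines the macrostate lumping.
T = np.array([[0.90, 0.09, 0.01, 0.00],
              [0.09, 0.90, 0.01, 0.00],
              [0.01, 0.01, 0.89, 0.09],
              [0.00, 0.00, 0.09, 0.91]])

w, v = np.linalg.eigh(T)        # symmetric toy matrix: real spectrum
phi2 = v[:, -2]                 # eigenvector of the 2nd-largest eigenvalue

macro = (phi2 > 0).astype(int)  # sign structure gives the 2-state lumping
print(macro[0] == macro[1] and macro[2] == macro[3] and macro[0] != macro[2])
```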

  8. Substandard model? At last, a good reason to opt for a sexier theory of particle physics

    CERN Multimedia

    Cho, A

    2001-01-01

    According to experimenters at Brookhaven, a tiny discrepancy in the magnetism of the muon may signal a crack in the Standard Model. The deviation could be the first piece of hard evidence for a more complete theory called supersymmetry (1 page).

  9. Optimization models using fuzzy sets and possibility theory

    CERN Document Server

    Orlovski, S

    1987-01-01

    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...

  10. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  11. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
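
The pairing of a dynamically trained Markov chain with an information-theoretic error measure can be sketched as follows; the action names and the toy sequence are hypothetical illustrations, not data from the study:

```python
from collections import defaultdict
import math

def train_markov(sequence):
    """Estimate first-order transition probabilities from an action sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

def mean_surprisal(model, sequence, eps=1e-6):
    """Average per-transition surprisal in bits between the trained model
    and observed play; low values indicate predictable, routinized actions."""
    steps = list(zip(sequence, sequence[1:]))
    total = sum(-math.log2(model.get(a, {}).get(b, eps)) for a, b in steps)
    return total / len(steps)

# Hypothetical action stream: a fully routinized jump-dash loop.
actions = ["jump", "dash", "jump", "dash", "jump", "dash", "jump"]
model = train_markov(actions)
error = mean_surprisal(model, actions)  # 0.0 bits: perfectly predictable
```

A deviation from the practiced loop (e.g. two consecutive "jump" actions) would score a large surprisal, which is one way the gap between the trained model and actual player interaction can be quantified.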

  12. Mechanical regulation of bone regeneration: theories, models, and experiments.

    Science.gov (United States)

    Betts, Duncan Colin; Müller, Ralph

    2014-01-01

    How mechanical forces influence the regeneration of bone remains an open question. Their effect has been demonstrated experimentally, which has allowed mathematical theories of mechanically driven tissue differentiation to be developed. Many simulations driven by these theories have been presented, however, validation of these models has remained difficult due to the number of independent parameters considered. An overview of these theories and models is presented along with a review of experimental studies and the factors they consider. Finally limitations of current experimental data and how this influences modeling are discussed and potential solutions are proposed.

  13. Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory

    Science.gov (United States)

    Pei, Di; Yue, Jianhai; Jiao, Jing

    2017-10-01

    This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to collect bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data and improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
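
Dempster's rule of combination, the core of the D-S fusion step described above, can be sketched as follows; the two sensor mass assignments are hypothetical numbers for illustration, not values from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal elements
    to masses) with Dempster's rule; conflicting mass is renormalized away."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {fs: w / (1.0 - conflict) for fs, w in combined.items()}

# Hypothetical evidence from two vibration sensors about the bearing state:
# N = normal, F = faulty, {N, F} = ignorance (mass not committed).
m_sensor1 = {frozenset("F"): 0.7, frozenset("N"): 0.1, frozenset("NF"): 0.2}
m_sensor2 = {frozenset("F"): 0.6, frozenset("N"): 0.2, frozenset("NF"): 0.2}
fused = dempster_combine(m_sensor1, m_sensor2)
```

Fusing the two sensors concentrates belief on the fault hypothesis (mass 0.85 on F versus 0.7 and 0.6 individually), which is how multi-sensor combination can raise diagnostic confidence.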

  14. [Asbestos and harmful health effects: from denial theories to epidemiological evidence].

    Science.gov (United States)

    di Orio, Ferdinando; Zazzara, Francesca

    2013-01-01

    The recent episode involving Eternit, a factory in Casale Monferrato (Turin, Italy), culminated in February 2012 with a guilty verdict for the owners of the factory. The indiscriminate use of asbestos, however, continues worldwide, despite evidence of increased risk for conditions such as asbestosis and malignant pleural mesothelioma. In this study we investigate the relationship between epidemiological evidence and denial theories, over the decades and until the present time. Many countries in the world still promote the use of asbestos, with a view to profit and globalization but at the expense of public health.

  15. Posterior Predictive Model Checking for Multidimensionality in Item Response Theory

    Science.gov (United States)

    Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip

    2009-01-01

    If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…

  16. Reframing Leadership Pedagogy through Model and Theory Building.

    Science.gov (United States)

    Mello, Jeffrey A.

    1999-01-01

    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  17. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  18. Anisotropic cosmological models in f (R, T) theory of gravitation

    Indian Academy of Sciences (India)

    Bianchi spaces are useful tools for constructing spatially homogeneous and anisotropic cosmological models in general relativity and scalar–tensor theories of gravitation. Adhav [14] obtained exact solutions of the field equations for LRS Bianchi type-I space-time with perfect fluid in the framework of f (R, T) theory of gravitation.

  19. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simple, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  1. Theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena

    Science.gov (United States)

    Jin, Yongmei M.; Wang, Yu U.; Ren, Yang

    2015-12-01

    Pre-martensitic phenomena, also called martensite precursor effects, have been known for decades while yet remain outstanding issues. This paper addresses pre-martensitic phenomena from new theoretical and experimental perspectives. A statistical mechanics-based Grüneisen-type phonon theory is developed. On the basis of deformation-dependent incompletely softened low-energy phonons, the theory predicts a lattice instability and pre-martensitic transition into elastic-phonon domains via 'phonon spinodal decomposition.' The phase transition lifts phonon degeneracy in cubic crystal and has a nature of phonon pseudo-Jahn-Teller lattice instability. The theory and notion of phonon domains consistently explain the ubiquitous pre-martensitic anomalies as natural consequences of incomplete phonon softening. The phonon domains are characterised by broken dynamic symmetry of lattice vibrations and deform through internal phonon relaxation in response to stress (a particular case of Le Chatelier's principle), leading to previously unexplored new domain phenomenon. Experimental evidence of phonon domains is obtained by in situ three-dimensional phonon diffuse scattering and Bragg reflection using high-energy synchrotron X-ray single-crystal diffraction, which observes exotic domain phenomenon fundamentally different from usual ferroelastic domain switching phenomenon. In light of the theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena, currently existing alternative opinions on martensitic precursor phenomena are revisited.

  2. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    Science.gov (United States)

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…

  3. Extended Nambu models: Their relation to gauge theories

    Science.gov (United States)

    Escobar, C. A.; Urrutia, L. F.

    2017-05-01

    Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally bring this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss laws constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.

  4. The Influence of Emotion on Fairness-Related Decision Making: A Critical Review of Theories and Evidence.

    Science.gov (United States)

    Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun

    2017-01-01

    Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.

  5. The Influence of Emotion on Fairness-Related Decision Making: A Critical Review of Theories and Evidence

    Directory of Open Access Journals (Sweden)

    Ya Zheng

    2017-09-01

    Full Text Available Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.

  6. Battery failure model derived from flaw theory

    Science.gov (United States)

    Schulman, I.

    1981-01-01

    A previously derived failure model for battery lifetime is discussed in terms of growth rate of the flaw, distribution of flaw sizes, and number of flaws. Equations are presented for determining the failure model for a nickel cadmium battery.

  7. Modelling Traffic Flows Using Graph Theory

    Directory of Open Access Journals (Sweden)

    Oksana Musyt

    2011-04-01

    Full Text Available The intensive growth of road transport, particularly private transport, in recent years has led to consequences such as increased travel times, more forced stops and traffic accidents, traffic jams on the road network, reduced traffic speeds and a deteriorating urban road network in cities. An effective method for solving these problems is the use of graph theory, whose main characteristics are the reliability, durability and accessibility of both free and loaded networks. Based on this analysis, methods for network optimization are proposed. Article in Russian
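
As a minimal illustration of graph-theoretic analysis of a road network, the sketch below computes shortest travel times over a small hypothetical network with Dijkstra's algorithm; the node names and travel times are invented, not taken from the article:

```python
import heapq

def dijkstra(graph, src):
    """Shortest travel times from src over a weighted road graph,
    given as a dict: node -> {neighbour: travel_time}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]  # min-heap of (time so far, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy road network: direct A->C takes 10 minutes, A->B->C only 7.
graph = {"A": {"B": 4, "C": 10}, "B": {"C": 3}, "C": {}}
times = dijkstra(graph, "A")
```

On a loaded network the edge weights would be congestion-dependent travel times rather than fixed distances, which is where the free-versus-loaded distinction in the abstract enters.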

  8. Models and mechanisms in gauge theories

    International Nuclear Information System (INIS)

    Polyakov, A.M.

    1979-01-01

    Several pieces of information concerning the dynamics of gauge theories are presented. Gauge fields are used for the construction of QCD and QFD. In both cases the most important question is what phases are realized if the gauge group is given. Different possibilities are known: confinement, total spontaneous breakdown, partial spontaneous breakdown and their combinations. Some unknown options also are not excluded. At the moment we have some superficial understanding of the qualitative features of different phases, but we do not know under what circumstances this or that phase is realized

  9. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
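
A one-compartment version of the fluid-analogy idea can be sketched as a first-order differential equation integrated with Euler steps; the construct interpretation and parameter values below are hypothetical, not taken from the paper's model:

```python
def simulate_compartment(inputs, gain, time_constant, dt=1.0):
    """Euler integration of one fluid-analogy 'inventory':
        tau * dy/dt = gain * u(t) - y(t)
    e.g. y = a self-efficacy level that fills in response to an input u
    (such as verbal persuasion) and drains with first-order decay."""
    y, trajectory = 0.0, []
    for u in inputs:
        y += dt * (gain * u - y) / time_constant
        trajectory.append(y)
    return trajectory

# A sustained unit input drives the level toward its steady state, gain * u.
levels = simulate_compartment([1.0] * 200, gain=2.0, time_constant=4.0)
```

Coupling several such compartments so that each one's level feeds another's inflow is one way the reciprocal determinism of SCT can be expressed as a dynamical system.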

  10. Modeling Reusable and Interoperable Faceted Browsing Systems with Category Theory

    OpenAIRE

    Harris, Daniel R.

    2015-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and lightweight ontol...

  11. Foundations of reusable and interoperable facet models using category theory

    OpenAIRE

    Harris, Daniel R.

    2016-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight onto...

  12. Two-matrix models and c = 1 string theory

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong Chuansheng

    1994-05-01

    We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such two-matrix model, correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)

  13. Weighted score-level feature fusion based on Dempster-Shafer evidence theory for action recognition

    Science.gov (United States)

    Zhang, Guoliang; Jia, Songmin; Li, Xiuzhi; Zhang, Xiangyin

    2018-01-01

    The majority of human action recognition methods use a multifeature fusion strategy to improve classification performance, but the contribution of different features to a specific action has not received enough attention. We present an extendible and universal weighted score-level feature fusion method using the Dempster-Shafer (DS) evidence theory based on the pipeline of bag-of-visual-words. First, the partially distinctive samples in the training set are selected to construct the validation set. Then, local spatiotemporal features and pose features are extracted from these samples to obtain evidence information. The DS evidence theory and the proposed rule of survival of the fittest are employed to achieve evidence combination and calculate optimal weight vectors for every feature type belonging to each action class. Finally, the recognition results are deduced via a weighted summation strategy. The performance of the established recognition framework is evaluated on the Penn Action dataset and a subset of the joint-annotated human motion database (sub-JHMDB). The experimental results demonstrate that the proposed feature fusion method adequately exploits the complementarity among multiple features and improves upon most of the state-of-the-art algorithms on the Penn Action and sub-JHMDB datasets.
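
The final weighted-summation step can be sketched as follows; the feature names, scores, and per-class weight vectors are hypothetical placeholders for the ones the paper learns via its evidence-combination procedure:

```python
def weighted_score_fusion(scores, weights):
    """Fuse per-class scores from several feature types using
    class-specific weight vectors; returns (best class index, fused scores)."""
    n_classes = len(next(iter(scores.values())))
    fused = [sum(weights[f][c] * scores[f][c] for f in scores)
             for c in range(n_classes)]
    return max(range(n_classes), key=fused.__getitem__), fused

# Hypothetical scores for three action classes from two feature types.
scores = {"spatiotemporal": [0.2, 0.5, 0.3], "pose": [0.6, 0.3, 0.1]}
weights = {"spatiotemporal": [0.5, 0.8, 0.6], "pose": [0.9, 0.4, 0.5]}
best, fused = weighted_score_fusion(scores, weights)
```

Note how class-specific weights can overturn a naive unweighted vote: here the pose feature's strong, highly weighted score for class 0 outweighs the spatiotemporal feature's preference for class 1.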

  14. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  15. The Number of Atomic Models of Uncountable Theories

    OpenAIRE

    Ulrich, Douglas

    2016-01-01

    We show there exists a complete theory in a language of size continuum possessing a unique atomic model which is not constructible. We also show it is consistent with $ZFC + \aleph_1 < 2^{\aleph_0}$ that there is a complete theory in a language of size $\aleph_1$ possessing a unique atomic model which is not constructible. Finally we show it is consistent with $ZFC + \aleph_1 < 2^{\aleph_0}$ that for every complete theory $T$ in a language of size $\aleph_1$, if $T$ has uncountable atomic mod...

  16. Lattice Ising model in a field: E8 scattering theory

    NARCIS (Netherlands)

    Bazhanov, V.V.; Nienhuis, B.; Warnaar, S.O.

    1994-01-01

    Zamolodchikov found an integrable field theory related to the Lie algebra E8, which describes the scaling limit of the Ising model in a magnetic field. He conjectured that there also exist solvable lattice models based on E8 in the universality class of the Ising model in a field. The dilute A3

  17. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  18. Bianchi class A models in Sàez-Ballester's theory

    Science.gov (United States)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.

  19. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  20. Transformation of non-tumor host cells during tumor progression: theories and evidence.

    Science.gov (United States)

    García-Olmo, Dolores C; Picazo, María G; García-Olmo, Damián

    2012-06-01

    Most cancer deaths are due to the development of metastases and this phenomenon is still a hard challenge for researchers. A number of theories have tried to unravel the metastatic machinery, but definitive results that link the evidence with conventional concepts of metastatic disease remain to be reported. Considerable evidence suggests interactions between tumor cells and host cells that might be essential for tumor progression and metastasis. Most such evidence is suggestive of fusion phenomena, but some suggest the transfer of cell-free DNA (cfDNA). Such evidence is often ignored or overlooked in the assessment and management of malignancy. In this article, we review the available evidence for the importance of cell fusion and cfDNA in metastasis, and we present some preliminary data that support the hypothesis that tumor progression might be based not only on the division of tumor cells but also on the transformation of normal cells. Future success in the search for cancer therapies will surely require advances in our knowledge of the pathways of tumor invasion by unexpected mechanisms. Thus, no well supported evidence for roles of cell-free nucleic acids and fusion of cells or of cells with vesicles should be ignored.

  1. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  2. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Doring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  3. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.

  4. Baldrige Theory into Practice: A Generic Model

    Science.gov (United States)

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer-centric system. The Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull-based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  5. Faithworthy Collaborative Spectrum Sensing Based on Credibility and Evidence Theory for Cognitive Radio Networks

    Directory of Open Access Journals (Sweden)

    Fang Ye

    2017-03-01

    Full Text Available Cognitive radio (CR) has become a tempting technology that achieves significant improvement in spectrum utilization. To resolve the hidden terminal problem, collaborative spectrum sensing (CSS), which profits from spatial diversity, has been studied intensively in recent years. As CSS is vulnerable to attacks launched by malicious secondary users (SUs), several CSS security schemes based on the Dempster–Shafer theory of evidence have been proposed. Nevertheless, the available works focus only on the real-time differences among SUs, such as differences in similarity degree or SNR, to evaluate the credibility of each SU. Since the real-time difference is unilateral and sometimes inexact, the statistical information contained in SUs' historical behaviors should not be ignored. In this paper, we propose a robust CSS method based on evidence theory and credibility calculation. It is executed in four consecutive procedures: basic probability assignment (BPA), holistic credibility calculation, selection and amelioration of BPA, and evidence combination via the Dempster–Shafer rule. Our scheme evaluates the holistic credibility of SUs from both the real-time difference and the statistical sensing behavior of SUs. Moreover, considering that the transmitted data increase as the number of SUs grows, we introduce the projection approximation approach to adapt the evidence theory to the binary hypothesis test in CSS; on this account, both the data volume to be transmitted and the workload at the data fusion center are reduced. Malicious SUs can be distinguished from genuine ones based on their historical sensing behaviors, and the SUs' real-time difference can be retained to acquire a superior current performance. Extensive simulation results show that the proposed method outperforms the existing ones under different attack modes and different numbers of malicious SUs.
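The evidence-combination step this scheme relies on is the classical Dempster–Shafer rule. A minimal sketch for the binary frame {H0, H1} used in spectrum sensing (channel idle vs. occupied); the mass values below are hypothetical, not taken from the paper:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BPAs over a common frame.
    Each BPA maps frozenset focal elements to masses summing to 1."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass falling on the empty set
    # Normalize away the conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

H0, H1 = frozenset({"H0"}), frozenset({"H1"})
THETA = H0 | H1                              # total ignorance
# Hypothetical BPAs reported by two secondary users.
m1 = {H0: 0.6, H1: 0.3, THETA: 0.1}
m2 = {H0: 0.5, H1: 0.4, THETA: 0.1}
fused = dempster_combine(m1, m2)
```

After fusion, the fusion center would decide for the hypothesis with the larger combined mass (here H0); credibility weighting, as in the paper, would discount each SU's BPA before this step.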

  6. Optimal transportation networks models and theory

    CERN Document Server

    Bernot, Marc; Morel, Jean-Michel

    2009-01-01

    The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.

  7. System Identification Theory Approach to Cohesive Sediment Transport Modelling

    OpenAIRE

    CHEN, HUIXIN

    1997-01-01

    Two aspects of modelling sediment transport are investigated. One is univariate time series modelling of the current velocity dynamics. The other is multivariate time series modelling of the suspended sediment concentration dynamics. Cohesive sediment dynamics and numerical sediment transport models are reviewed and investigated. System identification theory and time series analysis methods are developed and applied to set up the time series model for current velocity a...

  8. Testing the theory of emissions trading : Experimental evidence on alternative mechanisms for global carbon trading

    NARCIS (Netherlands)

    Klaassen, Ger; Nentjes, Andries; Smith, Mark

    2005-01-01

    Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon-emitting industrialized regions. Two experiments use auctions, the first a

  9. Interpreting drinking water quality in the distribution system using Dempster-Shafer theory of evidence.

    Science.gov (United States)

    Sadiq, Rehan; Rodriguez, Manuel J

    2005-04-01

    Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and interpretation of water quality, methodologies for the aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the obtained results, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.

  10. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  11. Twinlike models in scalar field theories

    International Nuclear Information System (INIS)

    Bazeia, D.; Losano, L.; Dantas, J. D.; Gomes, A. R.; Menezes, R.

    2011-01-01

    This work deals with the presence of defect structures in models described by a real scalar field in a diversity of scenarios. The defect structures that we consider are static solutions of the equations of motion that depend on a single spatial dimension. We search for different models, which support the same defect solution, with the very same energy density. We work in flat spacetime, where we introduce and investigate a new class of models. We also work in curved spacetime, within the braneworld context, with a single extra dimension of infinite extent, and there we show how the brane is formed from the static field configuration.

  12. Solid mechanics theory, modeling, and problems

    CERN Document Server

    Bertram, Albrecht

    2015-01-01

    This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.

  13. Transforming Patient-Centered Care: Development of the Evidence Informed Decision Making through Engagement Model.

    Science.gov (United States)

    Moore, Jennifer E; Titler, Marita G; Kane Low, Lisa; Dalton, Vanessa K; Sampselle, Carolyn M

    2015-01-01

    In response to the passage of the Affordable Care Act in the United States, clinicians and researchers are critically evaluating methods to engage patients in implementing evidence-based care to improve health outcomes. However, most models on implementation only target clinicians or health systems as the adopters of evidence. Patients are largely ignored in these models. A new implementation model that captures the complex but important role of patients in the uptake of evidence may be a critical missing link. Through a process of theory evaluation and development, we explore patient-centered concepts (patient activation and shared decision making) within an implementation model by mapping qualitative data from an elective induction of labor study to assess the model's ability to capture these key concepts. The process demonstrated that a new, patient-centered model for implementation is needed. In response, the Evidence Informed Decision Making through Engagement Model is presented. We conclude that, by fully integrating women into an implementation model, outcomes that are important to both the clinician and patient will improve. In the interest of providing evidence-based care to women during pregnancy and childbirth, it is essential that care is patient centered. The inclusion of concepts discussed in this article has the potential to extend beyond maternity care and influence other clinical areas. Utilizing the newly developed Evidence Informed Decision Making through Engagement Model provides a framework for utilizing evidence and translating it into practice while acknowledging the important role that women have in the process. Published by Elsevier Inc.

  14. Modeling workplace bullying using catastrophe theory.

    Science.gov (United States)

    Escartin, J; Ceja, L; Navarro, J; Zapf, D

    2013-10-01

    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
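The model comparison described here hinges on information criteria such as the AIC. A sketch of the linear side of such a comparison, fitting ordinary least squares and computing the Gaussian AIC from the residual sum of squares (the data points are made up for illustration):

```python
import math

def ols_aic(x, y):
    """Fit y = b0 + b1*x by least squares and return (b0, b1, AIC),
    using Gaussian AIC = n*ln(RSS/n) + 2k with k = 3 (b0, b1, sigma^2).
    The small-sample AICc would add 2k(k+1)/(n-k-1)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    rss = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    k = 3
    return b0, b1, n * math.log(rss / n) + 2 * k

# Made-up data roughly on the line y = 2x + 1; the competing model
# (e.g. a cusp surface) would be scored the same way, and the fit
# with the lower AIC/AICc is preferred.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 6.8, 9.1]
b0, b1, aic = ols_aic(x, y)
```

Because AIC penalizes parameters, the more flexible cusp model only wins if its likelihood gain outweighs its extra parameters, which is the comparison the study reports.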

  15. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus an approach to facility location theory through game theoretical tools highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding the facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  16. An Ar threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    Science.gov (United States)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-08-01

    We explore the connections between three classes of theories: Ar quiver matrix models, d=2 conformal Ar Toda field theories, and d=4 N=2 supersymmetric conformal Ar quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  17. An Ar threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    International Nuclear Information System (INIS)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-01-01

    We explore the connections between three classes of theories: Ar quiver matrix models, d=2 conformal Ar Toda field theories, and d=4 N=2 supersymmetric conformal Ar quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  18. Density functional theory and multiscale materials modeling

    Indian Academy of Sciences (India)

    One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.

  19. The oxidative stress theory of disease: levels of evidence and epistemological aspects.

    Science.gov (United States)

    Ghezzi, Pietro; Jaquet, Vincent; Marcucci, Fabrizio; Schmidt, Harald H H W

    2017-06-01

    The theory that oxidative stress (OS) is at the root of several diseases is extremely popular. However, so far, no antioxidant has been recommended or offered by healthcare systems, nor has any been approved as a therapy by regulatory agencies that base their decisions on evidence-based medicine. This is simply because, so far, despite many preclinical and clinical studies indicating a beneficial effect of antioxidants in many disease conditions, randomised clinical trials have failed to provide the evidence of efficacy required for drug approval. In this review, we discuss the levels of evidence required to claim causality in preclinical research on OS, the weakness of the oversimplification associated with the OS theory of disease, and the importance of the narrative in its popularity. Finally, from a more translational perspective, we discuss the reasons why antioxidants acting by scavenging ROS might not only prevent their detrimental effects but also interfere with essential signalling roles. We propose that ROS have a complex metabolism and are generated by different enzymes at diverse sites and at different times. Aggregating this plurality of systems into a single theory of disease may not be the best way to develop new drugs, and future research may need to focus on specific oxygen-toxifying pathways rather than on non-specific ROS scavengers. Finally, similarly to what is nowadays required for clinical trials, we recommend making unpublished data available in repositories (open data), as this will allow big data approaches or meta-analyses, without the drawbacks of publication bias. This article is part of a themed section on Redox Biology and Oxidative Stress in Health and Disease. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v174.12/issuetoc. © 2016 The British Pharmacological Society.

  20. Qualitative model-based diagnosis using possibility theory

    Science.gov (United States)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  1. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  2. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  3. A Reflection on Research, Theory, Evidence-based Practice, and Quality Improvement

    Directory of Open Access Journals (Sweden)

    Eesa Mohammadi

    2016-04-01

    While each process is associated with its unique characteristics, overlaps are likely to appear between each pair of processes. For instance, in the EBP process, if one discovers (theory) that evidence is inadequate to implement a certain intervention, it highlights the need for research on that specific subject. Similarly, QI may lead to the identification of new questions, which could be used for research purposes. All the discussed processes, as well as their scientific and professional dimensions, are essential to nursing disciplines in healthcare systems.

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  6. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...

  7. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...

  8. On Theories and Models in Fuzzy Predicate Logics

    Czech Academy of Sciences Publication Activity Database

    Hájek, Petr; Cintula, Petr

    2006-01-01

    Roč. 71, č. 3 (2006), s. 863-880 ISSN 0022-4812 R&D Projects: GA AV ČR IAA100300503 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * model theory * witnessed models * conservative extension * completeness theorem Subject RIV: BA - General Mathematics Impact factor: 0.664, year: 2006

  9. Integrable lattice models, graphs and modular invariant conformal field theories

    International Nuclear Information System (INIS)

    Francesco, P.

    1992-01-01

    This paper reviews the construction of integrable height models attached to graphs in connection with compact Lie groups. The continuum limit of these models yields conformally invariant field theories. A direct relation between graphs and (Kac-Moody or coset) modular invariants is proposed

  10. Theory and Practice: An Integrative Model Linking Class and Field

    Science.gov (United States)

    Lesser, Joan Granucci; Cooper, Marlene

    2006-01-01

    Social work has evolved over the years, taking on the challenges of the times. The profession now espouses a breadth of theoretical approaches and treatment modalities. We have developed a model to help graduate social work students master the skill of integrating theory and social work practice. The Integrative Model has five components: (1) The…

  11. Turbulent Boundary Layers - Experiments, Theory and Modelling

    Science.gov (United States)

    1980-01-01

    1979, "Calcul des transferts thermiques entre film chaud et substrat par un modèle à deux dimensions", Int. J. Heat Mass Transfer, p. 111-119 ... relates the surface heat transfer to the surface shear; here, corrections are compulsory because the wall shear-stress fluctuations are large (the r.m.s. ... technique is the mass transfer analogue of the constant temperature anemometer when the chemical reaction at the electrode embedded in the wall is

  12. Modeling Workplace Bullying Behaviors Using Catastrophe Theory

    OpenAIRE

    Escartín Solanelles, Jordi; Ceja, Lucía; Navarro Cid, José; Zapf, D.

    2013-01-01

    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the prese...

  13. Clinical outcome measurement: Models, theory, psychometrics and practice.

    Science.gov (United States)

    McClimans, Leah; Browne, John; Cano, Stefan

    In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term 'psychometrics'. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers. Copyright © 2017. Published by Elsevier Ltd.

  14. Contemporary Cognitive Behavior Therapy: A Review of Theory, History, and Evidence.

    Science.gov (United States)

    Thoma, Nathan; Pilecki, Brian; McKay, Dean

    2015-09-01

    Cognitive behavior therapy (CBT) has come to be a widely practiced psychotherapy throughout the world. The present article reviews theory, history, and evidence for CBT. It is meant as an effort to summarize the forms and scope of CBT to date for the uninitiated. Elements of CBT such as cognitive therapy, behavior therapy, and so-called "third wave" CBT, such as dialectical behavior therapy (DBT) and acceptance and commitment therapy (ACT) are covered. The evidence for the efficacy of CBT for various disorders is reviewed, including depression, anxiety disorders, personality disorders, eating disorders, substance abuse, schizophrenia, chronic pain, insomnia, and child/adolescent disorders. The relative efficacy of medication and CBT, or their combination, is also briefly considered. Future directions for research and treatment development are proposed.

  15. The Role of Adolescent Development in Social Networking Site Use: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Drew P. Cingel

    2014-03-01

    Full Text Available Using survey data collected from 260 children, adolescents, and young adults between the ages of 9 and 26, this paper offers evidence for a relationship between social networking site use and Imaginary Audience, a developmental variable in which adolescents believe others are thinking about them at all times. Specifically, after controlling for a number of variables, results indicate a significant, positive relationship between social networking site use and Imaginary Audience ideation. Additionally, results indicate a positive relationship between Imaginary Audience ideation and Facebook customization practices. Together, these findings provide evidence, based on Vygotskian developmental theory, for a general consideration of the role that currently available tools, in this case social networking sites, can have on development. Thus, findings implicate both the role of development on social networking site use, as well as the role of social networking site use on development. Overall, these findings have important implications for the study of media and human development, which are discussed in detail.

  16. A review of the evidence regarding associations between attachment theory and experimentally induced pain.

    Science.gov (United States)

    Meredith, Pamela Joy

    2013-04-01

    Theoretical and empirical evidence suggests that adult attachment and pain-related variables are predictably and consistently linked, and that understanding these links may guide pain intervention and prevention efforts. In general, insecure attachment has been portrayed as a risk factor, and secure attachment as a protective factor, for people with chronic pain conditions. In an effort to better understand the relationships among attachment and pain variables, these links have been investigated in pain-free samples using induced-pain techniques. The present paper reviews the available research linking adult attachment and laboratory-induced pain. While the diverse nature of the studies precludes definitive conclusions, together these papers offer support for associations between insecure attachment and a more negative pain experience. The evidence presented in this review highlights areas for further empirical attention, as well as providing some guidance for clinicians who may wish to employ preventive approaches and other interventions informed by attachment theory.

  17. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.

    2011-01-01

In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model. The linear stability condition of the new model is obtained by using linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that, by adjusting the coefficient of the optimal velocity difference, the new model can avoid the disadvantage of the negative velocity that occurs at small values of the sensitivity coefficient λ in the full velocity difference model, so that collisions disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The optimal velocity difference term avoids the disadvantage of negative velocity.

  18. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ingale, S. V.; Datta, D.

    2010-01-01

The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure, or dose, to members of the public. Assessment of risk is routed through this dose computation, which depends on the basic dose assessment model and the exposure pathways. One such pathway is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed through the belief and plausibility fuzzy measures.
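The belief and plausibility bounds mentioned in this abstract come from Dempster-Shafer evidence theory, and are simple to compute once expert evidence is encoded as a basic mass assignment over sets of outcomes. The sketch below uses an invented three-interval discretization of a dose parameter; the masses are hypothetical and not values from the paper.

```python
def belief(masses, A):
    # Bel(A): total mass of focal elements entirely contained in A.
    return sum(m for B, m in masses.items() if set(B) <= set(A))

def plausibility(masses, A):
    # Pl(A): total mass of focal elements that intersect A.
    return sum(m for B, m in masses.items() if set(B) & set(A))

# Hypothetical expert evidence about a dose parameter falling in
# discretized ranges low / medium / high:
masses = {
    ("low",): 0.2,
    ("low", "medium"): 0.5,          # imprecise evidence: mass on a set
    ("low", "medium", "high"): 0.3,  # total-ignorance component
}
A = ("low", "medium")
# Bel(A) <= P(A) <= Pl(A): the pair bounds the imprecisely known risk.
print(belief(masses, A), plausibility(masses, A))
```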

  19. Genetic model compensation: Theory and applications

    Science.gov (United States)

    Cruickshank, David Raymond

    1998-12-01

    The adaptive filtering algorithm known as Genetic Model Compensation (GMC) was originally presented in the author's Master's Thesis. The current work extends this earlier work. GMC uses a genetic algorithm to optimize filter process noise parameters in parallel with the estimation of the state and based only on the observational information available to the filter. The original stochastic state model underlying GMC was inherited from the antecedent, non-adaptive Dynamic Model Compensation (DMC) algorithm. The current work develops the stochastic state model from a linear system viewpoint, avoiding the simplifications and approximations of the earlier development, and establishes Riemann sums as unbiased estimators of the stochastic integrals which describe the evolution of the random state components. These are significant developments which provide GMC with a solid theoretical foundation. Orbit determination is the area of application in this work, and two types of problems are studied: real-time autonomous filtering using absolute GPS measurements and precise post-processed filtering using differential GPS measurements. The first type is studied in a satellite navigation simulation in which pseudorange and pseudorange rate measurements are processed by an Extended Kalman Filter which incorporates both DMC and GMC. Both estimators are initialized by a geometric point solution algorithm. Using measurements corrupted by simulated Selective Availability errors, GMC reduces mean RSS position error by 6.4 percent, reduces mean clock bias error by 46 percent, and displays a marked improvement in covariance consistency relative to DMC. To study the second type of problem, GMC is integrated with NASA Jet Propulsion Laboratory's Gipsy/Oasis-II (GOA-II) precision orbit determination program creating an adaptive version of GOA-II's Reduced Dynamic Tracking (RDT) process noise formulation. 
When run as a sequential estimator with GPS measurements from the TOPEX satellite and

  20. Integrating Developmental Theory and Methodology: Using Derivatives to Articulate Change Theories, Models, and Inferences

    Science.gov (United States)

    Deboeck, Pascal R.; Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of "derivatives": the change of a construct with respect to the change in another construct.…

  1. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  2. Brief Strategic Family Therapy: Implementing evidence-based models in community settings

    OpenAIRE

    Szapocznik, José; Muir, Joan A.; Duff, Johnathan H.; Schwartz, Seth J.; Brown, C. Hendricks

    2013-01-01

    Reflecting a nearly 40-year collaborative partnership between clinical researchers and clinicians, the present article reviews the authors’ experience in developing, investigating, and implementing the Brief Strategic Family Therapy (BSFT) model. The first section of the article focuses on the theory, practice, and studies related to this evidence-based family therapy intervention targeting adolescent drug abuse and delinquency. The second section focuses on the implementation model created f...

  3. M-theory model-building and proton stability

    International Nuclear Information System (INIS)

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens

    1997-09-01

The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  4. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \\times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  5. Localization landscape theory of disorder in semiconductors I: Theory and modeling

    OpenAIRE

    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana

    2017-01-01

We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called the localization landscape. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densi...

  6. Theory to practice: the humanbecoming leading-following model.

    Science.gov (United States)

    Ursel, Karen L

    2015-01-01

Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful link between nursing theory and practice. The author perceived that with the implementation of Situation-Background-Assessment-Recommendation (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project that would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™, the predominant nursing communication tool. In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges, with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities in which a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.

  7. An Abstraction Theory for Qualitative Models of Biological Systems

    Directory of Open Access Journals (Sweden)

    Richard Banks

    2010-10-01

Full Text Available Multi-valued network (MVN) models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and, in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.

  8. Developing Theory to Guide Building Practitioners' Capacity to Implement Evidence-Based Interventions.

    Science.gov (United States)

    Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C; Escoffery, Cam T; Herrmann, Alison K; Thatcher, Esther; Hartman, Marieke A; Fernandez, Maria E

    2017-02-01

    Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.

  9. Further evidence of altered default network connectivity and association with theory of mind ability in schizophrenia.

    Science.gov (United States)

    Mothersill, Omar; Tangney, Noreen; Morris, Derek W; McCarthy, Hazel; Frodl, Thomas; Gill, Michael; Corvin, Aiden; Donohoe, Gary

    2017-06-01

Resting-state functional magnetic resonance imaging (rs-fMRI) has repeatedly shown evidence of altered functional connectivity of large-scale networks in schizophrenia. The relationship between these connectivity changes and behaviour (e.g. symptoms, neuropsychological performance) remains unclear. Functional connectivity in 27 patients with schizophrenia or schizoaffective disorder, and 25 age- and gender-matched healthy controls, was examined using rs-fMRI. Based on seed regions from previous studies, we examined functional connectivity of the default, cognitive control, affective and attention networks. Effects of symptom severity and theory of mind performance on functional connectivity were also examined. Patients showed increased connectivity between key nodes of the default network, including the precuneus and medial prefrontal cortex, compared to controls, as well as increased connectivity between default regions within the patient group. This study adds to increasing evidence of default hyper-connectivity in schizophrenia spectrum patients and reveals an association between altered default connectivity and positive symptom severity. As a novel finding, this study also shows that default connectivity is correlated with and predictive of theory of mind performance. Extending these findings by examining the effects of emerging social cognition treatments on both default connectivity and theory of mind performance is now an important goal for research. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should better start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it also contains some unphysical terms linear in the gauge field. Advantageously, we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as the gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations, which show the expected behavior. The next step is to show renormalizability of the model, where some hints towards this direction will also be given. (author)

  11. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
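As a concrete illustration of the Shannon entropy this abstract leads with, the sketch below estimates the entropy of a symbol stream from its empirical frequencies. It is a generic textbook illustration, not code from the article.

```python
import math
from collections import Counter

def entropy(symbols):
    # Shannon entropy H(X) = -sum_x p(x) * log2 p(x),
    # estimated from the empirical symbol frequencies of a sample.
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A deterministic source ("aaaa") carries 0 bits per symbol.
print(entropy("abab"))      # two equiprobable symbols: 1 bit per symbol
print(entropy("abcdabcd"))  # four equiprobable symbols: 2 bits per symbol
```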

  12. The Quantity Theory of Money and Its Long Run Implications: Empirical Evidence from Nigeria

    OpenAIRE

    Alimi, R. Santos

    2012-01-01

The Quantity Theory of Money (QTM) is one of the popular classical macroeconomic models that explains the relationship between the quantity of money in an economy and the level of prices of goods and services. This study investigates this relationship for the Nigerian economy over the period 1960 to 2009. To check the stationarity properties, we employed the Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests and found all the concerned variables are stationary only in the first differenced ...
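The QTM rests on the equation of exchange, MV = PQ; solving for the price level P makes the long-run prediction the study tests explicit. The numbers below are hypothetical, purely for illustration.

```python
def price_level(M, V, Q):
    # Equation of exchange: M * V = P * Q  =>  P = M * V / Q
    return M * V / Q

# Hypothetical numbers: doubling the money stock M with velocity V and
# real output Q held fixed doubles the price level P, the long-run
# prediction of the quantity theory.
P0 = price_level(M=100.0, V=2.0, Q=50.0)
P1 = price_level(M=200.0, V=2.0, Q=50.0)
print(P0, P1)  # prints: 4.0 8.0
```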

  13. Consistent constraints on the Standard Model Effective Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Berthier, Laure; Trott, Michael [Niels Bohr International Academy, University of Copenhagen,Blegdamsvej 17, DK-2100 Copenhagen (Denmark)

    2016-02-10

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEPI and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include as an illustrative example.

  14. Effective potential in Lorentz-breaking field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)

    2017-12-15

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  15. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  16. Evidence-based selection of theories for designing behaviour change interventions: using methods based on theoretical construct domains to understand clinicians' blood transfusion behaviour.

    Science.gov (United States)

    Francis, Jill J; Stockton, Charlotte; Eccles, Martin P; Johnston, Marie; Cuthbertson, Brian H; Grimshaw, Jeremy M; Hyde, Chris; Tinmouth, Alan; Stanworth, Simon J

    2009-11-01

Many theories of behaviour are potentially relevant to predictive and intervention studies, but most studies investigate a narrow range of theories. Michie et al. (2005) agreed on 12 'theoretical domains' from 33 theories that explain behaviour change. They developed a 'Theoretical Domains Interview' (TDI) for identifying relevant domains for specific clinical behaviours, but the framework has not been used for selecting theories for predictive studies. It was used here to investigate clinicians' transfusion behaviour in intensive care units (ICUs). Evidence suggests that red blood cell transfusion could be reduced for some patients without reducing quality of care. (1) To identify the domains relevant to transfusion practice in ICUs and neonatal intensive care units (NICUs), using the TDI. (2) To use the identified domains to select appropriate theories for a study predicting transfusion behaviour. An adapted TDI about managing a patient with borderline haemoglobin by watching and waiting instead of transfusing red blood cells was used to conduct semi-structured, one-to-one interviews with 18 intensive care consultants and neonatologists across the UK. Relevant theoretical domains were: knowledge, beliefs about capabilities, beliefs about consequences, social influences, and behavioural regulation. Further analysis at the construct level resulted in the selection of seven theoretical approaches relevant to this context: the Knowledge-Attitude-Behaviour Model, Theory of Planned Behaviour, Social Cognitive Theory, Operant Learning Theory, Control Theory, Normative Model of Work Team Effectiveness, and Action Planning Approaches. This study illustrated the use of the TDI to identify relevant domains in a complex area of inpatient care. This approach is potentially valuable for selecting theories relevant to predictive studies and resulted in greater breadth of potential explanations than would be achieved if a single theoretical model had been adopted.

  17. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
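The signal detection side of the SD-IRT models in this record reduces, in the equal-variance Gaussian case, to the familiar memory-discrimination and response-bias statistics. The sketch below computes those from raw recognition counts; it is the standard textbook formulation with a log-linear correction, not the authors' item-level model.

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, fas, crs):
    # Equal-variance Gaussian signal detection:
    #   d' = z(hit rate) - z(false-alarm rate)   (discrimination)
    #   c  = -0.5 * (z(hit rate) + z(false-alarm rate))   (response bias)
    # The log-linear correction avoids infinite z at rates of 0 or 1.
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (fas + 0.5) / (fas + crs + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far), -0.5 * (z(hr) + z(far))

# Symmetric performance on 50 old and 50 new items: c is ~0, d' is ~1.6.
d, c = dprime_criterion(hits=40, misses=10, fas=10, crs=40)
```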

  18. Social cognition in anorexia nervosa: evidence of preserved theory of mind and impaired emotional functioning.

    Directory of Open Access Journals (Sweden)

    Mauro Adenzato

    Full Text Available BACKGROUND: The findings of the few studies that have to date investigated the way in which individuals with Anorexia Nervosa (AN navigate their social environment are somewhat contradictory. We undertook this study to shed new light on the social-cognitive profile of patients with AN, analysing Theory of Mind and emotional functioning. Starting from previous evidence on the role of the amygdala in the neurobiology of AN and in the social cognition, we hypothesise preserved Theory of Mind and impaired emotional functioning in patients with AN. METHODOLOGY: Thirty women diagnosed with AN and thirty-two women matched for education and age were involved in the study. Theory of Mind and emotional functioning were assessed with a set of validated experimental tasks. A measure of perceived social support was also used to test the correlations between this dimension and the social-cognitive profile of AN patients. PRINCIPAL FINDINGS: The performance of patients with AN is significantly worse than that of healthy controls on tasks assessing emotional functioning, whereas patients' performance is comparable to that of healthy controls on the Theory of Mind task. Correlation analyses showed no relationship between scores on any of the social-cognition tasks and either age of onset or duration of illness. A correlation between social support and emotional functioning was found. This latter result seems to suggest a potential role of social support in the treatment and recovery of AN. CONCLUSIONS: The pattern of results followed the experimental hypothesis. They may be useful to help us better understand the social-cognitive profile of patients with AN and to contribute to the development of effective interventions based on the ways in which patients with AN actually perceive their social environment.

  19. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  20. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

The goal of this lecture is to present a unifying view of the exactly soluble models. There are several reasons arguing in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering, and some 1+1 dimensional models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relation (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPRs is given, with promising generalizations to FCRs.

  1. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and in computer and communication systems. • A chapter on ...

  2. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.

  3. Comparison of potential models through heavy quark effective theory

    International Nuclear Information System (INIS)

    Amundson, J.F.

    1995-01-01

    I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions

  4. Delivering patient decision aids on the Internet: definitions, theories, current evidence, and emerging research areas

    Science.gov (United States)

    2013-01-01

    Background In 2005, the International Patient Decision Aids Standards Collaboration identified twelve quality dimensions to guide assessment of patient decision aids. One dimension—the delivery of patient decision aids on the Internet—is relevant when the Internet is used to provide some or all components of a patient decision aid. Building on the original background chapter, this paper provides an updated definition for this dimension, outlines a theoretical rationale, describes current evidence, and discusses emerging research areas. Methods An international, multidisciplinary panel of authors examined the relevant theoretical literature and empirical evidence through 2012. Results The updated definition distinguishes Internet-delivery of patient decision aids from online health information and clinical practice guidelines. Theories in cognitive psychology, decision psychology, communication, and education support the value of Internet features for providing interactive information and deliberative support. Dissemination and implementation theories support Internet-delivery for providing the right information (rapidly updated), to the right person (tailored), at the right time (the appropriate point in the decision making process). Additional efforts are needed to integrate the theoretical rationale and empirical evidence from health technology perspectives, such as consumer health informatics, user experience design, and human-computer interaction. Despite Internet usage ranging from 74% to 85% in developed countries and 80% of users searching for health information, it is unknown how many individuals specifically seek patient decision aids on the Internet. Among the 86 randomized controlled trials in the 2011 Cochrane Collaboration’s review of patient decision aids, only four studies focused on Internet-delivery. Given the limited number of published studies, this paper particularly focused on identifying gaps in the empirical evidence base and

  5. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    Science.gov (United States)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that the wavelet-based extreme value theory increases predictive performance of financial forecasting according to number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
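    Once a threshold has been chosen, the second stage described above reduces to the standard peaks-over-threshold calculation. A minimal Python sketch, illustrative only: it uses a fixed empirical quantile as the threshold rather than the paper's wavelet-based choice, and synthetic heavy-tailed data in place of ISE/BUX returns:

```python
import numpy as np
from scipy.stats import genpareto

def pot_var(losses, threshold, q):
    """Peaks-over-threshold VaR: fit a generalized Pareto distribution
    to the exceedances over `threshold` and invert the tail estimator
    at confidence level q."""
    exceedances = losses[losses > threshold] - threshold
    n, n_u = len(losses), len(exceedances)
    # Fit the GPD with location fixed at 0 (exceedances start at the threshold)
    xi, _, beta = genpareto.fit(exceedances, floc=0)
    # Tail estimator: VaR_q = u + (beta/xi) * (((n/n_u) * (1 - q))**(-xi) - 1)
    return threshold + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000)   # heavy-tailed synthetic daily losses
u = np.quantile(losses, 0.95)              # fixed 95% threshold (not wavelet-based)
var_99 = pot_var(losses, u, 0.99)
print(round(float(var_99), 3))
```

The wavelet contribution of the paper enters precisely where `u` is fixed here: the threshold separating the body from the tail is selected from the wavelet decomposition instead of a fixed quantile.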

  6. Poisson-Boltzmann theory of charged colloids: limits of the cell model for salty suspensions

    International Nuclear Information System (INIS)

    Denton, A R

    2010-01-01

    Thermodynamic properties of charge-stabilized colloidal suspensions and polyelectrolyte solutions are commonly modelled by implementing the mean-field Poisson-Boltzmann (PB) theory within a cell model. This approach models a bulk system by a single macroion, together with counterions and salt ions, confined to a symmetrically shaped, electroneutral cell. While easing numerical solution of the nonlinear PB equation, the cell model neglects microion-induced interactions and correlations between macroions, precluding modelling of macroion ordering phenomena. An alternative approach, which avoids the artificial constraints of cell geometry, exploits the mapping of a macroion-microion mixture onto a one-component model of pseudo-macroions governed by effective interparticle interactions. In practice, effective-interaction models are usually based on linear-screening approximations, which can accurately describe strong nonlinear screening only by incorporating an effective (renormalized) macroion charge. Combining charge renormalization and linearized PB theories, in both the cell model and an effective-interaction (cell-free) model, we compute osmotic pressures of highly charged colloids and monovalent microions, in Donnan equilibrium with a salt reservoir, over a range of concentrations. By comparing predictions with primitive model simulation data for salt-free suspensions, and with predictions from nonlinear PB theory for salty suspensions, we chart the limits of both the cell model and linear-screening approximations in modelling bulk thermodynamic properties. Up to moderately strong electrostatic couplings, the cell model proves accurate for predicting osmotic pressures of deionized (counterion-dominated) suspensions. With increasing salt concentration, however, the relative contribution of macroion interactions to the osmotic pressure grows, leading predictions from the cell and effective-interaction models to deviate. No evidence is found for a liquid
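    As a reference point for the discussion above, the boundary-value problem solved in the spherical cell model is the standard textbook form of the nonlinear PB equation (written here in generic reduced units, not in the paper's own notation):

```latex
\begin{aligned}
\nabla^{2}\phi(r) &= \kappa^{2}\,\sinh\phi(r), \qquad a \le r \le R,\\
\phi'(a) &= -\,\frac{Z\lambda_{B}}{a^{2}}, \qquad \phi'(R) = 0,
\end{aligned}
```

where φ = eψ/k_BT is the reduced electrostatic potential, a the macroion radius, R the cell radius, Z the macroion valence, λ_B the Bjerrum length, and κ² = 8πλ_B n_s the screening constant set by the salt reservoir. The zero-field condition at r = R encodes electroneutrality of the cell; the linearized theories compared in the paper replace sinh φ by its expansion about a suitable average potential.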

  7. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  8. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  10. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of the magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of the three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.

  11. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  12. The early years of string theory: The dual resonance model

    International Nuclear Information System (INIS)

    Ramond, P.

    1987-10-01

    This paper reviews the past quantum mechanical history of the dual resonance model which is an early string theory. The content of this paper is listed as follows: historical review, the Veneziano amplitude, the operator formalism, the ghost story, and the string story

  13. Medical Specialty Decision Model: Utilizing Social Cognitive Career Theory

    Science.gov (United States)

    Gibson, Denise D.; Borges, Nicole J.

    2004-01-01

    Objectives: The purpose of this study was to develop a working model to explain medical specialty decision-making. Using Social Cognitive Career Theory, we examined personality, medical specialty preferences, job satisfaction, and expectations about specialty choice to create a conceptual framework to guide specialty choice decision-making.…

  14. Using Conceptual Change Theories to Model Position Concepts in Astronomy

    Science.gov (United States)

    Yang, Chih-Chiang; Hung, Jeng-Fung

    2012-01-01

    The roles of conceptual change and model building in science education are very important and have a profound and wide effect on teaching science. This study examines the change in children's position concepts after instruction, based on different conceptual change theories. Three classes were chosen and divided into three groups, including a…

  15. SIMP model at NNLO in chiral perturbation theory

    DEFF Research Database (Denmark)

    Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.

    2015-01-01

    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles...

  16. Using SAS PROC MCMC for Item Response Theory Models

    Science.gov (United States)

    Ames, Allison J.; Samonte, Kelli

    2015-01-01

    Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…

  17. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  18. An NCME Instructional Module on Polytomous Item Response Theory Models

    Science.gov (United States)

    Penfield, Randall David

    2014-01-01

    A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…
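    One widely used polytomous IRT model of the kind such a module surveys is Samejima's graded response model, in which category probabilities are obtained by differencing cumulative boundary curves. A small self-contained Python sketch (with made-up item parameters, not values from the module):

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Samejima's graded response model: boundary curves
    P(X >= k) = logistic(a * (theta - b_k)) for thresholds b_1 < ... < b_{K-1},
    differenced to give the probabilities of the K = len(b) + 1 ordered categories."""
    b = np.asarray(b, dtype=float)
    cum = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # P(X >= 1), ..., P(X >= K-1)
    upper = np.concatenate(([1.0], cum))           # prepend P(X >= 0) = 1
    lower = np.concatenate((cum, [0.0]))           # append  P(X >= K) = 0
    return upper - lower

# A 4-category item: discrimination a = 1.2, thresholds at -1.0, 0.0, 1.5
probs = grm_category_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.5])
print(np.round(probs, 3), float(probs.sum()))   # category probabilities sum to 1
```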

  19. Item Response Theory Models for Performance Decline during Testing

    Science.gov (United States)

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  20. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kortowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...

  2. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
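    The classical risk process referred to here can be sketched with a short Monte Carlo experiment: surplus grows at the premium rate and drops at Poisson claim arrivals. This is an illustrative finite-horizon approximation with invented parameters, not the thesis's method of computing analytical bounds:

```python
import numpy as np

def ruin_probability(u, c, lam, claim_mean, horizon, n_paths, seed=1):
    """Monte Carlo estimate of the probability of ruin before `horizon` for the
    classical risk process: initial surplus u, premium rate c, Poisson(lam)
    claim arrivals, exponentially distributed claim sizes with mean claim_mean."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            dt = rng.exponential(1.0 / lam)      # waiting time to next claim
            t += dt
            if t > horizon:
                break                             # path survived the horizon
            surplus += c * dt - rng.exponential(claim_mean)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

# 20% premium loading: c = (1 + 0.2) * lam * claim_mean
psi = ruin_probability(u=10.0, c=1.2, lam=1.0, claim_mean=1.0,
                       horizon=200.0, n_paths=2000)
print(round(psi, 3))
```

For exponential claims the classical infinite-horizon result ψ(u) = (1/(1+θ)) · exp(−θu/((1+θ)μ)), with loading θ and claim mean μ, gives a closed-form check on the simulation.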

  3. Two-dimensional models in statistical mechanics and field theory

    International Nuclear Information System (INIS)

    Koberle, R.

    1980-01-01

    Several features of two-dimensional models in statistical mechanics and field theory, such as lattice quantum chromodynamics, Z(N), Gross-Neveu and CP^(N-1), are discussed. The problems of confinement and dynamical mass generation are also analyzed. (L.C.) [pt

  4. A Proposed Model of Jazz Theory Knowledge Acquisition

    Science.gov (United States)

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement in the acquisition of jazz theory knowledge. Reliability estimates for the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  5. Dimensions of Genocide: The Circumplex Model Meets Violentization Theory

    Science.gov (United States)

    Winton, Mark A.

    2008-01-01

    The purpose of this study is to examine the use of Olson's (1995, 2000) family therapy based circumplex model and Athens' (1992, 1997, 2003) violentization theory in explaining genocide. The Rwandan genocide of 1994 is used as a case study. Published texts, including interviews with perpetrators, research reports, human rights reports, and court…

  6. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    Science.gov (United States)

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  7. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords : Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  8. Excellence in Physics Education Award: Modeling Theory for Physics Instruction

    Science.gov (United States)

    Hestenes, David

    2014-03-01

    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  9. Cognitive models of choice: comparing decision field theory to the proportional difference model.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; González-Vallejo, Claudia

    2009-07-01

    People often face preferential decisions under risk. To further our understanding of the cognitive processes underlying these preferential choices, two prominent cognitive models, decision field theory (DFT; Busemeyer & Townsend, 1993) and the proportional difference model (PD; González-Vallejo, 2002), were rigorously tested against each other. In two consecutive experiments, the participants repeatedly had to choose between monetary gambles. The first experiment provided the reference to estimate the models' free parameters. From these estimations, new gamble pairs were generated for the second experiment such that the two models made maximally divergent predictions. In the first experiment, both models explained the data equally well. However, in the second generalization experiment, the participants' choices were much closer to the predictions of DFT. The results indicate that the stochastic process assumed by DFT, in which evidence in favor of or against each option accumulates over time, described people's choice behavior better than the trade-offs between proportional differences assumed by PD. Copyright © 2009 Cognitive Science Society, Inc.
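    The stochastic accumulation process that distinguishes DFT from PD can be illustrated with a deliberately stripped-down sequential-sampling simulation. This is a hypothetical toy version under invented parameters: full DFT additionally models attention switching between attributes, decay of the preference state, and approach–avoidance gradients:

```python
import numpy as np

def dft_choice(mean_valence, sigma, threshold, rng, max_steps=10_000):
    """Accumulate noisy preference for gamble A over gamble B until the
    preference state crosses +threshold (choose A) or -threshold (choose B).
    Returns (choice, number of accumulation steps)."""
    p = 0.0
    for t in range(1, max_steps + 1):
        p += mean_valence + sigma * rng.standard_normal()
        if p >= threshold:
            return "A", t
        if p <= -threshold:
            return "B", t
    return "undecided", max_steps

rng = np.random.default_rng(42)
# Slight average advantage for A: evidence drifts upward at rate 0.05 per step
results = [dft_choice(0.05, 1.0, 10.0, rng) for _ in range(1000)]
share_a = sum(c == "A" for c, _ in results) / len(results)
print(round(share_a, 3))   # positive drift -> A is chosen more often than B
```

The key contrast with PD is visible in the return value: DFT predicts not just which option is chosen but also a response-time distribution (the step counts), because evidence accumulates over time rather than being compared once.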

  10. Measuring Convergence using Dynamic Equilibrium Models: Evidence from Chinese Provinces

    DEFF Research Database (Denmark)

    Pan, Lei; Posch, Olaf; van der Wel, Michel

    We propose a model to study economic convergence in the tradition of neoclassical growth theory. We employ a novel stochastic set-up of the Solow (1956) model with shocks to both capital and labor. Our novel approach identifies the speed of convergence directly from estimating the parameters which ...

  11. A computer model for a theory of evolution.

    Science.gov (United States)

    Bocci, Cristiano; Freguglia, Paolo; Rogora, Enrico

    2010-01-01

    Computer models and computer simulations are crucial for understanding complex phenomena because they compel the explicit enumeration of all variables and the exact specification of all relations between them. In this paper we discuss a computer model for a phenotypical theory of evolution which, in our opinion, is well suited to simulate the complex dependence of speciation on both internal and external factors, through their influences on the fertility factor. Some of these dependences are investigated through simulations.

  12. Perturbation theory around the Wess-Zumino-Witten model

    International Nuclear Information System (INIS)

    Hasseln, H. v.

    1991-05-01

    We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW model is shown to have the same β-function (at least to order g^2) as the fermionic theory with a four-fermion interaction. (orig.) [de

  13. Evidence for the multiverse in the standard model and beyond

    Science.gov (United States)

    Hall, Lawrence J.; Nomura, Yasunori

    2008-08-01

    In any theory it is unnatural if the observed values of parameters lie very close to special values that determine the existence of complex structures necessary for observers. A naturalness probability P is introduced to numerically evaluate the degree of unnaturalness. If P is very small in all known theories, corresponding to a high degree of fine-tuning, then there is an observer naturalness problem. In addition to the well-known case of the cosmological constant, we argue that nuclear stability and electroweak symmetry breaking represent significant observer naturalness problems. The naturalness probability associated with nuclear stability depends on the theory of flavor, but for all known theories is conservatively estimated as P_nuc ≲ 10^-3–10^-2, and for simple theories of electroweak symmetry breaking P_EWSB ≲ 10^-2–10^-1. This pattern of unnaturalness in three different arenas, cosmology, nuclear physics, and electroweak symmetry breaking, provides evidence for the multiverse, since each problem may be easily solved by environmental selection. In the nuclear case the problem is largely solved even if the multiverse distribution for the relevant parameters is relatively flat. With somewhat strongly varying distributions, it is possible to understand both the close proximity to neutron stability and the values of m_e and m_d − m_u in terms of the electromagnetic mass difference between the proton and neutron, δ_EM ≃ 1 ± 0.5 MeV. It is reasonable that multiverse distributions are strong functions of Lagrangian parameters, since they depend not only on the landscape of vacua, but also on the population mechanism, “integrating out” other parameters, and on a density of observers factor. In any theory with mass scale M that is the origin of electroweak symmetry breaking, strongly varying multiverse distributions typically lead either to a little hierarchy v/M ≈ 10^-2–10^-1, or to a large hierarchy v ≪ M. In certain multiverses, where electroweak symmetry breaking

  14. Evidence for the multiple hits genetic theory for inherited language impairment: a case study

    Directory of Open Access Journals (Sweden)

    Tracy M Centanni

    2015-08-01

    Communication disorders have complex genetic origins, with constellations of relevant gene markers that vary across individuals. Some genetic variants are present in healthy individuals as well as in those affected by developmental disorders. Growing evidence suggests that some variants may increase susceptibility to these disorders in the presence of other pathogenic gene mutations. In the current study, we describe eight children with specific language impairment; four of these children had a copy number variant in one such potential susceptibility region, on chromosome 15. Three of these four children also had variants in other genes previously associated with language impairment. Our data support the theory that 15q11.2 is a susceptibility region for developmental disorders, specifically language impairment.

  15. Asbestos exposure and health hazards: a global emergency, Epidemiological evidence and denial theories

    Directory of Open Access Journals (Sweden)

    Francesca Zazzara

    2013-01-01

    Full Text Available On June 3rd 2013, in Turin, Italy, the Swiss industrialist Schmidheiny was sentenced to 18 years' imprisonment for intentional disaster over 3,000 asbestos-linked tumours in Italian workers at the cement multinational Eternit. The indiscriminate use of asbestos, however, continues worldwide. Although many studies have shown that asbestos is associated with an increased risk of mortality and morbidity, denial theories were spread over time, showing how the logic of profit governs the production of asbestos. We examined, first, the history of the epidemiological evidence of asbestos-related risks and, second, the main sources of exposure in Italy and in the world: occupational, non-occupational, and post-disaster exposure (as occurred after the L’Aquila earthquake in April 2009). The theme of inequality and social justice is all the more alarming in the fight against asbestos and its lobbies.

  16. Evidence against the continuum structure underlying motivation measures derived from self-determination theory.

    Science.gov (United States)

    Chemolli, Emanuela; Gagné, Marylène

    2014-06-01

    Self-determination theory (SDT) proposes a multidimensional conceptualization of motivation in which the different regulations are said to fall along a continuum of self-determination. The continuum has been used as the basis for a relative autonomy index as a means of creating motivational scores. Rasch analysis was used to verify the continuum structure of the Multidimensional Work Motivation Scale and of the Academic Motivation Scale. We discuss the concept of a continuum against SDT's conceptualization of motivation and argue against the use of the relative autonomy index on the grounds that evidence for a continuum structure underlying the regulations is weak and that the index is statistically problematic. We suggest exploiting the full richness of SDT's multidimensional conceptualization of motivation through the use of alternative scoring methods when investigating motivational dynamics across life domains.
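The relative autonomy index criticized in this record is, in its common form, just a fixed weighted sum of the regulation subscale scores. A minimal sketch (the -2/-1/+1/+2 weights are the convention usually attributed to the SDT literature; the score profiles are invented) makes one statistical objection concrete: distinct motivation profiles can collapse to the same index value.

```python
def relative_autonomy_index(external, introjected, identified, intrinsic):
    """Classic weighted-sum RAI: controlled regulations get negative
    weights, autonomous regulations positive (-2, -1, +1, +2)."""
    return -2 * external - 1 * introjected + 1 * identified + 2 * intrinsic

# Two different motivation profiles collapse to the same score of 0,
# losing the multidimensional information the record argues for keeping.
profile_a = relative_autonomy_index(external=1, introjected=1, identified=1, intrinsic=1)
profile_b = relative_autonomy_index(external=3, introjected=1, identified=3, intrinsic=2)
print(profile_a, profile_b)  # prints "0 0"
```

This information loss is one reason the record recommends alternative scoring methods over a single index.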

  18. Inhibitory processes and cognitive flexibility: evidence for the theory of attentional inertia

    Directory of Open Access Journals (Sweden)

    Isabel Introzzi

    2015-07-01

    Full Text Available The aim of this study was to discriminate the differential contribution of different inhibitory processes (perceptual, cognitive, and behavioral inhibition) to the switching-cost effect associated with alternating cognitive tasks. A correlational design was used. Several experimental paradigms (e.g., the stop-signal, visual search, Sternberg, and Simon paradigms) were adapted and included in a computerized program called TAC (Introzzi & Canet Juric, 2014) for the assessment of the different cognitive processes. The final sample consisted of 45 adults (18-50 years). Perceptual and behavioral inhibition showed moderate and low correlations with attentional cost, cognitive inhibition showed no relation with flexibility, and only perceptual inhibition predicted switching-cost effects, suggesting that different inhibitory processes contribute differentially to switch cost. This can be interpreted as evidence for the main argument of Attentional Inertia Theory, which postulates that inhibition plays an essential role in the ability to flexibly switch between tasks and/or representations.

  19. Educating Occupational Therapists in the Use of Theory and Evidence to Enhance Supervision Practice

    Directory of Open Access Journals (Sweden)

    Melanie J. Roberts

    2017-10-01

    Full Text Available This paper describes the implementation of a unique learning experience aimed at enhancing the quality of supervision practice in occupational therapy at the Gold Coast Hospital and Health Service. The package was designed by experienced occupational therapy educators based on adult, blended, and flipped learning approaches, with content developed following administration of a standardized tool and semi-structured interviews. The learning package focused particularly on the logistics of supervision and the use of occupational therapy theory and evidence within supervision. The training for supervising therapists included a workshop and pre- and post-workshop learning activities. This collaborative research approach to designing and implementing a learning package, as well as the specific content of the ongoing education opportunities, could also be transferred to other services.

  20. Evidence of quantum phase transition in real-space vacuum entanglement of higher derivative scalar quantum field theories.

    Science.gov (United States)

    Kumar, S Santhosh; Shankaranarayanan, S

    2017-11-17

    In a bipartite set-up, the vacuum state of a free Bosonic scalar field is entangled in real space and satisfies the area law: entanglement entropy scales linearly with the area of the boundary between the two partitions. In this work, we show that the area law is violated in a two-spatial-dimension model Hamiltonian with dynamical critical exponent z = 3. The model physically corresponds to next-to-next-to-next nearest neighbour coupling terms on a lattice. The result reported here is the first violation of the area law of its kind in Bosonic systems in higher dimensions and signals a quantum phase transition. We provide evidence for the quantum phase transition both numerically and analytically using quantum information tools such as entanglement spectra, quantum fidelity, and the gap in the energy spectra. We identify the cause of this transition as the accumulation of a large number of angular zero modes around the critical point, which catalyses the change in the ground-state wave function due to the next-to-next-to-next nearest neighbour coupling. Lastly, using a Hubbard-Stratonovich transformation, we show that the effective Bosonic Hamiltonian can be obtained from an interacting fermionic theory, and we discuss possible implications for condensed matter systems.
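As background for the area-law discussion above, the textbook warm-up for real-space vacuum entanglement is the ground state of two coupled harmonic oscillators, whose entanglement entropy has a closed form in terms of the normal-mode frequencies. This is the standard Srednicki-type construction, not the z = 3 model of the record, and is offered only as an illustrative sketch:

```python
import numpy as np

def two_oscillator_entropy(k0, k1):
    """Ground-state entanglement entropy between two coupled oscillators,
    H = 1/2 [p1^2 + p2^2 + k0 (x1^2 + x2^2) + k1 (x1 - x2)^2],
    via the standard closed form S = -ln(1-xi) - xi/(1-xi) ln(xi)."""
    w_sym = np.sqrt(k0)            # symmetric normal-mode frequency
    w_asym = np.sqrt(k0 + 2 * k1)  # antisymmetric normal-mode frequency
    xi = ((np.sqrt(w_asym) - np.sqrt(w_sym)) /
          (np.sqrt(w_asym) + np.sqrt(w_sym))) ** 2
    if xi == 0:
        return 0.0  # decoupled oscillators: product ground state
    return float(-np.log(1 - xi) - xi / (1 - xi) * np.log(xi))

print(two_oscillator_entropy(1.0, 0.0))  # prints 0.0: no coupling, no entanglement
```

The entropy grows monotonically with the coupling k1, diverging logarithmically as the antisymmetric mode stiffens; lattice-field computations like the one in the record build on chains of such coupled oscillators.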

  1. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic and finally included 63 with full text. We compared documents and used expert consensus for selecting the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  2. Fluid analog model for boundary effects in field theory

    International Nuclear Information System (INIS)

    Ford, L. H.; Svaiter, N. F.

    2009-01-01

    Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.

  3. Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic

    DEFF Research Database (Denmark)

    Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe

    2008-01-01

    Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can ... reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise...

  4. Incorporating Contagion in Portfolio Credit Risk Models Using Network Theory

    NARCIS (Netherlands)

    Anagnostou, I.; Sourabh, S.; Kandhai, D.

    2018-01-01

    Portfolio credit risk models estimate the range of potential losses due to defaults or deteriorations in credit quality. Most of these models perceive default correlation as fully captured by the dependence on a set of common underlying risk factors. In light of empirical evidence, the ability of

  5. Direct evidence for a Coulombic phase in monopole-suppressed SU(2) lattice gauge theory

    International Nuclear Information System (INIS)

    Grady, Michael

    2013-01-01

    Further evidence is presented for the existence of a non-confining phase at weak coupling in SU(2) lattice gauge theory. Using Monte Carlo simulations with the standard Wilson action, gauge-invariant SO(3)–Z2 monopoles, which are strong-coupling lattice artifacts, have been seen to undergo a percolation transition exactly at the phase transition previously seen using Coulomb gauge methods, with an infinite lattice critical point near β=3.2. The theory with both Z2 vortices and monopoles and SO(3)–Z2 monopoles eliminated is simulated in the strong-coupling (β=0) limit on lattices up to 60^4. Here, as in the high-β phase of the Wilson-action theory, finite size scaling shows it spontaneously breaks the remnant symmetry left over after Coulomb gauge fixing. Such a symmetry breaking precludes the potential from having a linear term. The monopole restriction appears to prevent the transition to a confining phase at any β. Direct measurement of the instantaneous Coulomb potential shows a Coulombic form with a moderately running coupling, possibly approaching an infrared fixed point of α∼1.4. The Coulomb potential is measured to 50 lattice spacings and 2 fm. A short-distance fit to the 2-loop perturbative potential is used to set the scale. High precision at such long distances is made possible through the use of open boundary conditions, which was previously found to cut random and systematic errors of the Coulomb gauge fixing procedure dramatically. The Coulomb potential agrees with the gauge-invariant interquark potential measured with smeared Wilson loops on periodic lattices, as far as the latter can be practically measured with similar statistics.

  6. Routine outcome monitoring and feedback on physical or mental health status: evidence and theory.

    Science.gov (United States)

    Carlier, Ingrid V E; Meuldijk, Denise; Van Vliet, Irene M; Van Fenema, Esther; Van der Wee, Nic J A; Zitman, Frans G

    2012-02-01

    Routine Outcome Monitoring (ROM) is an important quality tool for measuring outcome of treatment in health care. The objective of this article is to summarize the evidence base that supports the provision of feedback on ROM results to (mental) health care professionals and patients. Also, some relevant theoretical aspects are considered. Literature study (Pubmed, Medline, PsychINFO, Embase Psychiatry, 1975-2009) concerning randomized controlled trials (RCTs) of ROM and feedback on physical or mental health status of patients of all ages. Main search terms were routine outcome monitoring/measurement, feedback, health status measurement, and patient-reported outcome measures. Included were 52 RCTs concerning ROM and feedback with adult or older patients: of these, seven RCTs were exclusively focused on physical health and 45 RCTs (also) on the mental health of the patient, although not always in a mental health care setting or as a primary outcome measure. There appears to be a positive impact of ROM on diagnosis and monitoring of treatment, and on communication between patient and therapist. Other results were less clear. There were no published RCTs on this topic with children or adolescents. ROM appears especially effective for the monitoring of patients who are not doing well in therapy. Further research into this topic and the clinical- and cost-effectiveness of ROM is recommended, especially in mental health care for both adults and children. Also, more theory-driven research is needed with relevant conceptualizations such as Feedback Intervention Theory and Therapeutic Assessment. © 2010 Blackwell Publishing Ltd.

  7. Adapting evidence-based interventions using a common theory, practices, and principles.

    Science.gov (United States)

    Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

    2014-01-01

    Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.

  8. Combining evidence and diffusion of innovation theory to enhance influenza immunization.

    Science.gov (United States)

    Britto, Maria T; Pandzik, Geralyn M; Meeks, Connie S; Kotagal, Uma R

    2006-08-01

    Children and adolescents with chronic conditions such as asthma, diabetes, and HIV are at high risk of influenza-related morbidity, and there are recommendations to immunize these populations annually. At Cincinnati Children's Hospital Medical Center, the influenza immunization rate increased to 90.4% (5% declined) among 200 patients with cystic fibrosis (CF). Diffusion of innovation theory was used to guide the design and implementation of spread to other clinics. The main intervention strategies were: (1) engagement of interested, nurse-led teams, (2) a collaborative learning session, (3) a tool kit including literature, sample goals, reminder postcards, communication strategies, and team member roles and processes, (4) open-access scheduling and standing orders, (5) a simple Web-based registry, (6) facilitated vaccine ordering, (7) recall phone calls, and (8) weekly results posting. Clinic-specific immunization rates ranged from 32.7% to 92.8%, with the highest rate reported in the CF clinic. All teams used multiple strategies, with six of the seven using four or more. Overall, 60.0% (762/1,269) of the population was immunized. Barriers included vaccine shortages, lack of time for reminder calls, and lack of physician support in one clinic. A combination of interventions, guided by evidence and diffusion of innovation theory, led to immunization rates higher than those reported in the literature.

  9. Location Decisions of U.S. Polluting Plants. Theory, Empirical Evidence, and Consequences

    International Nuclear Information System (INIS)

    Shadbegian, R.; Wolverton, A.

    2010-01-01

    Economists have long been interested in explaining the spatial distribution of economic activity, focusing on what factors motivate profit-maximizing firms when they choose to open a new plant or expand an existing facility. We begin our paper with a general discussion of the theory of plant location, including the role of taxes and agglomeration economies. However, our paper focuses on the theory, evidence, and implications of the role of environmental regulations in plant location decisions. On its face, environmental regulation would not necessarily be expected to alter location decisions, since we would expect Federal regulation to affect all locations in the United States essentially equally. It turns out, however, that this is not always the case, as some geographic areas are subject to greater stringency. Another source of variation is differences across states in the way they implement and enforce compliance with Federal regulation. In light of these spatial differences in the costs of complying with environmental regulations, we discuss three main questions in this survey: Do environmental regulations affect the location decisions of polluting plants? Do states compete for polluting plants through differences in environmental regulation? And do firms locate polluting plants disproportionately near poor and minority neighborhoods?

  10. The symbol-grounding problem in numerical cognition: A review of theory, evidence, and outstanding questions.

    Science.gov (United States)

    Leibovich, Tali; Ansari, Daniel

    2016-03-01

    How do numerical symbols, such as number words, acquire semantic meaning? This question, also referred to as the "symbol-grounding problem," is a central problem in the field of numerical cognition. Present theories suggest that symbols acquire their meaning by being mapped onto an approximate system for the nonsymbolic representation of number (Approximate Number System or ANS). In the present literature review, we first asked to what extent current behavioural and neuroimaging data support this theory, and second, to what extent the ANS, upon which symbolic numbers are assumed to be grounded, is numerical in nature. We conclude that (a) current evidence that has examined the association between the ANS and number symbols does not support the notion that number symbols are grounded in the ANS, and (b) given the strong correlation between numerosity and continuous variables in nonsymbolic number processing tasks, it is next to impossible to measure the pure association between symbolic and nonsymbolic numerosity. Instead, it is clear that significant cognitive control resources are required to disambiguate numerical from continuous variables during nonsymbolic number processing. Thus, if there exists any mapping between the ANS and symbolic number, then this process of association must be mediated by cognitive control. Taken together, we suggest that studying the role of both cognitive control and continuous variables in numerosity comparison tasks will provide a more complete picture of the symbol-grounding problem. (c) 2016 APA, all rights reserved.

  11. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
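A minimal illustration of why regulator-regulated interactions call for game theory rather than decision theory: in a textbook inspection game, each side's optimal strategy depends on the other's incentives, and the only equilibrium is in mixed strategies. The game and all payoff parameters below are illustrative, not drawn from the article.

```python
def inspection_game_equilibrium(g, f, k, d):
    """Mixed-strategy Nash equilibrium of a standard inspection game.

    Firm: comply (payoff 0) or violate (gain g; pays fine f if inspected).
    Regulator: inspect (cost k) or not (an undetected violation costs d).
    Assumes f > g and d > k, so no pure-strategy equilibrium exists.
    """
    p_inspect = g / f  # leaves the firm indifferent: g - p*f = 0
    q_violate = k / d  # leaves the regulator indifferent: -k = -q*d
    return p_inspect, q_violate

p, q = inspection_game_equilibrium(g=1.0, f=4.0, k=1.0, d=10.0)
# At equilibrium the firm's expected payoff from violating equals complying (0):
firm_violate_payoff = p * (1.0 - 4.0) + (1 - p) * 1.0
print(p, q, firm_violate_payoff)  # prints "0.25 0.1 0.0"
```

Note the distinctly game-theoretic result: raising the fine f lowers the equilibrium inspection rate rather than the violation rate, a conclusion a single-agent decision-theoretic model of regulation cannot produce.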

  12. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    Directory of Open Access Journals (Sweden)

    Hugo U. R. Strand

    2015-03-01

    Full Text Available We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium “phase diagrams” that map out the different dynamical regimes.

  13. An intelligent diagnosis model based on rough set theory

    Science.gov (United States)

    Li, Ze; Huang, Hong-Xing; Zheng, Ye-Lu; Wang, Zhou-Yuan

    2013-03-01

    Along with the popularity of computers and the rapid development of information technology, increasing the accuracy of agricultural diagnosis has become a difficult problem in popularizing agricultural expert systems. Analyzing existing research and building on the knowledge-acquisition techniques of rough set theory for large sample data, we put forward an intelligent diagnosis model. We extract a rough-set decision table from the sample attributes, use the decision table to categorize the inference relations, acquire attribute rules related to inference diagnosis, and realize intelligent diagnosis by means of a rough-set knowledge-reasoning algorithm. Finally, we validate this diagnosis model by experiments. Introducing rough set theory provides an effective diagnosis model for agricultural expert systems with large sample data.
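The rule-extraction step described in this record (decision table to certain rules) can be sketched with a rough-set lower approximation: a condition class yields a certain rule only when all of its objects agree on the decision attribute. The crop-disease table and attribute names below are invented for illustration.

```python
from collections import defaultdict

def certain_rules(table, condition_attrs, decision_attr):
    """Extract certain decision rules from a decision table: a
    condition-equivalence class yields a rule only if every object in
    it shares the same decision value (rough-set lower approximation)."""
    classes = defaultdict(list)
    for row in table:
        key = tuple(row[a] for a in condition_attrs)
        classes[key].append(row[decision_attr])
    rules = {}
    for key, decisions in classes.items():
        if len(set(decisions)) == 1:  # consistent class -> certain rule
            rules[key] = decisions[0]
    return rules

# Hypothetical crop-disease decision table
table = [
    {"leaf": "spotted", "stem": "soft", "disease": "blight"},
    {"leaf": "spotted", "stem": "soft", "disease": "blight"},
    {"leaf": "green",   "stem": "soft", "disease": "blight"},
    {"leaf": "green",   "stem": "firm", "disease": "healthy"},
    {"leaf": "green",   "stem": "soft", "disease": "healthy"},  # conflicts with row 3
]
rules = certain_rules(table, ["leaf", "stem"], "disease")
print(rules)  # {('spotted', 'soft'): 'blight', ('green', 'firm'): 'healthy'}
```

The inconsistent class ('green', 'soft') produces no certain rule; in a fuller rough-set treatment it would fall in the boundary region and could yield a possible rule with an attached confidence.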

  14. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  15. Integrable lambda models and Chern-Simons theories

    Science.gov (United States)

    Schmidtt, David M.

    2017-05-01

    In this note we reveal a connection between the phase space of lambda models on S^1 × R and the phase space of double Chern-Simons theories on D × R, and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS_5 × S^5 lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  16. sigma model approach to the heterotic string theory

    International Nuclear Information System (INIS)

    Sen, A.

    1985-09-01

    The relation between the equations of motion for the massless fields in the heterotic string theory and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. The presence of an anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs

  17. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum probability rule, and management strategies suited to local conditions are then proposed, with the effect of reducing heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk from 2000 to 2009 against actual and simulated data, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, with strong logical superiority and regional adaptability, providing a new way of forewarning water pollution risk.
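The core update in such a model can be sketched with a conjugate Beta-Binomial posterior and a maximum-probability rule over forewarning levels. The prior, level thresholds, and data below are hypothetical stand-ins, not the paper's Taihu Basin indices.

```python
import math

def beta_pdf(x, a, b):
    """Density of the Beta(a, b) distribution at x in (0, 1)."""
    log_c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_c + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

def interval_mass(a, b, lo, hi, n=10_000):
    """Posterior probability that the exceedance rate lies in [lo, hi],
    by midpoint integration of the Beta(a, b) density."""
    step = (hi - lo) / n
    return sum(beta_pdf(lo + (i + 0.5) * step, a, b) for i in range(n)) * step

# Flat Beta(1, 1) prior updated with 10 standard-exceedances in 30 samples;
# the conjugate update gives a Beta(1 + 10, 1 + 20) posterior.
a, b = 1 + 10, 1 + 20
levels = {"light": (0.0, 0.2), "moderate": (0.2, 0.5), "severe": (0.5, 1.0)}
masses = {name: interval_mass(a, b, lo, hi) for name, (lo, hi) in levels.items()}
print(max(masses, key=masses.get))  # prints "moderate"
```

The maximum-probability rule simply picks the level whose interval carries the most posterior mass, which is how a "severe warning" verdict like the paper's 2010 forecast would be issued once the posterior concentrates above the top threshold.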

  18. Matrix models and stochastic growth in Donaldson-Thomas theory

    International Nuclear Information System (INIS)

    Szabo, Richard J.; Tierz, Miguel

    2012-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  19. Latent factor modeling of four schizotypy dimensions with theory of mind and empathy.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Bedwell

    Full Text Available Preliminary evidence suggests that theory of mind and empathy relate differentially to factors of schizotypy. The current study assessed 686 undergraduate students and used structural equation modeling to examine links between a four-factor model of schizotypy with performance on measures of theory of mind (Reading the Mind in the Eyes Test [MIE] and empathy (Interpersonal Reactivity Index [IRI]. Schizotypy was assessed using three self-report measures which were simultaneously entered into the model. Results revealed that the Negative factor of schizotypy showed a negative relationship with the Empathy factor, which was primarily driven by the Empathic Concern subscale of the IRI and the No Close Friends and Constricted Affect subscales of the Schizotypal Personality Questionnaire. These findings are consistent with a growing body of literature suggesting a relatively specific relationship between negative schizotypy and empathy, and are consistent with several previous studies that found no relationship between MIE performance and schizotypy.

  20. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of lumbar spine x-ray for low back pain in UK primary care practice.

    Science.gov (United States)

    Grimshaw, Jeremy M; Eccles, Martin P; Steen, Nick; Johnston, Marie; Pitts, Nigel B; Glidewell, Liz; Maclennan, Graeme; Thomas, Ruth; Bonetti, Debbie; Walker, Anne

    2011-05-28

    Psychological models predict behaviour in a wide range of settings. The aim of this study was to explore the usefulness of a range of psychological models to predict the health professional behaviour 'referral for lumbar spine x-ray in patients presenting with low back pain' by UK primary care physicians. Psychological measures were collected by postal questionnaire survey from a random sample of primary care physicians in Scotland and north England. The outcome measures were clinical behaviour (referral rates for lumbar spine x-rays), behavioural simulation (lumbar spine x-ray referral decisions based upon scenarios), and behavioural intention (general intention to refer for lumbar spine x-rays in patients with low back pain). Explanatory variables were the constructs within the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Weinstein's Stage Model termed the Precaution Adoption Process (PAP), and knowledge. For each of the outcome measures, a generalised linear model was used to examine the predictive value of each theory individually. Linear regression was used for the intention and simulation outcomes, and negative binomial regression was used for the behaviour outcome. Following this 'theory level' analysis, a 'cross-theoretical construct' analysis was conducted to investigate the combined predictive value of all individual constructs across theories. Constructs from TPB, SCT, CS-SRM, and OLT predicted behaviour; however, the theoretical models did not fit the data well. When predicting behavioural simulation, the proportion of variance explained by individual theories was TPB 11.6%, SCT 12.1%, OLT 8.1%, and II 1.5% of the variance, and in the cross-theory analysis constructs from TPB, CS-SRM and II explained 16.5% of the variance in simulated behaviours. When predicting intention, the proportion of variance explained by individual

  1. Localization landscape theory of disorder in semiconductors. I. Theory and modeling

    Science.gov (United States)

    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana

    2017-04-01

    We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called localization landscape. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densities of states. We show how the various outputs of these landscapes can be directly implemented into a drift-diffusion model of carrier transport and into the calculation of absorption/emission transitions. This creates a new computational model which accounts for disorder localization effects while also capturing two major effects of quantum mechanics, namely, the reduction of barrier height (tunneling effect) and the raising of energy ground states (quantum confinement effect), without having to solve the Schrödinger equation. Finally, this model is applied to several one-dimensional structures such as single quantum wells, ordered and disordered superlattices, or multiquantum wells, where comparisons with exact Schrödinger calculations demonstrate the excellent accuracy of the approximation provided by the landscape theory.
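
The central computation of the landscape theory, solving (-Δ + V)u = 1 rather than the Schrödinger eigenproblem, can be sketched in one dimension with second-order finite differences. The discretisation, units, and zero Dirichlet boundaries below are illustrative choices, not the paper's setup:

```python
import numpy as np

def landscape_1d(potential, dx=1.0):
    """Localization landscape u solving (-d^2/dx^2 + V) u = 1 in 1D.

    Discretised with second-order finite differences and zero Dirichlet
    boundaries; 1/u then plays the role of the effective confining
    potential in the landscape theory.
    """
    n = len(potential)
    main = 2.0 / dx**2 + np.asarray(potential, dtype=float)
    off = (-1.0 / dx**2) * np.ones(n - 1)
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(H, np.ones(n))

# Local maxima of u (minima of the effective potential 1/u) mark the
# regions where low-energy carriers localize in a disordered potential.
u = landscape_1d(np.ones(201))
```

For a constant potential V = 1 the interior solution approaches 1/V, with thin boundary layers near the Dirichlet edges, which makes the solver easy to sanity-check.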

  2. Designing theoretically-informed implementation interventions: Fine in theory, but evidence of effectiveness in practice is needed

    Directory of Open Access Journals (Sweden)

    Reeves Scott

    2006-02-01

    Full Text Available Abstract The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence is available showing the superiority of theory-based interventions, an explicit theoretical basis should not be used by research funders, ethics committees, editors, or policy decision makers as a criterion for assessing the value of implementation studies.

  3. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.

  4. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence.

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
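
The brute-force Monte Carlo reference used in the study, averaging the likelihood over samples drawn from the prior, can be sketched for a toy conjugate-Gaussian case. The data, prior, and sample size below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (illustrative): data assumed ~ N(theta, 1), prior theta ~ N(0, 1).
data = np.array([0.9, 1.1, 1.0])

def log_likelihood(theta, data):
    # Gaussian log-likelihood with unit variance, vectorised over theta.
    sq = (data[None, :] - theta[:, None]) ** 2
    return -0.5 * sq.sum(axis=1) - 0.5 * len(data) * np.log(2 * np.pi)

# Brute-force Monte Carlo estimate of Bayesian model evidence (BME):
# average the likelihood over samples drawn from the prior.
theta_samples = rng.normal(0.0, 1.0, size=100_000)
bme_estimate = np.exp(log_likelihood(theta_samples, data)).mean()

# In this conjugate case the exact evidence is a multivariate normal
# density, N(data; 0, I + 11^T), so the estimate can be validated.
```

For expensive models this brute-force average becomes infeasible, which is exactly the trade-off between the three classes of techniques the record describes.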

  5. Magnetized cosmological models in Saez Ballester theory of gravitation

    International Nuclear Information System (INIS)

    Katore, S.D.; Shaikh, A.Y.

    2014-01-01

    The Bianchi type-I magnetized cosmological model in the scalar-tensor theory proposed by Saez and Ballester (Phys. Lett. A 113 (1986) 467), with a perfect fluid as source, is investigated. The source of the magnetic field is an electric current produced along the x-axis, so F_{23} is the only non-vanishing component of the electromagnetic field tensor. To obtain a deterministic model, it is assumed that the eigenvalue σ^1_1 of the shear tensor σ^i_j is proportional to the expansion scalar θ. The behavior and physical properties of the models in the presence and absence of the magnetic field are discussed.

  6. Multilane Traffic Flow Modeling Using Cellular Automata Theory

    Science.gov (United States)

    Chechina, Antonina; Churbanova, Natalia; Trapeznikova, Marina

    2018-02-01

    The paper deals with the mathematical modeling of traffic flows on urban road networks using a microscopic approach. The model is based on cellular automata theory and generalizes the Nagel-Schreckenberg model to the multilane case. The program package developed allows one to simulate traffic on various types of road fragment (T- or X-type intersections, straight road elements, etc.) and on road networks composed of these elements. It also makes it possible to predict the consequences of decisions regarding road infrastructure, such as increasing or decreasing the number of lanes, putting new traffic lights into operation, and building new roads, entrances/exits, and road junctions.
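
A minimal single-lane Nagel-Schreckenberg update, the building block that the multilane model generalizes, can be sketched as follows. The parameter values and ring-road setup are illustrative:

```python
import numpy as np

def nasch_step(pos, vel, road_length, v_max=5, p_slow=0.3, rng=None):
    """One parallel update of the single-lane Nagel-Schreckenberg CA.

    pos: cell indices of the cars in cyclic order on a ring road;
    vel: their integer speeds. The multilane generalisation adds
    lane-change rules on top of these four steps.
    """
    if rng is None:
        rng = np.random.default_rng()
    gaps = (np.roll(pos, -1) - pos - 1) % road_length  # cells to car ahead
    vel = np.minimum(vel + 1, v_max)                   # 1. accelerate
    vel = np.minimum(vel, gaps)                        # 2. brake to avoid collision
    slow = rng.random(len(pos)) < p_slow               # 3. random slowdown
    vel = np.where(slow, np.maximum(vel - 1, 0), vel)
    pos = (pos + vel) % road_length                    # 4. move
    return pos, vel
```

Because each car moves at most as far as its gap allows, the parallel update never produces collisions, which is easy to verify by iterating the step.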

  7. Grassmann phase space theory and the Jaynes–Cummings model

    International Nuclear Information System (INIS)

    Dalton, B.J.; Garraway, B.M.; Jeffers, J.; Barnett, S.M.

    2013-01-01

    The Jaynes–Cummings model of a two-level atom in a single mode cavity is of fundamental importance both in quantum optics and in quantum physics generally, involving the interaction of two simple quantum systems—one fermionic system (the TLA), the other bosonic (the cavity mode). Depending on the initial conditions a variety of interesting effects occur, ranging from ongoing oscillations of the atomic population difference at the Rabi frequency when the atom is excited and the cavity is in an n-photon Fock state, to collapses and revivals of these oscillations starting with the atom unexcited and the cavity mode in a coherent state. The observation of revivals for Rydberg atoms in a high-Q microwave cavity is key experimental evidence for quantisation of the EM field. Theoretical treatments of the Jaynes–Cummings model based on expanding the state vector in terms of products of atomic and n-photon states and deriving coupled equations for the amplitudes are a well-known and simple method for determining the effects. In quantum optics however, the behaviour of the bosonic quantum EM field is often treated using phase space methods, where the bosonic mode annihilation and creation operators are represented by c-number phase space variables, with the density operator represented by a distribution function of these variables. Fokker–Planck equations for the distribution function are obtained, and either used directly to determine quantities of experimental interest or used to develop c-number Langevin equations for stochastic versions of the phase space variables from which experimental quantities are obtained as stochastic averages. Phase space methods have also been developed to include atomic systems, with the atomic spin operators being represented by c-number phase space variables, and distribution functions involving these variables and those for any bosonic modes being shown to satisfy Fokker–Planck equations from which c-number Langevin equations are

  8. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar field, e.g. describing mesons or phonons. The models are given by three inputs: the dispersion relation for the matter particle, the dispersion relation for the field particle, and the (UV cut-off) coupling function. The assumptions imposed on these three inputs are rather weak and are satisfied by the physically relevant choices. Translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum, with the fiber Hamiltonian at fixed total momentum expressed in terms of the Segal field operator. The fiber Hamiltonians...

  9. Approximate models for broken clouds in stochastic radiative transfer theory

    International Nuclear Information System (INIS)

    Doicu, Adrian; Efremenko, Dmitry S.; Loyola, Diego; Trautmann, Thomas

    2014-01-01

    This paper presents approximate models in stochastic radiative transfer theory. The independent column approximation and its modified version with a solar source computed in a full three-dimensional atmosphere are formulated in a stochastic framework and for arbitrary cloud statistics. The nth-order stochastic models describing the independent column approximations are equivalent to the nth-order stochastic models for the original radiance fields in which the gradient vectors are neglected. Fast approximate models are further derived on the basis of zeroth-order stochastic models and the independent column approximation. The so-called “internal mixing” models assume a combination of the optical properties of the cloud and the clear sky, while the “external mixing” models assume a combination of the radiances corresponding to completely overcast and clear skies. A consistent treatment of internal and external mixing models is provided, and a new parameterization of the closure coefficient in the effective thickness approximation is given. An efficient computation of the closure coefficient for internal mixing models, using a previously derived vector stochastic model as a reference, is also presented. Equipped with appropriate look-up tables for the closure coefficient, these models can easily be integrated into operational trace gas retrieval systems that exploit absorption features in the near-IR solar spectrum. - Highlights: • Independent column approximation in a stochastic setting. • Fast internal and external mixing models for total and diffuse radiances. • Efficient optimization of internal mixing models to match reference models

  10. Circuit theory and model-based inference for landscape connectivity

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.

    2013-01-01

    Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
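
The resistance distance at the heart of this approach follows directly from the Moore-Penrose pseudoinverse of the graph Laplacian; a minimal sketch (the unit-conductance test graphs are illustrative):

```python
import numpy as np

def resistance_distance(weights):
    """Effective resistance between all node pairs of a resistor network.

    weights: symmetric conductance matrix with zero diagonal; entry
    (i, j) is the conductance of the resistor joining nodes i and j.
    """
    L = np.diag(weights.sum(axis=1)) - weights  # graph Laplacian
    Lp = np.linalg.pinv(L)                      # Moore-Penrose pseudoinverse
    d = np.diag(Lp)
    # R_ij = Lp_ii + Lp_jj - 2 Lp_ij
    return d[:, None] + d[None, :] - 2 * Lp
```

Two unit resistors in series give an effective resistance of 2 between the end nodes, while a triangle of unit resistors gives 2/3 between any pair, which makes the function easy to check by hand.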

  11. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  12. Two velocity difference model for a car following theory

    Science.gov (United States)

    Ge, H. X.; Cheng, R. J.; Li, Z. P.

    2008-09-01

    In the light of the optimal velocity model, a two-velocity-difference model for car-following theory is put forward, taking account of navigation in modern traffic. The model improves on previous ones in that it considers more aspects of the car-following process. We then investigate its properties using linear and nonlinear analyses. The Korteweg-de Vries equation (the KdV equation, for short) near the neutral stability line and the modified Korteweg-de Vries equation (the mKdV equation) around the critical point are derived by applying the reductive perturbation method. Traffic jams can thus be described by the KdV soliton and the kink-antikink soliton of the KdV equation and mKdV equation, respectively. Numerical simulations are made to verify the model, and good results are obtained.
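
A sketch of this family of models: an Euler step in which the acceleration relaxes toward an optimal velocity for the current headway and is corrected by a weighted mix of two velocity differences. The optimal velocity function and all coefficient values below are illustrative choices, not the paper's calibrated ones:

```python
import numpy as np

def ov(dx, v_max=2.0, h_c=4.0):
    # A commonly used optimal velocity function (parameters illustrative).
    return 0.5 * v_max * (np.tanh(dx - h_c) + np.tanh(h_c))

def tvd_step(x, v, a=1.0, lam=0.3, p=0.7, dt=0.1, road=40.0):
    """Euler step of a two-velocity-difference car-following sketch.

    Acceleration relaxes toward the optimal velocity for the current
    headway, corrected by a weighted mix of the velocity difference to
    the leader and the velocity difference of the pair one ahead.
    """
    dx = (np.roll(x, -1) - x) % road           # headway on a ring road
    dv_lead = np.roll(v, -1) - v               # leader minus follower
    dv_next = np.roll(v, -2) - np.roll(v, -1)  # difference one pair ahead
    acc = a * (ov(dx) - v) + lam * (p * dv_lead + (1 - p) * dv_next)
    v_new = v + acc * dt
    x_new = (x + v_new * dt) % road
    return x_new, v_new

# Uniform flow at the optimal velocity is a steady state of this update.
x = np.arange(10) * 4.0
v = np.full(10, ov(4.0))
for _ in range(100):
    x, v = tvd_step(x, v)
```

Linear stability of this uniform steady state against small perturbations is exactly what the neutral stability analysis in the abstract characterizes.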

  13. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  14. A Novel Evidence Theory and Fuzzy Preference Approach-Based Multi-Sensor Data Fusion Technique for Fault Diagnosis.

    Science.gov (United States)

    Xiao, Fuyuan

    2017-10-31

    Multi-sensor data fusion plays a significant role in fault diagnosis and a variety of related applications, and Dempster-Shafer evidence theory is employed to improve system performance; however, it may generate counter-intuitive results when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach based on the distance of evidence, belief entropy and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the degree of conflict among the pieces of evidence, from which a support degree representing the reliability of each piece of evidence is obtained. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on this quantitative uncertainty measure, fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted using this relative credibility preference, yielding an appropriate weight for each piece of evidence. Finally, the modified weights are used to adjust the bodies of evidence before Dempster's combination rule is applied. A numerical example and a practical application in fault diagnosis demonstrate that the proposal is reasonable and efficient in managing conflict and diagnosing faults.

  15. A Novel Evidence Theory and Fuzzy Preference Approach-Based Multi-Sensor Data Fusion Technique for Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2017-10-01

    Full Text Available Multi-sensor data fusion plays a significant role in fault diagnosis and a variety of related applications, and Dempster–Shafer evidence theory is employed to improve system performance; however, it may generate counter-intuitive results when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach based on the distance of evidence, belief entropy and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the degree of conflict among the pieces of evidence, from which a support degree representing the reliability of each piece of evidence is obtained. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on this quantitative uncertainty measure, fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted using this relative credibility preference, yielding an appropriate weight for each piece of evidence. Finally, the modified weights are used to adjust the bodies of evidence before Dempster’s combination rule is applied. A numerical example and a practical application in fault diagnosis demonstrate that the proposal is reasonable and efficient in managing conflict and diagnosing faults.
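
The final fusion step in both records rests on Dempster's combination rule. A minimal sketch for two mass functions over frozenset focal elements follows; the evidence-weighting steps described in the abstracts are deliberately not included, and the hypotheses in the example are illustrative:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    Returns the normalised combined mass function; raises if the two
    bodies of evidence are totally conflicting.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # non-empty intersection supports a hypothesis
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # empty intersection contributes to the conflict mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 - conflict  # normalisation constant
    return {h: m / k for h, m in combined.items()}
```

The normalisation by 1 - conflict is precisely what produces the counter-intuitive behaviour under high conflict that the abstracts set out to mitigate by reweighting the evidence first.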

  16. Models, Mechanisms and Moderators Dissociating Empathy and Theory of Mind.

    Science.gov (United States)

    Kanske, Philipp; Böckler, Anne; Singer, Tania

    Most instances of social interaction provide a wealth of information about the states of other people, be it sensations, feelings, thoughts, or convictions. How we represent these states has been a major question in social neuroscience, leading to the identification of two routes to understanding others: an affective route for the direct sharing of others' emotions (empathy) that involves, among others, anterior insula and middle anterior cingulate cortex and a cognitive route for representing and reasoning about others' states (Theory of Mind) that entails, among others, ventral temporoparietal junction and anterior and posterior midline regions. Additionally, research has revealed a number of situational and personal factors that shape the functioning of empathy and Theory of Mind. Concerning situational modulators, it has been shown, for instance, that ingroup membership enhances empathic responding and that Theory of Mind performance seems to be susceptible to stress. Personal modulators include psychopathological conditions, for which alterations in empathy and mentalizing have consistently been demonstrated; people on the autism spectrum, for instance, are impaired specifically in mentalizing, while spontaneous empathic responding seems selectively reduced in psychopathy. Given the multifaceted evidence for separability of the two routes, current research endeavors aiming at fostering interpersonal cooperation explore the differential malleability of affective and cognitive understanding of others.

  17. A realistic model for quantum theory with a locality property

    International Nuclear Information System (INIS)

    Eberhard, P.H.

    1987-04-01

    A model reproducing the predictions of relativistic quantum theory to any desired degree of accuracy is described in this paper. It involves quantities that are independent of the observer's knowledge, and therefore can be called real, and which are defined at each point in space, and therefore can be called local in a rudimentary sense. It involves faster-than-light, but not instantaneous, action at a distance.

  18. A formal model of theory choice in science

    OpenAIRE

    William A. Brock; Steven N. Durlauf

    1999-01-01

    Since the work of Thomas Kuhn, the role of social factors in the scientific enterprise has been a major concern in the philosophy and history of science. In particular, conformity effects among scientists have been used to question whether science naturally progresses over time. Using neoclassical economic reasoning, this paper develops a formal model of scientific theory choice which incorporates social factors. Our results demonstrate that the influence of social factors on scientific progr...

  19. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  20. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

    geodesic path in this theory. Further, it is observed that ρ_m → ∞ as t → 0 and ρ_m → 0 as t → ∞, which indicates a Big Bang-like singularity at the initial epoch. In conclusion, the non-static plane symmetric cosmological model constructed here expands with increasing time, and the rate of expansion slows with ...

  1. Game Theory Models for Multi-Robot Patrolling of Infrastructures

    Directory of Open Access Journals (Sweden)

    Erik Hernández

    2013-03-01

    Full Text Available This work is focused on the problem of performing multi-robot patrolling for infrastructure security applications in order to protect a known environment at critical facilities. Thus, given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Existing solutions for these types of applications are predictable and inflexible. Moreover, most of the previous work has tackled the patrolling problem with centralized and deterministic solutions, and only a few efforts have been made to integrate dynamic methods. Therefore, one of the main contributions of this work is the development of new dynamic and decentralized collaborative approaches to solve the aforementioned problem by implementing learning models from Game Theory. The model selected in this work, which includes belief-based and reinforcement models as special cases, is called Experience-Weighted Attraction. The problem has been defined using concepts of Graph Theory to represent the environment in order to work with such Game Theory techniques. Finally, the proposed methods have been evaluated experimentally using a patrolling simulator, and the results obtained have been compared with previously available approaches.
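
The Experience-Weighted Attraction update can be sketched as follows, using Camerer and Ho's parameterisation. The parameter values are illustrative; as the abstract notes, reinforcement learning and belief learning fall out as special cases (roughly, delta = 0 for reinforcement, delta = 1 with rho = phi for belief learning):

```python
def ewa_update(attractions, experience, payoffs, chosen,
               phi=0.9, rho=0.9, delta=0.5):
    """One Experience-Weighted Attraction (EWA) learning update.

    attractions: attraction value per strategy from the previous round.
    experience: scalar experience weight N(t-1).
    payoffs: payoff each strategy would have earned this round.
    chosen: index of the strategy actually played.
    phi decays old attractions, rho decays experience, and delta weights
    forgone payoffs relative to the realised one.
    """
    n_new = rho * experience + 1.0
    new_attr = []
    for j, (a, pi) in enumerate(zip(attractions, payoffs)):
        # Forgone payoffs are weighted by delta; the chosen strategy's
        # payoff gets full weight.
        weight = delta + (1.0 - delta) * (j == chosen)
        new_attr.append((phi * experience * a + weight * pi) / n_new)
    return new_attr, n_new
```

In a patrolling application, the attractions would then feed a choice rule (e.g. a logit response) over the candidate points of interest on the environment graph.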

  2. Quantile hydrologic model selection and model structure deficiency assessment : 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  3. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail, how the theoretical model and its...

  4. Application of the evolution theory in modelling of innovation diffusion

    Directory of Open Access Journals (Sweden)

    Krstić Milan

    2016-01-01

    Full Text Available The theory of evolution has found numerous analogies and applications in scientific disciplines beyond biology. In that sense, so-called 'memetic evolution' is now widely accepted. Memes form a complex adaptive system in which one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information that can be identified and used to explain the evolutionary process. The field of innovation has proved to be another suitable area where the theory of evolution can be successfully applied. In this work the authors start from the assumption that the theory of evolution can also be applied to modelling the process of innovation diffusion. Based on the theoretical research conducted, the authors conclude that innovation diffusion, interpreted memetically, is in fact the imitation of an innovation 'meme'. Since certain 'memes' replicate more successfully than others, natural selection eventually follows. For the survival of innovation 'memes', their manifestations are of key importance with respect to their longevity, fruitfulness and faithful replication. The results of the research confirm the assumption that the theory of evolution can be applied to innovation diffusion through innovation 'memes', which opens up prospects for new research on the subject.

  5. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 2: Building a Culture of Inquiry

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-09-01

    Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action

  6. On the Complexity of Item Response Theory Models.

    Science.gov (United States)

    Bonifay, Wes; Cai, Li

    2017-01-01

    Complexity in item response theory (IRT) has traditionally been quantified by simply counting the number of freely estimated parameters in the model. However, complexity is also contingent upon the functional form of the model. We examined four popular IRT models-exploratory factor analytic, bifactor, DINA, and DINO-with different functional forms but the same number of free parameters. In comparison, a simpler (unidimensional 3PL) model was specified such that it had 1 more parameter than the previous models. All models were then evaluated according to the minimum description length principle. Specifically, each model was fit to 1,000 data sets that were randomly and uniformly sampled from the complete data space and then assessed using global and item-level fit and diagnostic measures. The findings revealed that the factor analytic and bifactor models possess a strong tendency to fit any possible data. The unidimensional 3PL model displayed minimal fitting propensity, despite the fact that it included an additional free parameter. The DINA and DINO models did not demonstrate a proclivity to fit any possible data, but they did fit well to distinct data patterns. Applied researchers and psychometricians should therefore consider functional form-and not goodness-of-fit alone-when selecting an IRT model.
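
    The unidimensional 3PL model contrasted above has the standard response function P(θ) = c + (1 − c) / (1 + e^(−a(θ−b))). A minimal sketch of that functional form (illustrative only; the parameter values below are arbitrary, not from the study):

```python
import math

def p_correct(theta, a, b, c):
    """3PL item response function: probability of a correct answer
    given ability theta, discrimination a, difficulty b, guessing c."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# A low-ability examinee succeeds at roughly the guessing rate c,
# while a high-ability examinee approaches certainty.
print(round(p_correct(-3.0, a=1.5, b=0.0, c=0.2), 3))  # 0.209
print(round(p_correct(+3.0, a=1.5, b=0.0, c=0.2), 3))  # 0.991
```

    The guessing floor c is the extra free parameter; per the record, this functional form nonetheless showed minimal propensity to fit arbitrary data.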

  7. H3+ WZNW model from Liouville field theory

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Schomerus, Volker

    2007-01-01

    There exists an intriguing relation between genus zero correlation functions in the H3+ WZNW model and in Liouville field theory. We provide a path integral derivation of the correspondence and then use our new approach to generalize the relation to surfaces of arbitrary genus g. In particular we determine the correlation functions of N primary fields in the WZNW model explicitly through Liouville correlators with N+2g-2 additional insertions of certain degenerate fields. The paper concludes with a list of interesting further extensions and a few comments on the relation to the geometric Langlands program.

  8. Theory and Circuit Model for Lossy Coaxial Transmission Line

    Energy Technology Data Exchange (ETDEWEB)

    Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert

    2017-04-01

    The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
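
    The abstract does not reproduce the paper's new formulas, but the classical low-loss, skin-effect-limited estimate they refine is a standard textbook result and easy to sketch (conductor loss only; the geometry values below are illustrative, not from the paper):

```python
import math

def coax_attenuation(f, a, b, sigma=5.8e7, eps_r=1.0):
    """Classical low-loss attenuation (Np/m) of a coaxial line with
    inner radius a and outer radius b (meters), conductivity sigma.
    Conductor loss only: alpha = R'/(2*Z0), with surface resistance
    Rs = sqrt(pi*f*mu0/sigma)."""
    mu0 = 4e-7 * math.pi
    eps0 = 8.854e-12
    Rs = math.sqrt(math.pi * f * mu0 / sigma)        # ohms per square
    Rp = (Rs / (2 * math.pi)) * (1 / a + 1 / b)      # series R per meter
    Z0 = math.sqrt(mu0 / (eps0 * eps_r)) * math.log(b / a) / (2 * math.pi)
    return Rp / (2 * Z0)

# The skin effect makes attenuation grow like sqrt(frequency):
a1 = coax_attenuation(1e9, 0.5e-3, 1.75e-3)   # RG-58-like geometry
a2 = coax_attenuation(4e9, 0.5e-3, 1.75e-3)
print(a2 / a1)   # close to 2.0
```

    The sqrt(f) scaling is exactly the regime the paper's broadband circuit model must capture up to 100 GHz.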

  9. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical engineering, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society, highlighting powerful techniques being developed to meet these challenges that stem from the area of nonlinear science.

  10. Flipped classroom model for learning evidence-based medicine.

    Science.gov (United States)

    Rucker, Sydney Y; Ozdogan, Zulfukar; Al Achkar, Morhaf

    2017-01-01

    Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel flipped classroom model for JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute an improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice.

  11. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  12. Correlators in integrable quantum field theory: the scaling RSOS models

    International Nuclear Information System (INIS)

    The study of the scaling limit of two-dimensional models of statistical mechanics within the framework of integrable field theory is illustrated through the example of the RSOS models. Starting from the exact description of regime III in terms of colliding particles, we compute the correlation functions of the thermal, φ1,2 and (for some cases) spin operators in the two-particle approximation. The accuracy obtained for the moments of these correlators is analysed by computing the central charge and the scaling dimensions and comparing with the exact results. We further consider the (generally non-integrable) perturbation of the critical points with both the operators φ1,3 and φ1,2 and locate the branches solved on the lattice within the associated two-dimensional phase diagram. Finally we discuss the fact that the RSOS models, the dilute q-state Potts model and the O(n) vector model are all described by the same perturbed conformal field theory.

  13. An Organizational Model to Distinguish between and Integrate Research and Evaluation Activities in a Theory Based Evaluation

    Science.gov (United States)

    Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian

    2012-01-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its…

  14. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    Science.gov (United States)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus, whether explanations were model based or knowledge based, and categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently on their assessment of personal knowledge or abilities related to the tasks than on their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  15. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    in the economic model implies the econometric concept of strong exogeneity for ß. The economic equilibrium corresponds to the so-called long-run value (Johansen 2005), the comparative statics are captured by the long-run impact matrix, C; and the exogenous variables are the common trends. Also, the adjustment...... parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed....

  16. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed for further use of items and models that best fit ...

  17. General topology meets model theory, on p and t.

    Science.gov (United States)

    Malliaris, Maryanthe; Shelah, Saharon

    2013-08-13

    Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258-262] that the continuum is uncountable, and Hilbert's first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220-224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143-1148], Hilbert's first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen's introduction of forcing. The oldest and perhaps most famous of these is whether "p = t," which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29-46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241-255]. In this paper we explain how our work on the structure of Keisler's order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory.

  18. Standard Model in multiscale theories and observational constraints

    Science.gov (United States)

    Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David

    2016-08-01

    We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t*28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t*35 MeV. For α0 = 1/2, the Lamb shift alone yields t*450 GeV.

  19. Theory and evidence of economies of scale in the development of waste management systems

    International Nuclear Information System (INIS)

    Chang, Shoou-Yuh; Rivera, A.L.

    1989-01-01

    Waste is a cost of doing business. This cost can be considered in terms of the potential adverse health and environmental impacts, or the waste management costs associated with avoiding, minimizing, and controlling those impacts. There is an anticipated increase in the cost of waste management as a result of the increasing requirements for regulatory compliance. To meet the total waste management capacity needs of the organization and the compliance requirements, low-level radioactive, hazardous, and mixed waste management will need demonstrated technologies strategically managed as a technology portfolio. The role of the decision maker is to select the optimum mix of technologies and facilities to provide the waste management capacity needed for the next twenty years. The waste management system resulting from this mix includes multiple small-scale fixed facilities, large-scale centralized facilities, and waste management subcontracts. This study was conducted to examine the theory and evidence of economies of scale in the development of waste management systems as exploratory research on the economic considerations in the process of technology selection and implementation. 25 refs., 24 figs., 11 tabs

  20. Sensor Data Fusion for Accurate Cloud Presence Prediction Using Dempster-Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Jesse S. Jin

    2010-10-01

    Full Text Available Sensor data fusion technology can be used to best extract useful information from multiple sensor observations. It has been widely applied in various applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach in a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. The potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data that were recorded as the benchmark. Our experiments have indicated that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
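
    Dempster's rule of combination at the heart of this approach can be sketched in a few lines (an illustrative sketch with made-up sensor masses over the frame {cloud, clear}; not the paper's implementation):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two basic probability assignments whose
    focal elements are frozensets: intersect focal elements, discard
    the conflicting (empty-intersection) mass K, renormalize by 1-K."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in fused.items()}

CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
EITHER = CLOUD | CLEAR                       # mass on EITHER encodes ignorance
m1 = {CLOUD: 0.6, CLEAR: 0.1, EITHER: 0.3}   # radiation sensor 1
m2 = {CLOUD: 0.5, CLEAR: 0.2, EITHER: 0.3}   # radiation sensor 2
fused = dempster_combine(m1, m2)
print(round(fused[CLOUD], 3))   # 0.759: agreement sharpens belief in cloud
```

    Fusion both raises the belief in "cloud" above either sensor's individual mass and shrinks the ignorance mass, which mirrors the paper's reported drop in the unknown rate.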

  1. The linear model and hypothesis a general unifying theory

    CERN Document Server

    Seber, George

    2015-01-01

    This book provides a concise and integrated overview of hypothesis testing in four important subject areas, namely linear and nonlinear models, multivariate analysis, and large sample theory. The approach used is a geometrical one based on the concept of projections and their associated idempotent matrices, thus largely avoiding the need to involve matrix ranks. It is shown that all the hypotheses encountered are either linear or asymptotically linear, and that all the underlying models used are either exactly or asymptotically linear normal models. This equivalence can be used, for example, to extend the concept of orthogonality in the analysis of variance to other models, and to show that the asymptotic equivalence of the likelihood ratio, Wald, and Score (Lagrange Multiplier) hypothesis tests generally applies.
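
    The geometry of projections that this approach rests on is easy to verify numerically. A minimal pure-Python check for the rank-one case P = xx'/(x'x) (illustrative only; the general form is P = X(X'X)^{-1}X'):

```python
def outer(u, v):
    """Outer product u v' as a list-of-rows matrix."""
    return [[ui * vj for vj in v] for ui in u]

def matmul(A, B):
    """Plain matrix product of two list-of-rows matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Orthogonal projection onto the span of x: P = x x' / (x'x).
x = [1.0, 2.0, 2.0]
nrm2 = sum(xi * xi for xi in x)          # x'x = 9
P = [[v / nrm2 for v in row] for row in outer(x, x)]

P2 = matmul(P, P)
assert all(abs(P2[i][j] - P[i][j]) < 1e-12 for i in range(3) for j in range(3))  # idempotent
assert all(abs(P[i][j] - P[j][i]) < 1e-12 for i in range(3) for j in range(3))   # symmetric
assert abs(sum(P[i][i] for i in range(3)) - 1.0) < 1e-12  # trace = rank = 1
print("projection checks passed")
```

    Symmetry plus idempotence is exactly what lets the book avoid explicit rank arguments: the trace of a projection equals the dimension of the subspace projected onto.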

  2. Effective-field theory on the kinetic Ising model

    International Nuclear Information System (INIS)

    Shi Xiaoling; Wei Guozhu; Li Lin

    2008-01-01

    As an analytical method, the effective-field theory (EFT) is used to study the dynamical response of the kinetic Ising model in the presence of a sinusoidal oscillating field. The effective-field equations of motion of the average magnetization are given for the square lattice (Z=4) and the simple cubic lattice (Z=6), respectively. The dynamic order parameter, the hysteresis loop area and the dynamic correlation are calculated. In the field amplitude h0/ZJ-temperature T/ZJ plane, the phase boundary separating the dynamic ordered and the disordered phase has been drawn, and the dynamical tricritical point has been observed. We also compare the results of EFT with those given by the mean-field theory (MFT).
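
    The simpler mean-field (MFT) counterpart of the dynamics, dm/dt = −m + tanh((ZJm + h(t))/T), can be sketched as follows (illustrative parameters and a hypothetical helper, not the authors' code):

```python
import math

def dynamic_order_parameter(h0, T, Z=4, J=1.0, omega=2 * math.pi,
                            dt=1e-3, periods=50, transient=40):
    """Mean-field relaxation dynamics of the kinetic Ising model in a
    sinusoidal field: dm/dt = -m + tanh((Z*J*m + h(t)) / T).
    Returns Q, the magnetization averaged after the transient periods;
    Q != 0 signals the dynamically ordered phase."""
    period = 2 * math.pi / omega
    m, t = 1.0, 0.0
    acc, n = 0.0, 0
    while t < periods * period:
        h = h0 * math.sin(omega * t)
        m += dt * (-m + math.tanh((Z * J * m + h) / T))  # Euler step
        t += dt
        if t >= transient * period:
            acc += m
            n += 1
    return acc / n

# A weak field cannot unpin the low-temperature magnetization, so Q
# stays close to 1; a strong field drives it down.
q_weak = dynamic_order_parameter(h0=0.1, T=1.0)
q_strong = dynamic_order_parameter(h0=3.0, T=1.0)
print(f"weak field Q={q_weak:.2f}, strong field Q={q_strong:.2f}")
```

    With Z=4 this corresponds to the square-lattice case; sweeping h0/ZJ and T/ZJ with such a routine traces out the kind of dynamic phase boundary described in the record.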

  3. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)

  4. Physics of human cooperation: experimental evidence and theoretical models

    Science.gov (United States)

    Sánchez, Angel

    2018-02-01

    In recent years, many physicists have used evolutionary game theory combined with a complex systems perspective in an attempt to understand social phenomena and challenges. Prominent among such phenomena is the issue of the emergence and sustainability of cooperation in a networked world of selfish or self-focused individuals. The vast majority of research done by physicists on these questions is theoretical, and is almost always posed in terms of agent-based models. Unfortunately, more often than not such models ignore a number of facts that are well established experimentally, and are thus rendered irrelevant to actual social applications. I here summarize some of the facts that any realistic model should incorporate and take into account, discuss important aspects underlying the relation between theory and experiments, and discuss future directions for research based on the available experimental knowledge.

  5. Aligning method with theory: a comparison of two approaches to modeling the social determinants of health.

    Science.gov (United States)

    O'Campo, Patricia; Urquia, Marcelo

    2012-12-01

    There is increasing interest in the study of the social determinants of maternal and child health. While there has been growth in the theory and empirical evidence about social determinants, less attention has been paid to the kind of modeling that should be used to understand the impact of social exposures on well-being. We analyzed data from the nationwide 2006 Canadian Maternity Experiences Survey to compare the pervasive disease-specific model to a model that captures the generalized health impact (GHI) of social exposures, namely low socioeconomic position. The GHI model uses a composite of adverse conditions that stem from low socioeconomic position: adverse birth outcomes, postpartum depression, severe abuse, stressful life events, and hospitalization during pregnancy. Adjusted prevalence ratios and 95% confidence intervals from disease-specific models for low income (social determinants of health.

  6. Inadequate Evidence for Multiple Intelligences, Mozart Effect, and Emotional Intelligence Theories

    Science.gov (United States)

    Waterhouse, Lynn

    2006-01-01

    I (Waterhouse, 2006) argued that, because multiple intelligences, the Mozart effect, and emotional intelligence theories have inadequate empirical support and are not consistent with cognitive neuroscience findings, these theories should not be applied in education. Proponents countered that their theories had sufficient empirical support, were…

  7. Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model

    Science.gov (United States)

    Aktaruzzaman, Md; Plunkett, Margaret

    2016-01-01

    Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…

  8. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I), is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions.

  9. Evidence of institutionalizing elements in the Balanced Scorecard in the book Strategy in action: a view based on institutional theory

    Directory of Open Access Journals (Sweden)

    Paschoal Tadeu Russo

    2012-04-01

    Full Text Available The Balanced Scorecard (BSC) is a methodology that allows managers to define and implement a set of financial or nonfinancial indicators in a balanced way to assess an organization's performance from four viewpoints. Many companies are unsuccessful in their implementation of the BSC. This lack of success may be attributed to different factors, such as strategic problems, planning failures, and poorly defined targets and goals. However, the failed implementation may be attributed in part to the failure to institutionalize habits and routines. In this regard, the objective of this paper is to use institutional theory to determine whether the book Strategy in Action: Balanced Scorecard contains evidence that the BSC model proposed by the authors (Kaplan & Norton) includes elements that favor the model's institutionalization. For this purpose, a qualitative bibliographic survey was prepared. The survey revealed 404 clues that were rated according to Tolbert and Zucker's description of the processes inherent to institutionalization and to Scott's proposed framework of legitimation/legitimizing. These findings suggest that the book primarily legitimizes the BSC by examining organizations and describes it as an acknowledged management instrument. The aspects supporting the semi-institutional stage (26% of the findings) and the total institutionalization stage (10% of the findings) suggest that the authors intended to propose a tool without focusing on the institutionalization process, which may partly explain the great difficulty faced by companies attempting to implement this methodology.

  10. Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector

    NARCIS (Netherlands)

    Dentoni, D.; Ross, R.

    2013-01-01

    Part Two of our Special Issue on wicked problems in agribusiness, “Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector,” will contribute to four open questions in the broader fields of management and policy: why, when, which and

  11. Theory of thermoluminescence gamma dose response: The unified interaction model

    Science.gov (United States)

    Horowitz, Y. S.

    2001-09-01

    We describe the development of a comprehensive theory of thermoluminescence (TL) dose response, the unified interaction model (UNIM). The UNIM is based on both radiation absorption stage and recombination stage mechanisms and can describe dose response for heavy charged particles (in the framework of the extended track interaction model - ETIM) as well as for isotropically ionising gamma rays and electrons (in the framework of the TC/LC geminate recombination model) in a unified and self-consistent conceptual and mathematical formalism. A theory of optical absorption dose response is also incorporated in the UNIM to describe the radiation absorption stage. The UNIM is applied to the dose response supralinearity characteristics of LiF:Mg,Ti and is especially and uniquely successful in explaining the ionisation density dependence of the supralinearity of composite peak 5 in TLD-100. The UNIM is demonstrated to be capable of explaining either qualitatively or quantitatively all of the major features of TL dose response with many of the variable parameters of the model strongly constrained by ancillary optical absorption and sensitisation measurements.

  12. Plane answers to complex questions the theory of linear models

    CERN Document Server

    Christensen, Ronald

    1987-01-01

    This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...

  13. Genomic evidence for island population conversion resolves conflicting theories of polar bear evolution.

    Directory of Open Access Journals (Sweden)

    James A Cahill

    Full Text Available Despite extensive genetic analysis, the evolutionary relationship between polar bears (Ursus maritimus) and brown bears (U. arctos) remains unclear. The two most recent comprehensive reports indicate a recent divergence with little subsequent admixture or a much more ancient divergence followed by extensive admixture. At the center of this controversy are the Alaskan ABC Islands brown bears that show evidence of shared ancestry with polar bears. We present an analysis of genome-wide sequence data for seven polar bears, one ABC Islands brown bear, one mainland Alaskan brown bear, and a black bear (U. americanus), plus recently published datasets from other bears. Surprisingly, we find clear evidence for gene flow from polar bears into ABC Islands brown bears but no evidence of gene flow from brown bears into polar bears. Importantly, while polar bears contributed <1% of the autosomal genome of the ABC Islands brown bear, they contributed 6.5% of the X chromosome. The magnitude of sex-biased polar bear ancestry and the clear direction of gene flow suggest a model wherein the enigmatic ABC Island brown bears are the descendants of a polar bear population that was gradually converted into brown bears via male-dominated brown bear admixture. We present a model that reconciles heretofore conflicting genetic observations. We posit that the enigmatic ABC Islands brown bears derive from a population of polar bears likely stranded by the receding ice at the end of the last glacial period. Since then, male brown bear migration onto the island has gradually converted these bears into an admixed population whose phenotype and genotype are principally brown bear, except at mtDNA and X-linked loci. This process of genome erosion and conversion may be a common outcome when climate change or other forces cause a population to become isolated and then overrun by species with which it can hybridize.

  14. Genomic evidence for island population conversion resolves conflicting theories of polar bear evolution.

    Science.gov (United States)

    Cahill, James A; Green, Richard E; Fulton, Tara L; Stiller, Mathias; Jay, Flora; Ovsyanikov, Nikita; Salamzade, Rauf; St John, John; Stirling, Ian; Slatkin, Montgomery; Shapiro, Beth

    2013-01-01

    Despite extensive genetic analysis, the evolutionary relationship between polar bears (Ursus maritimus) and brown bears (U. arctos) remains unclear. The two most recent comprehensive reports indicate a recent divergence with little subsequent admixture or a much more ancient divergence followed by extensive admixture. At the center of this controversy are the Alaskan ABC Islands brown bears that show evidence of shared ancestry with polar bears. We present an analysis of genome-wide sequence data for seven polar bears, one ABC Islands brown bear, one mainland Alaskan brown bear, and a black bear (U. americanus), plus recently published datasets from other bears. Surprisingly, we find clear evidence for gene flow from polar bears into ABC Islands brown bears but no evidence of gene flow from brown bears into polar bears. Importantly, while polar bears contributed <1% of the autosomal genome of the ABC Islands brown bear, they contributed 6.5% of the X chromosome. The magnitude of sex-biased polar bear ancestry and the clear direction of gene flow suggest a model wherein the enigmatic ABC Island brown bears are the descendants of a polar bear population that was gradually converted into brown bears via male-dominated brown bear admixture. We present a model that reconciles heretofore conflicting genetic observations. We posit that the enigmatic ABC Islands brown bears derive from a population of polar bears likely stranded by the receding ice at the end of the last glacial period. Since then, male brown bear migration onto the island has gradually converted these bears into an admixed population whose phenotype and genotype are principally brown bear, except at mtDNA and X-linked loci. This process of genome erosion and conversion may be a common outcome when climate change or other forces cause a population to become isolated and then overrun by species with which it can hybridize.

  15. Integrating the context-appropriate balanced attention model and reinforcement sensitivity theory: Towards a domain-general personality process model.

    Science.gov (United States)

    Collins, Michael D; Jackson, Chris J; Walker, Benjamin R; O'Connor, Peter J; Gardiner, Elliroma

    2017-01-01

    Over the last 40 years or more the personality literature has been dominated by trait models based on the Big Five (B5). Trait-based models describe personality at the between-person level but cannot explain the within-person mental mechanisms responsible for personality. Nor can they adequately account for variations in emotion and behavior experienced by individuals across different situations and over time. An alternative, yet understated, approach to personality architecture can be found in neurobiological theories of personality, most notably reinforcement sensitivity theory (RST). In contrast to static trait-based personality models like the B5, RST provides a more plausible basis for a personality process model, namely, one that explains how emotions and behavior arise from the dynamic interaction between contextual factors and within-person mental mechanisms. In this article, the authors review the evolution of a neurobiologically based personality process model based on RST, the response modulation model and the context-appropriate balanced attention model. They argue that by integrating this complex literature, and by incorporating evidence from personality neuroscience, one can meaningfully explain personality at both the within- and between-person levels. This approach achieves a domain-general architecture based on RST and self-regulation that can be used to align within-person mental mechanisms, neurobiological systems and between-person measurement models. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Bridging Research, Practice, and Policy: The "Evidence Academy" Conference Model.

    Science.gov (United States)

    Rohweder, Catherine L; Laping, Jane L; Diehl, Sandra J; Moore, Alexis A; Isler, Malika Roman; Scott, Jennifer Elissa; Enga, Zoe Kaori; Black, Molly C; Dave, Gaurav; Corbie-Smith, Giselle; Melvin, Cathy L

    2016-01-01

    Innovative models to facilitate more rapid uptake of research findings into practice are urgently needed. Community members who engage in research can accelerate this process by acting as adoption agents. We implemented an Evidence Academy conference model bringing together researchers, health care professionals, advocates, and policy makers across North Carolina to discuss high-impact, life-saving study results. The overall goal is to develop dissemination and implementation strategies for translating evidence into practice and policy. Each 1-day, single-theme, regional meeting focuses on a leading community-identified health priority. The model capitalizes on the power of diverse local networks to encourage broad, common awareness of new research findings. Furthermore, it emphasizes critical reflection and active group discussion on how to incorporate new evidence within and across organizations, health care systems, and communities. During the concluding session, participants are asked to articulate action plans relevant to their individual interests, work setting, or area of expertise.

  17. A Community-Based Participatory Research Guided Model for the Dissemination of Evidence-Based Interventions.

    Science.gov (United States)

    Delafield, Rebecca; Hermosura, Andrea Nacapoy; Ing, Claire Townsend; Hughes, Claire K; Palakiko, Donna-Marie; Dillard, Adrienne; Kekauoha, B Puni; Yoshimura, Sheryl R; Gamiao, Shari; Kaholokula, Joseph Keawe

    Dissemination is a principle within community-based participatory research (CBPR); however, published research focuses on the dissemination of findings from CBPR projects but less on dissemination of interventions developed through CBPR approaches. To disseminate an evidence-based lifestyle intervention tailored for Native Hawaiians and other Pacific Islanders, the PILI 'Ohana Project (POP), an 11-year CBPR initiative, developed an innovative dissemination model. The community-to-community mentoring (CCM) model described in this paper extends the application of CBPR values and principles used in intervention development to intervention dissemination. The CCM model combines a CBPR orientation with the diffusion of innovation theory, the social cognitive theory, and key concepts from community organizing and community building to address the multilevel factors that influence uptake of an evidence-based intervention (EBI). Grounding the model in CBPR principles provides benefits for intervention dissemination and integrates a focus on community benefits and capacity building. By establishing co-equal, mutually beneficial relationships at the core of the CCM model, opportunities are created for building critical consciousness, community capacity, and social capital. More research is needed to determine the effectiveness of this model of intervention dissemination which may enhance diffusion of CBPR interventions and empower communities in the process.

  18. AIC, BIC, Bayesian evidence against the interacting dark energy model

    International Nuclear Information System (INIS)

    Szydlowski, Marek; Krawiec, Adam; Kurek, Aleksandra; Kamionka, Michal

    2015-01-01

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative - the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam's razor we are inclined to reject this model. (orig.)
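
    The AIC and BIC trade goodness of fit against parameter count, and the "evidence against" reading comes from conventional thresholds on the criterion differences. A minimal sketch with invented log-likelihoods (not the paper's numbers):

```python
# Hypothetical fit results for two nested models; illustrates the AIC/BIC
# comparison used in this kind of analysis, not the paper's actual values.
import math

def aic(log_lik, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_lik

n = 580                                # number of data points (invented)
base = {"log_lik": -280.0, "k": 2}     # simpler model
ext = {"log_lik": -279.6, "k": 3}      # one extra (interaction-like) parameter

d_aic = aic(**ext) - aic(**base)
d_bic = bic(**ext, n=n) - bic(**base, n=n)
print(round(d_aic, 1))  # 1.2 -> the extra parameter buys almost nothing
print(round(d_bic, 1))  # 5.6 -> BIC's Occam penalty weighs against the richer model
```

    The BIC penalizes the extra parameter by ln n rather than 2, which is why it can reject an extension that the AIC merely disfavors.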

  19. An Ensemble Deep Convolutional Neural Network Model with Improved D-S Evidence Fusion for Bearing Fault Diagnosis.

    Science.gov (United States)

    Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun

    2017-07-28

    Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we propose IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps from the FFT (Fast Fourier Transformation) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via a distance matrix between evidences and a modified Gini index. Extensive evaluations of the IDSCNN on the Case Western Reserve Dataset showed that our IDSCNN algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidence from different models and sensors and adapting to different load conditions.
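
    The fusion step builds on Dempster's classical rule of combination. Below is a baseline sketch of that rule only; the paper's improvement via evidence distances and a modified Gini index is not reproduced, and the sensor masses are invented:

```python
# Dempster's rule of combination, the classical D-S fusion step that the
# improved method builds on (baseline sketch; masses are invented).

def combine(m1, m2):
    """Fuse two basic probability assignments (dicts: frozenset -> mass)."""
    fused, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

# Two sensors mostly agree the fault is the inner race (IR) vs outer race (OR):
IR, OR = frozenset({"IR"}), frozenset({"OR"})
fused = combine({IR: 0.8, OR: 0.2}, {IR: 0.7, OR: 0.3})
print(round(fused[IR], 3))  # 0.903 -- agreement is reinforced after fusion
```

    Normalizing away the conflict mass is exactly the behavior that causes trouble for highly conflicting sensors, which is what distance-weighted variants like the one in this paper aim to mitigate.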

  20. Dynamic statistical models of biological cognition: insights from communications theory

    Science.gov (United States)

    Wallace, Rodrick

    2014-10-01

    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  1. A queueing theory based model for business continuity in hospitals.

    Science.gov (United States)

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precise and defined succession of events, in which every phase is characterized by a waiting time that includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
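
    One standard queueing calculation of this kind sizes a device pool with the Erlang-C formula for an M/M/c queue. The rates below are invented for illustration, not the hospital case-study data:

```python
# Erlang-C sizing of a device pool as an M/M/c queue (generic illustration).
import math

def erlang_c(c, lam, mu):
    """Probability that an arriving request must wait in an M/M/c queue."""
    a = lam / mu                              # offered load in Erlangs
    if a >= c:
        return 1.0                            # unstable: queue grows without bound
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = (a**c / math.factorial(c)) * c / (c - a)
    return top / (s + top)

lam, mu = 8.0, 3.0   # e.g. 8 requests/hour arriving, 3 served/hour per device
c = 3
while erlang_c(c, lam, mu) > 0.10:            # keep P(wait) below 10%
    c += 1
print(c)  # 6 devices meet the target
```

    Sizing by load alone (ceil(8/3) = 3 devices) would leave roughly 80% of requests waiting; the service-level constraint is what drives the count up, which mirrors the paper's point that working load alone is not enough.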

  2. Density Functional Theory and Materials Modeling at Atomistic Length Scales

    Directory of Open Access Journals (Sweden)

    Swapan K. Ghosh

    2002-04-01

    Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT) as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT, the central equation is a one-particle Schrödinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.

  3. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) …

  4. Remarks on “A new non-specificity measure in evidence theory based on belief intervals”

    Directory of Open Access Journals (Sweden)

    Joaquín ABELLÁN

    2018-03-01

    Full Text Available Two types of uncertainty co-exist in the theory of evidence: discord and non-specificity. Since the 1990s, many mathematical expressions have arisen to quantify these two components of a body of evidence. An important aspect of each measure presented is the verification of a coherent set of properties. Regarding non-specificity, so far only one measure verifies an important set of those properties. Very recently, a new measure of non-specificity based on belief intervals has been presented as an alternative measure that quantifies a similar set of properties (Yang et al., 2016). It is shown that the new measure really does not verify two of those important properties. Some errors have been found in the corresponding proofs in the original publication. Keywords: Additivity, Imprecise probabilities, Non-specificity, Subadditivity, Theory of evidence, Uncertainty measures
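
    For context, the classical non-specificity measure in evidence theory is the generalized Hartley measure N(m) = Σ_A m(A) log₂|A| over focal sets A. A small sketch on invented mass assignments:

```python
# Generalized Hartley (classical) non-specificity of a mass function,
# N(m) = sum over focal sets A of m(A) * log2(|A|); masses are invented.
import math

def non_specificity(m):
    """Non-specificity of a basic probability assignment (frozenset -> mass)."""
    return sum(w * math.log2(len(a)) for a, w in m.items())

frame = frozenset({"a", "b", "c", "d"})
vacuous = {frame: 1.0}                         # total ignorance
precise = {frozenset({"a"}): 1.0}              # fully specific evidence
mixed = {frozenset({"a", "b"}): 0.5, frozenset({"a"}): 0.5}

print(non_specificity(vacuous))  # 2.0 (= log2 4, the maximum on this frame)
print(non_specificity(precise))  # 0.0
print(non_specificity(mixed))    # 0.5
```

    Singletons contribute nothing (fully specific evidence) while mass on the whole frame contributes log₂ of the frame size, which is the behavior the additivity and subadditivity properties discussed in the remark constrain.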

  5. Criticism of the Classical Theory of Macroeconomic Modeling

    Directory of Open Access Journals (Sweden)

    Konstantin K. Kumehov

    2015-01-01

    Full Text Available Abstract: Current approaches to and methods of modeling macroeconomic systems do not generate research results that can be used in applications. This is largely because the dominant economic schools and research directions build their theories on misconceptions about the economic system as an object of modeling and share no common methodological approach to the design of macroeconomic models. All of them focus on building models aimed at establishing equilibrium parameters of supply and demand, production and consumption, while the underlying factors of resource potential and society's needs for material and other benefits are not considered. In addition, there is no unity in the choice of elements and of the mechanisms of interaction between them: no criteria are established for determining the elements of the model, whether institutions, industries, the population, banks, classes, etc. From a methodological point of view, all the best-known authors design their models by extrapolating past states or past events into the new models. As a result, by the time a model is ready the situation has changed, and the past parameters underlying it lose relevance; at best, the researcher can interpret events and parameters that will not recur in the future. In this paper, based on an analysis of the works of well-known authors belonging to different schools and directions, the weaknesses of their proposed macroeconomic models that prevent their use in solving applied problems of economic development are revealed. Fundamentally new approaches and methods are proposed that make possible the construction of macroeconomic models taking into account both the theoretical and applied aspects of modeling, and the basic methodological requirements are formulated.

  6. Phase-field theories for mathematical modeling of biological membranes.

    Science.gov (United States)

    Lázaro, Guillermo R; Pagonabarraga, Ignacio; Hernández-Machado, Aurora

    2015-01-01

    Biological membranes are complex structures whose mechanics are usually described at a mesoscopic level, such as the Helfrich bending theory. In this article, we present the phase-field methods, a useful tool for studying complex membrane problems which can be applied to very different phenomena. We start with an overview of the general theory of elasticity, paying special attention to its derivation from a molecular scale. We then study the particular case of membrane elasticity, explicitly obtaining the Helfrich bending energy. Within the framework of this theory, we derive a phase-field model for biological membranes and explore its physical basis and interpretation in terms of membrane elasticity. We finally explain three examples of applications of these methods to membrane-related problems. First, the case of vesicle pearling and tubulation, when lipidic vesicles are exposed to the presence of hydrophobic polymers that anchor to the membrane, inducing a shape instability. Finally, we study the behavior of red blood cells while flowing in narrow microchannels, focusing on the importance of membrane elasticity to the cell flow capabilities. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Assembly models for Papovaviridae based on tiling theory.

    Science.gov (United States)

    Keef, T; Taormina, A; Twarock, R

    2005-09-13

    A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail.

  8. Multiagent model and mean field theory of complex auction dynamics

    Science.gov (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.

  9. Mean-field theory and self-consistent dynamo modeling

    International Nuclear Information System (INIS)

    Yoshizawa, Akira; Yokoi, Nobumitsu

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  10. Dimorphism by Singularity Theory in a Model for River Ecology.

    Science.gov (United States)

    Golubitsky, Martin; Hao, Wenrui; Lam, King-Yeung; Lou, Yuan

    2017-05-01

    Geritz, Gyllenberg, Jacobs, and Parvinen show that two similar species can coexist only if their strategies are in a sector of parameter space near a nondegenerate evolutionarily singular strategy. We show that the dimorphism region can be more general by using the unfolding theory of Wang and Golubitsky near a degenerate evolutionarily singular strategy. Specifically, we use a PDE model of river species as an example of this approach. Our finding shows that the dimorphism region can exhibit various different forms that are strikingly different from previously known results in adaptive dynamics.

  11. Morphing the Shell Model into an Effective Theory

    International Nuclear Information System (INIS)

    Haxton, W. C.; Song, C.-L.

    2000-01-01

    We describe a strategy for attacking the canonical nuclear structure problem--bound-state properties of a system of point nucleons interacting via a two-body potential--which involves an expansion in the number of particles scattering at high momenta, but is otherwise exact. The required self-consistent solutions of the Bloch-Horowitz equation for effective interactions and operators are obtained by an efficient Green's function method based on the Lanczos algorithm. We carry out this program for the simplest nuclei, d and ³He, in order to explore the consequences of reformulating the shell model as a controlled effective theory. (c) 2000 The American Physical Society

  12. Theory and Modeling of High-Power Gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Nusinovich, Gregory Semeon [Univ. of Maryland, College Park, MD (United States)

    2016-04-29

    This report summarizes results of the work performed at the Institute for Research in Electronics and Applied Physics of the University of Maryland (College Park, MD) in the framework of the DOE Grant “Theory and Modeling of High-Power Gyrotrons”. The report covers the work performed in 2011-2014. The research was performed in three directions: possibilities of stable gyrotron operation in very high-order modes offering output power exceeding the 1 MW level in long-pulse/continuous-wave regimes; the effect of small imperfections in gyrotron fabrication and alignment on gyrotron efficiency and operation; and some issues in the physics of beam-wave interaction in gyrotrons.

  13. Random matrix theory and higher genus integrability: the quantum chiral Potts model

    International Nuclear Information System (INIS)

    Angles d'Auriac, J.Ch.; Maillard, J.M.; Viallet, C.M.

    2002-01-01

    We perform a random matrix theory (RMT) analysis of the quantum four-state chiral Potts chain for different sizes of the chain up to size L = 8. Our analysis gives clear evidence of a Gaussian orthogonal ensemble (GOE) statistics, suggesting the existence of a generalized time-reversal invariance. Furthermore, a change from the (generic) GOE distribution to a Poisson distribution occurs when the integrability conditions are met. The chiral Potts model is known to correspond to a (star-triangle) integrability associated with curves of genus higher than zero or one. Therefore, the RMT analysis can also be seen as a detector of 'higher genus integrability'. (author)
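
    The GOE-versus-Poisson diagnostic rests on nearest-neighbour level spacings: GOE spectra show level repulsion (the Wigner surmise P(s) = (πs/2)e^{-πs²/4} vanishes at s = 0), while integrable spectra follow the Poisson form P(s) = e^{-s} and cluster near zero. A generic numerical sketch, not the authors' computation:

```python
# Level-spacing diagnostic: GOE spectra repel (few small spacings),
# Poisson spectra cluster near zero. Generic illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 400

# GOE sample: symmetrize a Gaussian random matrix.
a = rng.normal(size=(n, n))
levels = np.sort(np.linalg.eigvalsh((a + a.T) / 2.0))

bulk = levels[n // 4 : 3 * n // 4]   # avoid the spectrum edges
s = np.diff(bulk)
s /= s.mean()                        # unfold so the mean spacing is 1

# Poisson reference: spacings of independent uniform points.
p = np.diff(np.sort(rng.uniform(0.0, n, n)))
p /= p.mean()

frac_small_goe = float(np.mean(s < 0.1))
frac_small_poisson = float(np.mean(p < 0.1))
# Level repulsion: far fewer small spacings in the GOE spectrum.
print(frac_small_goe, frac_small_poisson)
```

    In spectra of actual Hamiltonians one must first unfold against the smoothed level density and resolve symmetry sectors; the crude bulk-plus-rescaling step above only stands in for that procedure.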

  14. New Trends in Model Coupling Theory, Numerics and Applications

    International Nuclear Information System (INIS)

    Coquel, F.; Godlewski, E.; Herard, J. M.; Segre, J.

    2010-01-01

    This special issue comprises selected papers from the workshop New Trends in Model Coupling, Theory, Numerics and Applications (NTMC'09), which took place in Paris, September 2-4, 2009. The search for optimal technological solutions in a large number of industrial systems requires numerical simulations of complex phenomena which are often characterized by the coupling of models related to various space and/or time scales. Thus, so-called multi-scale modelling has been a thriving scientific activity which connects applied mathematics and other disciplines such as physics, chemistry, biology or even social sciences. To illustrate the variety of fields in which model coupling naturally occurs, we may cite: meteorology, where it is required to take into account several turbulence scales or the interaction between oceans and atmosphere, but also regional models in a global description; solid mechanics, where a thorough understanding of complex phenomena such as the propagation of cracks requires coupling various models from the atomistic level to the macroscopic level; plasma physics for fusion energy, for instance, where dense plasmas and collisionless plasmas coexist; multiphase fluid dynamics, when several types of flow corresponding to several types of models are present simultaneously in complex circuits; and social behaviour analysis, with interaction between individual actions and collective behaviour. (authors)

  15. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Science.gov (United States)

    Oizumi, Ryo; Kuniya, Toshikazu; Enatsu, Yoichi

    2016-01-01

    Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  17. Roy Adaptation Model: integrative review of studies conducted in the light of the theory

    Directory of Open Access Journals (Sweden)

    Lays Pinheiro de Medeiros

    2015-04-01

    Full Text Available Objective: to identify the scientific evidence about the components of the Roy Adaptation Model in the population studied in the light of this theory. Methods: this is an integrative literature review in databases of the Latin-American and Caribbean Center on Health Sciences Information, Medical Literature Analysis and Retrieval System Online, Spanish Bibliographic Index on Health Sciences, Nursing Database, PubMed Central, Cumulative Index to Nursing and Allied Health Literature, Web of Science, and SciVerse Scopus. The sample consists of 20 articles published between 2005 and 2013. Results: the three types of stimuli, 38 of 82 adaptive problems, the four adaptive modes, and the six steps of the nursing process were identified. Conclusion: there is a need for further studies that draw on this theory and address the entire nursing process, increasing specific nursing knowledge and affirming nursing science in health.

  18. Flipped classroom model for learning evidence-based medicine

    Directory of Open Access Journals (Sweden)

    Rucker SY

    2017-08-01

    Full Text Available Sydney Y Rucker,¹ Zulfukar Ozdogan,¹ Morhaf Al Achkar²; ¹School of Education, Indiana University, Bloomington, IN; ²Department of Family Medicine, School of Medicine, University of Washington, Seattle, WA, USA Abstract: Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute an improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice. Keywords: evidence-based medicine, flipped classroom, residency education

  19. Attachment and the processing of social information across the life span: theory and evidence.

    Science.gov (United States)

    Dykas, Matthew J; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.

  20. Evidence for an inhibitory-control theory of the reasoning brain.

    Science.gov (United States)

    Houdé, Olivier; Borst, Grégoire

    2015-01-01

    In this article, we first describe our general inhibitory-control theory and, then, we describe how we have tested its specific hypotheses on reasoning with brain imaging techniques in adults and children. The innovative part of this perspective lies in its attempt to come up with a brain-based synthesis of Jean Piaget's theory on logical algorithms and Daniel Kahneman's theory on intuitive heuristics.

  1. Evidence for an inhibitory-control theory of the reasoning brain

    Directory of Open Access Journals (Sweden)

    Olivier Houdé

    2015-03-01

    Full Text Available In this article, we first describe our general inhibitory-control theory and, then, we describe how we have tested its specific hypotheses on reasoning with brain imaging techniques in adults and children. The innovative part of this perspective lies in its attempt to come up with a brain-based synthesis of Jean Piaget’s theory on logical algorithms and Daniel Kahneman’s theory on intuitive heuristics.

  2. PREFACE: Theory, Modelling and Computational methods for Semiconductors

    Science.gov (United States)

    Migliorato, Max; Probert, Matt

    2010-04-01

    These conference proceedings contain the written papers of the contributions presented at the 2nd International Conference on: Theory, Modelling and Computational methods for Semiconductors. The conference was held at the St Williams College, York, UK on 13th-15th Jan 2010. The previous conference in this series took place in 2008 at the University of Manchester, UK. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in Semiconductor science and technology, where there is a substantial potential for time saving in R&D. The development of high speed computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational and electronic properties of semiconductors and their heterostructures. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the field of theory of group IV, III-V and II-VI semiconductors together with postdocs and students in the early stages of their careers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students at this influential point in their careers. We would like to thank all participants for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from the Institute of Physics (Computational Physics group and Semiconductor Physics group), the UK Car-Parrinello Consortium, Accelrys (distributors of Materials Studio) and Quantumwise (distributors of Atomistix). 
The Editors Acknowledgements Conference Organising Committee: Dr Matt Probert (University of York) and Dr Max Migliorato (University of Manchester) Programme Committee: Dr Marco Califano (University of Leeds), Dr Jacob Gavartin (Accelrys Ltd, Cambridge), Dr Stanko Tomic (STFC Daresbury Laboratory), Dr Gabi Slavcheva (Imperial College London) Proceedings edited and compiled by Dr

  3. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    Science.gov (United States)

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five core Complexity Theory concepts were extracted: 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests that routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to change fasting practice. Its utility in implementation research warrants further attention and

  4. Theory and Modeling for the Magnetospheric Multiscale Mission

    Science.gov (United States)

    Hesse, M.; Aunai, N.; Birn, J.; Cassak, P.; Denton, R. E.; Drake, J. F.; Gombosi, T.; Hoshino, M.; Matthaeus, W.; Sibeck, D.; Zenitani, S.

    2016-03-01

    The Magnetospheric Multiscale (MMS) mission will provide measurement capabilities, which will exceed those of earlier and even contemporary missions by orders of magnitude. MMS will, for the first time, be able to measure directly and with sufficient resolution key features of the magnetic reconnection process, down to the critical electron scales, which need to be resolved to understand how reconnection works. Owing to the complexity and extremely high spatial resolution required, no prior measurements exist which could be employed to guide the definition of measurement requirements, and consequently set essential parameters for mission planning and execution. Insight into expected details of the reconnection process could hence only be obtained from theory and modern kinetic modeling. This situation was recognized early on by MMS leadership, which supported the formation of a fully integrated Theory and Modeling Team (TMT). The TMT participated in all aspects of mission planning, from the proposal stage to individual aspects of instrument performance characteristics. It provided and continues to provide to the mission the latest insights regarding the kinetic physics of magnetic reconnection, as well as associated particle acceleration and turbulence, assuring that, to the best of modern knowledge, the mission is prepared to resolve the inner workings of the magnetic reconnection process. The present paper provides a summary of key recent results of reconnection research by TMT members.

  5. Tissue Acoustoelectric Effect Modeling From Solid Mechanics Theory.

    Science.gov (United States)

    Song, Xizi; Qin, Yexian; Xu, Yanbin; Ingram, Pier; Witte, Russell S; Dong, Feng

    2017-10-01

    The acoustoelectric (AE) effect is a basic physical phenomenon, which underlies the changes made in the conductivity of a medium by the application of focused ultrasound. Recently, based on the AE effect, several biomedical imaging techniques have been widely studied, such as ultrasound-modulated electrical impedance tomography and ultrasound current source density imaging. To further investigate the mechanism of the AE effect in tissue and to provide guidance for such techniques, we have modeled the tissue AE effect using the theory of solid mechanics. Both bulk compression and thermal expansion of tissue are considered and discussed. Computational simulation shows that the conductivity change rate for the muscle AE effect is 3.26×10⁻³ at 4.3-MPa peak pressure, in agreement with the theoretical value. Bulk compression plays the main role in the muscle AE effect, while thermal expansion makes almost no contribution to it. In addition, the AE signals of porcine muscle are measured at different focal positions. With the same order of magnitude and the same trend, the experimental result confirms that the simulation result is effective. Both simulation and experimental results validate that tissue AE effect modeling using solid mechanics theory is feasible, which is of significance for the further development of related biomedical imaging techniques.
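
    As a rough order-of-magnitude check on the figure above, the first-order AE interaction relates the fractional conductivity change to acoustic pressure via an interaction constant K, i.e. Δσ/σ ≈ K·ΔP. The sketch below assumes K ≈ 10⁻⁹ Pa⁻¹, a value commonly quoted for saline-like media; the constant is an assumption, not a number taken from this paper.

```python
# First-order acoustoelectric relation: fractional conductivity change is
# proportional to acoustic pressure, delta_sigma/sigma ~ K * delta_P.
K = 1.0e-9             # Pa^-1, assumed AE interaction constant (saline-like media)
peak_pressure = 4.3e6  # Pa (the 4.3 MPa peak pressure quoted in the abstract)

conductivity_change_rate = K * peak_pressure
print(f"{conductivity_change_rate:.2e}")  # same order as the reported 3.26e-3
```

With the assumed constant, the estimate lands within a factor of two of the simulated 3.26×10⁻³, which is all this first-order relation can be expected to deliver.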

  6. Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.

    Science.gov (United States)

    Merrick, Jason R W; Leclerc, Philip

    2016-04-01

    Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek for the rational decision, but we use prospect theory to solve for the attacker's decision to descriptively model the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
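
    The descriptive deviations named above (loss aversion, likelihood insensitivity) are commonly parameterized with the Tversky-Kahneman value and probability-weighting functions. A minimal sketch follows; the parameter values (α = 0.88, λ = 2.25, γ = 0.61) are the median estimates from Tversky and Kahneman's 1992 cumulative prospect theory paper, not values from this article, and the lottery is invented for illustration.

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def pt_weight(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def pt_utility(lottery):
    """Prospect value of a lottery given as [(probability, outcome), ...]."""
    return sum(pt_weight(p) * pt_value(x) for p, x in lottery)

# A rare, severe loss looms much larger under prospect theory than under
# expected value: the 1% chance is overweighted and the loss is amplified.
lottery = [(0.01, -1000.0), (0.99, 0.0)]
print(pt_utility(lottery))  # far more negative than the expected value of -10
```

An attacker modeled this way can rank options differently from an expected-utility maximizer, which is exactly the sensitivity the screening analysis in the abstract explores.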

  7. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models

    Science.gov (United States)

    Funnell, Sue C.; Rogers, Patricia J.

    2011-01-01

    Between good intentions and great results lies a program theory--not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. "Purposeful Program Theory" shows how to develop,…

  8. Evidence and Formal Models in the Linguistic Sciences

    Science.gov (United States)

    Santana, Carlos Gray

    2016-01-01

    This dissertation contains a collection of essays centered on the relationship between theoretical model-building and empirical evidence-gathering in linguistics and related language sciences. The first chapter sets the stage by demonstrating that the subject matter of linguistics is manifold, and contending that discussion of relationships…

  9. Building SO(10) models from F-theory

    CERN Document Server

    Antoniadis, I

    2012-01-01

    We revisit local F-theory SO(10) and SU(5) GUTs and analyze their properties within the framework of the maximal underlying E_8 symmetry in the elliptic fibration. We consider the symmetry enhancements along the intersections of seven-branes with the GUT surface and study in detail the embedding of the abelian factors undergoing monodromies in the covering gauge groups. We combine flux data from the successive breaking of SO(10) to SU(5) gauge symmetry and subsequently to the Standard Model one, and further constrain the parameters determining the models' particle spectra. In order to eliminate dangerous baryon-number-violating operators, we propose ways to construct matter-parity-like symmetries of intrinsic geometric origin. We study implementations of the resulting constrained scenario in specific examples obtained for a variety of monodromies.

  10. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    Cahill R. T.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g-anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.
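
    The polytrope modelling referred to above rests on the Lane-Emden equation, θ'' + (2/ξ)θ' + θⁿ = 0 with θ(0) = 1, θ'(0) = 0, whose first zero ξ₁ fixes the dimensionless stellar radius. The sketch below integrates the standard (Newtonian) equation with RK4 as textbook background; it does not reproduce the paper's dynamical 3-space modification.

```python
def lane_emden_first_zero(n, h=1e-3):
    """Integrate theta'' + (2/xi) theta' + theta**n = 0 (theta(0)=1, theta'(0)=0)
    with classical RK4 and return the first zero xi_1 of theta."""
    def deriv(xi, y):
        theta, dtheta = y
        tn = max(theta, 0.0) ** n                 # guard tiny negative overshoot
        return (dtheta, -tn - 2.0 * dtheta / xi)

    # Series expansion theta = 1 - xi^2/6 + ... avoids the singularity at xi = 0.
    xi, y = h, (1.0 - h * h / 6.0, -h / 3.0)
    while y[0] > 0.0:
        k1 = deriv(xi, y)
        k2 = deriv(xi + h / 2, (y[0] + h / 2 * k1[0], y[1] + h / 2 * k1[1]))
        k3 = deriv(xi + h / 2, (y[0] + h / 2 * k2[0], y[1] + h / 2 * k2[1]))
        k4 = deriv(xi + h, (y[0] + h * k3[0], y[1] + h * k3[1]))
        y = (y[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
             y[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))
        xi += h
    return xi

print(lane_emden_first_zero(1.0))  # analytic value is pi for n = 1
```

The n = 0 and n = 1 polytropes have closed-form solutions (ξ₁ = √6 and π), which makes the integrator easy to validate before using it on physically interesting indices.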

  11. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    May R. D.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.

  12. Magnetized cosmological models in bimetric theory of gravitation

    Indian Academy of Sciences (India)

    Keywords: Bimetric theory; perfect fluid; cosmic string; magnetic field; Bianchi type-III. PACS Nos: 04.20.-q; 04.20.Cv; 04.20.Ex; 98.90. A new theory of gravitation, called the bimetric theory of gravitation, was proposed by Rosen [1] to modify Einstein's general theory of relativity by assuming two metric tensors, viz., a ...

  13. Algebraic structure of cohomological field theory models and equivariant cohomology

    International Nuclear Information System (INIS)

    Stora, R.; Thuillier, F.; Wallet, J.Ch.

    1994-01-01

    The definition of observables within conventional gauge theories is settled by general consensus. Within cohomological theories considered as gauge theories of an exotic type, that question has a much less obvious answer. It is shown here that in most cases these theories are best defined in terms of equivariant cohomologies both at the field level and at the level of observables. (author). 21 refs

  14. Extreme value theory in emerging markets: Evidence from the Montenegrin stock exchange

    Directory of Open Access Journals (Sweden)

    Cerović Julija

    2015-01-01

    Full Text Available The concept of Value at Risk (VaR) estimates the maximum loss of a financial position at a given time for a given probability. This paper considers the adequacy of the methods that form the basis of extreme value theory in the Montenegrin emerging market before and during the global financial crisis. In particular, the purpose of the paper is to investigate whether the peaks-over-threshold method outperforms the block maxima method in the evaluation of Value at Risk in emerging stock markets such as the Montenegrin market. The daily return of the Montenegrin stock market index MONEX20 is analyzed for the period January 2004 - February 2014. Results of the Kupiec test show that the peaks-over-threshold method is significantly better than the block maxima method, but both methods fail to pass the Christoffersen independence test and joint test due to the lack of accuracy in exception clustering when measuring Value at Risk. Although better, the peaks-over-threshold method still cannot be treated as an accurate VaR model for the Montenegrin frontier stock market.
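
    The peaks-over-threshold estimator the abstract favors can be sketched in a few lines: fit a generalized Pareto distribution to losses above a high threshold and invert the tail formula for VaR. Everything below is illustrative, not the paper's data or method details: the GPD is fitted by method of moments for brevity (maximum likelihood is more common), the 95% threshold choice is arbitrary, and the losses are simulated Gaussians rather than MONEX20 returns.

```python
import numpy as np

def pot_var(losses, q=0.99, threshold_q=0.95):
    """Peaks-over-threshold VaR: fit a GPD (method of moments, for brevity)
    to excesses over a high threshold and invert the tail estimator."""
    u = np.quantile(losses, threshold_q)
    excess = losses[losses > u] - u
    m, v = excess.mean(), excess.var()
    xi = 0.5 * (1.0 - m * m / v)        # GPD shape parameter
    sigma = m * (1.0 - xi)              # GPD scale parameter
    n, n_u = len(losses), len(excess)
    # VaR_q = u + (sigma/xi) * [ ((n/n_u) * (1-q))**(-xi) - 1 ]
    return u + (sigma / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
losses = rng.standard_normal(100_000)   # stand-in for daily index losses
print(pot_var(losses, q=0.99))          # close to the true 0.99 quantile, 2.326
```

Backtests like the Kupiec and Christoffersen tests mentioned in the abstract then count and time-cluster the days on which realized losses exceed this estimate.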

  15. Cognitive and affective components of Theory of Mind in preschoolers with oppositional defiance disorder: Clinical evidence.

    Science.gov (United States)

    de la Osa, Nuria; Granero, Roser; Domenech, Josep Maria; Shamay-Tsoory, Simone; Ezpeleta, Lourdes

    2016-07-30

    The goal of the study was to examine the affective-cognitive components of Theory of Mind (ToM), in a community sample of 538 preschoolers, and more specifically in a subsample of 40 children diagnosed with ODD. The relationship between affective and cognitive ToM and some ODD clinical characteristics was examined. Children were assessed with structured diagnostic interviews and dimensional measures of psychopathology, impairment and unemotional traits. A measure based on eye-gaze was used to assess ToM. Mixed analysis of variance compared the mean cognitive versus affective scale scores and the between-subjects factor ODD. The association between ToM-scores and clinical measures was assessed through correlation models. Execution and reaction time to emotional and cognitive components of ToM tasks are different at age 5 in normally developing children. Oppositional Defiant children had slower response time when performing the affective mentalizing condition than children without the disorder. The correlation matrix between ToM-scores and clinical measures showed specific associations depending on the impaired ToM aspect and the psychological domain. Results may have clinical implications for the prevention and management of ODD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Why Universities Join Cross-Sector Social Partnerships: Theory and Evidence

    Science.gov (United States)

    Siegel, David J.

    2010-01-01

    Cross-sector partnerships are an increasingly popular mode of organizing to address intractable social problems, yet theory and research have virtually ignored university involvement in such activity. This article attempts to ascertain the reasons universities join networks of other social actors to support a common cause. Theories on the…

  17. Educational Mismatches and Earnings: Extensions of Occupational Mobility Theory and Evidence of Human Capital Depreciation

    Science.gov (United States)

    Rubb, Stephen

    2006-01-01

    Using a human capital theory framework, this study examines the impact of educational mismatches on earnings and occupational mobility. Occupational mobility theory suggests that overeducated workers observe greater upward occupational mobility and undereducated workers observe lower upward occupational mobility. By extension, this leads to…

  18. How to use the Standard Model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Brian; Lu, Xiaochuan [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Murayama, Hitoshi [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Kavli Institute for the Physics and Mathematics of the Universe (WPI),Todai Institutes for Advanced Study, University of Tokyo,Kashiwa 277-8583 (Japan)

    2016-01-05

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.

  19. How to use the Standard Model effective field theory

    Science.gov (United States)

    Henning, Brian; Lu, Xiaochuan; Murayama, Hitoshi

    2016-01-01

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.

  20. Theory and modeling of cylindrical thermo-acoustic transduction

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Lihong, E-mail: lhtong@ecjtu.edu.cn [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China); Lim, C.W. [Department of Architecture and Civil Engineering, City University of Hong Kong, Kowloon, Hong Kong SAR (China); Zhao, Xiushao; Geng, Daxing [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China)

    2016-06-03

    Models both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed and the corresponding acoustic pressure solutions are obtained. The acoustic pressure for an individual carbon nanotube (CNT) as a function of input power is investigated analytically and is verified by comparison with published experimental data. Further numerical analysis of the acoustic pressure response and characteristics for varying input frequency and distance is also presented, both for solid and thinfilm-solid cylindrical thermo-acoustic transductions. Through detailed theoretical and numerical studies of the acoustic pressure solution for thinfilm-solid cylindrical transduction, it is concluded that a solid with smaller thermal conductivity favors improved acoustic performance. In general, the proposed models are applicable to a variety of cylindrical thermo-acoustic devices performing in different gaseous media. - Highlights: • Theory and modeling both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed. • The modeling is verified by comparing with published experimental data. • Acoustic response characteristics of cylindrical thermo-acoustic transductions are predicted by the proposed model.

  1. Theory of mind in Alzheimer disease: Evidence of authentic impairment during social interaction.

    Science.gov (United States)

    Moreau, Noémie; Rauzy, Stéphane; Viallet, François; Champagne-Lavau, Maud

    2016-03-01

    The present study aimed to investigate theory of mind (the ability to infer others' mental states) deficits in 20 patients with mild Alzheimer's disease and 20 healthy controls, with 2 theory of mind tasks, 1 of them being a real interactive task. Previous results concerning preserved or altered theory of mind abilities in Alzheimer's disease have been inconsistent, and relationships with other cognitive dysfunctions (notably episodic memory and executive functions) are still unclear. The first task we used was a false belief paradigm as frequently used in the literature, whereas the second task, a referential communication task, assessed theory of mind in a real situation of interaction. Participants also underwent neuropsychological evaluation to investigate potential relationships between theory of mind and memory deficits. The results showed that Alzheimer patients presented a genuine and significant theory of mind deficit compared to control participants, characterized notably by difficulties in attributing knowledge to an interlocutor in a real social interaction. These results further confirm that theory of mind is altered in early stages of Alzheimer dementia, which is consistent with previous works. More specifically, this study is the first to objectify this impairment in social interaction. (c) 2016 APA, all rights reserved.

  2. Models and theories of prescribing decisions: A review and suggested a new model.

    Science.gov (United States)

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than theoretical approach. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies the previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the 'persuasion theory - elaboration likelihood model', the 'stimuli-response marketing model', the 'agency theory', the 'theory of planned behaviour' and 'social power theory' in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  3. A novel wide-area backup protection based on fault component current distribution and improved evidence theory.

    Science.gov (United States)

    Zhang, Zhe; Kong, Xiangping; Yin, Xianggen; Yang, Zengli; Wang, Lijun

    2014-01-01

    In order to solve the problems of the existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to the master station the amplitudes of the fault component currents of the transmission lines closest to the fault element. The master substation then identifies suspicious faulty lines according to the distribution characteristics of the fault component current. After that, the master substation identifies the actual faulty line with improved D-S evidence theory, based on the action states of traditional protections and the direction components of these suspicious faulty lines. Simulation examples based on the IEEE 10-generator-39-bus system show that the proposed WABP algorithm has excellent performance: low requirement for sampling synchronization, small wide-area communication flow, and high fault tolerance.
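
    The improved combination rule itself is not reproduced in the abstract, but the classical Dempster rule it builds on is compact enough to sketch: intersect focal elements, multiply masses, and renormalize by the non-conflicting mass. The frame of suspect lines {"L1", "L2"} and the mass values below are invented for illustration, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal elements and renormalize by 1 - K,
    where K is the mass falling on conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two bodies of evidence over suspect lines L1 and L2 (illustrative masses,
# standing in for protection action states and direction components):
m_protection = {frozenset({"L1"}): 0.5, frozenset({"L1", "L2"}): 0.5}
m_direction  = {frozenset({"L1"}): 0.5, frozenset({"L2"}): 0.5}
fused = dempster_combine(m_protection, m_direction)
print(fused[frozenset({"L1"})])  # 2/3: both sources reinforce line L1
```

"Improved" variants such as the one in the paper typically modify how highly conflicting evidence is redistributed before this combination step.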

  4. A Novel Wide-Area Backup Protection Based on Fault Component Current Distribution and Improved Evidence Theory

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2014-01-01

    Full Text Available In order to solve the problems of the existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to the master station the amplitudes of the fault component currents of the transmission lines closest to the fault element. The master substation then identifies suspicious faulty lines according to the distribution characteristics of the fault component current. After that, the master substation identifies the actual faulty line with improved D-S evidence theory, based on the action states of traditional protections and the direction components of these suspicious faulty lines. Simulation examples based on the IEEE 10-generator-39-bus system show that the proposed WABP algorithm has excellent performance: low requirement for sampling synchronization, small wide-area communication flow, and high fault tolerance.

  5. A Novel Wide-Area Backup Protection Based on Fault Component Current Distribution and Improved Evidence Theory

    Science.gov (United States)

    Zhang, Zhe; Kong, Xiangping; Yin, Xianggen; Yang, Zengli; Wang, Lijun

    2014-01-01

    In order to solve the problems of existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and an improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to the master station the amplitudes of the fault component currents of the transmission lines closest to the fault element. The master substation then identifies suspicious faulty lines according to the distribution characteristics of fault component current. After that, the master substation identifies the actual faulty line with the improved D-S evidence theory, based on the action states of traditional protections and the direction components of these suspicious faulty lines. Simulation examples based on the IEEE 10-generator 39-bus system show that the proposed WABP algorithm has excellent performance: it has a low requirement for sampling synchronization, a small wide-area communication flow, and high fault tolerance. PMID:25050399
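    The fusion step the abstract describes rests on Dempster's rule of combination. Below is a minimal sketch of that rule; the frame of discernment, the mass values, and the line labels are invented for illustration and are not taken from the paper.

```python
# Dempster's rule of combination over focal elements represented as frozensets.
from itertools import product

def combine(m1, m2):
    """Combine two basic probability assignments (mass functions)."""
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2            # mass assigned to the empty set
    scale = 1.0 - conflict                 # normalize by non-conflicting mass
    return {k: v / scale for k, v in combined.items()}

# Two hypothetical evidence sources (e.g. protection states and direction
# components) assign belief over two suspicious lines L1 and L2.
L1, L2 = frozenset({"L1"}), frozenset({"L2"})
theta = frozenset({"L1", "L2"})            # full frame: total ignorance
m_protection = {L1: 0.7, L2: 0.1, theta: 0.2}
m_direction  = {L1: 0.6, L2: 0.2, theta: 0.2}
m = combine(m_protection, m_direction)
print(max(m, key=m.get))                   # hypothesis with the highest mass
```

Combining the two sources concentrates mass on L1 (0.85 after normalization), illustrating how agreeing pieces of evidence reinforce a single faulty-line hypothesis.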

  6. Unifying model for random matrix theory in arbitrary space dimensions

    Science.gov (United States)

    Cicuta, Giovanni M.; Krausser, Johannes; Milkus, Rico; Zaccone, Alessio

    2018-03-01

    A sparse random block matrix model suggested by the Hessian matrix used in the study of elastic vibrational modes of amorphous solids is presented and analyzed. By evaluating some moments, benchmarked against numerics, differences in the eigenvalue spectrum of this model in different limits of space dimension d, and for arbitrary values of the lattice coordination number Z, are shown and discussed. As a function of these two parameters (and their ratio Z/d), the most studied models in random matrix theory (Erdős-Rényi graphs, effective medium, and replicas) can be reproduced in the various limits of block dimensionality d. Remarkably, the Marchenko-Pastur spectral density (which is recovered by replica calculations for the Laplacian matrix) is reproduced exactly in the limit of infinite size of the blocks, or d → ∞, which clarifies the physical meaning of space dimension in these models. We feel that the approximate results for d = 3 provided by our method may have many potential applications in the future, from the vibrational spectrum of glasses and elastic networks to wave localization, disordered conductors, random resistor networks, and random walks.
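    The Marchenko-Pastur law mentioned above is easy to check numerically. A minimal sketch, assuming i.i.d. Gaussian entries and arbitrary illustrative sizes — this is the classical Wishart setting, not the sparse block model of the paper:

```python
# Eigenvalues of a Wishart matrix X X^T / n concentrate on the
# Marchenko-Pastur support [(1 - sqrt(c))^2, (1 + sqrt(c))^2], c = p/n.
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 800                          # illustrative sizes; c = 0.25
c = p / n
X = rng.standard_normal((p, n))
eigs = np.linalg.eigvalsh(X @ X.T / n)   # spectrum of the sample covariance

lo, hi = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
# Allow a small margin for finite-size edge fluctuations.
inside = np.mean((eigs > lo - 0.1) & (eigs < hi + 0.1))
print(f"MP support [{lo:.2f}, {hi:.2f}], fraction inside: {inside:.2f}")
```

At these sizes essentially all eigenvalues land inside the predicted support, which is the behavior the replica calculation recovers in the d → ∞ limit of the block model.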

  7. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal

    2015-09-01

    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  8. Lorentz Violation of the Photon Sector in Field Theory Models

    Directory of Open Access Journals (Sweden)

    Lingli Zhou

    2014-01-01

    Full Text Available We compare the Lorentz violation terms of the pure photon sector between two field theory models, namely, the minimal standard model extension (SME) and the standard model supplement (SMS). From the requirement of the identity of the intersection of the two models, we find that the free photon sector of the SMS can be a subset of the photon sector of the minimal SME. We not only obtain some relations between the SME parameters but also get some constraints on the SMS parameters from the SME parameters. The CPT-odd coefficients (kAF)α of the SME are predicted to be zero. There are 15 degrees of freedom in the Lorentz violation matrix Δαβ of free photons of the SMS, related to the same number of degrees of freedom in the tensor coefficients (kF)αβμν, which are independent of each other in the minimal SME but are interrelated in the intersection of the SMS and the minimal SME. With the related degrees of freedom, we obtain conservative constraints (2σ) on the elements of the photon Lorentz violation matrix. The detailed structure of the photon Lorentz violation matrix suggests some applications to Lorentz violation experiments for photons.

  9. Game Theory for Speculative Derivatives: A Possible Stabilizing Regulatory Model

    Directory of Open Access Journals (Sweden)

    Francesco Musolino

    2012-10-01

    Full Text Available The aim of this paper is to propose a methodology for stabilizing the financial markets using Game Theory, specifically the Complete Study of a Differentiable Game. We first give a quick discussion of the peculiarities and recent development of derivatives, and then move on to the main topic of the paper: forwards and futures. We illustrate their pricing and the functioning of the markets for this particular type of derivative. We also examine the short and long hedging strategies used by companies to try to cancel the risk associated with market variables. For this purpose, we present a game theory model of the interaction between two economic operators: the Enterprise, our first player, a real economic subject; and the Financial Institute, our second player, a financial institute (a bank, for example) with large economic resources. We propose a tax on financial transactions with speculative purposes in order to stabilize the financial market, protecting it from speculation. This tax hits only speculative profits, and we find a cooperative solution that nevertheless allows both players to obtain a gain.
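    The stabilizing mechanism the abstract describes — a tax that hits only speculative profits — can be illustrated with a toy best-response comparison. All payoff numbers, the tax rate, and the strategy names below are hypothetical and are not taken from the paper's model; this only sketches how such a tax can shift a player's preferred strategy.

```python
# Toy illustration: a tax applied only to the speculative strategy can
# change the Financial Institute's best response from speculating to hedging.
def best_response(payoffs):
    """Return the strategy with the highest payoff."""
    return max(payoffs, key=payoffs.get)

speculative_profit, hedging_profit, tax = 10.0, 6.0, 0.5   # hypothetical values

before = {"speculate": speculative_profit, "hedge": hedging_profit}
after = {"speculate": speculative_profit * (1 - tax), "hedge": hedging_profit}

print(best_response(before), "->", best_response(after))   # prints: speculate -> hedge
```

With these illustrative numbers, speculation dominates before the tax but not after it, while the hedging payoff — and hence the real economic subject's protection — is untouched.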

  10. A model-theory for tachyons in two dimensions

    International Nuclear Information System (INIS)

    Recami, E.; Rodrigues Junior, W.A.

    1985-01-01

    The subject of tachyons, even if still speculative, may deserve some attention for reasons that can be divided into a few categories, two of which are mentioned here: (i) the larger scheme that one tries to build up in order to incorporate space-like objects into the relativistic theories can allow a better understanding of many aspects of ordinary relativistic physics, even if tachyons do not exist in our cosmos as 'asymptotically free' objects; (ii) superluminal classical objects can have a role in elementary particle interactions (and perhaps even in astrophysics), and it might be tempting to verify how far one can go in reproducing quantum-like behaviour at a classical level just by taking account of the possible existence of faster-than-light classical particles. This article is divided into two parts, the first of which has nothing to do with tachyons. To prepare the ground, Part I (Sect. 2) merely shows that Special Relativity - even without tachyons - can be given a form such as to describe both particles and anti-particles. Part II is confined to a 'model-theory' of tachyons in two dimensions, for the reasons stated in Sect. 3. (Author)

  11. Chemical theory and modelling through density across length scales

    International Nuclear Information System (INIS)

    Ghosh, Swapan K.

    2016-01-01

    One of the concepts that has played a major role in conceptual as well as computational developments covering all the length scales of interest in a number of areas of chemistry, physics, chemical engineering and materials science is the concept of single-particle density. Density functional theory has been a versatile tool for the description of many-particle systems across length scales. Thus, on the microscopic length scale, an electron-density-based description has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids. The density concept has been used in the form of single-particle number density on the intermediate mesoscopic length scale to obtain an appropriate picture of equilibrium and dynamical processes, dealing with a wide class of problems involving interfacial science and soft condensed matter. On the macroscopic length scale, however, matter is usually treated as a continuous medium, and a description using local mass density, energy density and other related property density functions has been found to be quite appropriate. The basic ideas underlying the versatile uses of the concept of density in the theory and modelling of materials and phenomena, as visualized across length scales, along with selected illustrative applications to some recent areas of research on hydrogen energy, soft matter, nucleation phenomena, isotope separation, and separation of mixtures in condensed phase, will form the subject matter of the talk. (author)

  12. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    Science.gov (United States)

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model strikes a balance between the rigor necessary for scholarly theory development and the adaptability…

  13. Uniting model theory and the universalist tradition of logic: Carnap's early axiomatics

    NARCIS (Netherlands)

    Loeb, I.

    2014-01-01

    We shift attention from the development of model theory for demarcated languages to the development of this theory for fragments of a language. Although it is often assumed that model theory for demarcated languages is not compatible with a universalist conception of logic, no one has denied that

  14. Adaptive mastery testing using the Rasch model and Bayesian sequential decision theory

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Vos, Hendrik J.

    1998-01-01

    A version of sequential mastery testing is studied in which response behavior is modeled by an item response theory (IRT) model. First, a general theoretical framework is sketched that is based on a combination of Bayesian sequential decision theory and item response theory. A discussion follows on
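    The Rasch model named above has a one-parameter-per-item response function: the probability of a correct answer depends only on the difference between the examinee's ability θ and the item's difficulty b. A minimal sketch with illustrative values (the Bayesian sequential decision layer of the paper is not shown):

```python
# Rasch (one-parameter logistic) item response model.
import math

def rasch_p(theta, b):
    """P(correct | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty the success probability is exactly 0.5,
# and it rises monotonically as ability exceeds difficulty.
print(rasch_p(0.0, 0.0))                      # 0.5
print(rasch_p(1.5, 0.5) > rasch_p(0.5, 0.5))  # True
```

In a sequential mastery test, such response probabilities feed the posterior over the examinee's ability after each item, which the Bayesian decision rule then uses to choose between passing, failing, or administering another item.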

  15. Behavioral and Social Sciences Theories and Models: Are They Used in Unintentional Injury Prevention Research?

    Science.gov (United States)

    Trifiletti, L. B.; Gielen, A. C.; Sleet, D. A.; Hopkins, K.

    2005-01-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury…

  16. Personnel Recovery: Using Game Theory to Model Strategic Decision Making in the Contemporary Operating Environment

    Science.gov (United States)

    2005-06-17

    A thesis on using game theory to model strategic decision making in personnel recovery. As a flexible and adaptive strategic decision-making tool, game theory offers a logical way to graphically represent and compare all strategies.

  17. Behavioral and social sciences theories and models: are they used in unintentional injury prevention research?

    Science.gov (United States)

    Trifiletti, L B; Gielen, A C; Sleet, D A; Hopkins, K

    2005-06-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury prevention research. The authors conducted a systematic review of the published literature from 1980 to 2001 on behavioral and social science theory applications to unintentional injury prevention and control. Electronic database searches in PubMed and PsycINFO identified articles that combined behavioral and social sciences theories and models and injury causes. The authors identified some articles that examined behavioral and social science theories and models and unintentional injury topics, but found that several important theories have never been applied to unintentional injury prevention. Among the articles identified, the PRECEDE-PROCEED Model was cited most frequently, followed by the Theory of Reasoned Action/Theory of Planned Behavior and the Health Belief Model. When behavioral and social sciences theories and models were applied to unintentional injury topics, they were most frequently used to guide program design and implementation or to develop evaluation measures; few examples of theory testing were found. The results suggest that the use of behavioral and social sciences theories and models in unintentional injury prevention research is only marginally represented in the mainstream, peer-reviewed literature. Both the field of injury prevention and the behavioral and social sciences could benefit from greater collaborative research to enhance behavioral approaches to injury control.

  18. Emotional Connection of Military Couples after 16 Years of War: Integrating Pastoral Counseling and Evidence-Based Theory.

    Science.gov (United States)

    Cheney, Gregory J

    2017-09-01

    Sixteen years of war have created significant challenges for military couples and appear to contribute to their relational distress. Military couples seek out pastoral counselors for assistance with this distress; many of these pastoral counselors are military chaplains or pastors serving close to military bases. The integration of pastoral counseling with evidence-based theory is presented as an option for serving military couples in their relational distress, with Emotionally Focused Couple Therapy presented as an example.

  19. Designing evidence and theory-based ICT tools for weight loss maintenance: the H2020 NoHoW toolkit

    Directory of Open Access Journals (Sweden)

    Marta M Marques

    2015-11-01

    Conclusion: This presentation will provide an overview of the development process of the NoHoW toolkit (TK), focusing on its foundations, its content, and the results of pilot user-testing. It will also discuss the contribution of a systematic approach to developing ICT solutions based on theory, evidence, and user-testing to advancing the science of behavior change and to implementing sustainable solutions for WLM across Europe.

  20. A brief history of string theory from dual models to M-theory

    CERN Document Server

    Rickles, Dean

    2014-01-01

    During its forty-year lifespan, string theory has always had the power to divide, being called both a 'theory of everything' and a 'theory of nothing'. Critics have even questioned whether it qualifies as a scientific theory at all. This book adopts an objective stance, standing back from the question of the truth or falsity of string theory and instead focusing on how it came to be and how it came to occupy its present position in physics. An unexpectedly rich history is revealed, with deep connections to our most well-established physical theories. Fully self-contained and written in a lively fashion, the book will appeal to a wide variety of readers, from novice to specialist.