Compositional models and conditional independence in evidence theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim; Vejnarová, Jiřina
2011-01-01
Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords : Evidence theory * Conditional independence * multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf
A spatial Mankiw-Romer-Weil model: Theory and evidence
Fischer, Manfred M.
2009-01-01
This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory
Directory of Open Access Journals (Sweden)
Kaijuan Yuan
2016-01-01
Full Text Available Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence is highly conflicting, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
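The conflict behavior this abstract describes can be made concrete with a minimal sketch of Dempster's rule of combination. The two sensor mass functions below are hypothetical, not the paper's data; the conflict coefficient K is the mass the rule discards by normalization, and its normalization step is exactly what produces counterintuitive results when K is large.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions (dicts: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Renormalize the surviving mass by 1 - K
    return {h: v / (1.0 - conflict) for h, v in combined.items()}, conflict

F1, F2 = frozenset({"F1"}), frozenset({"F2"})
m1 = {F1: 0.9, F2: 0.1}   # hypothetical sensor 1: strong support for fault F1
m2 = {F1: 0.1, F2: 0.9}   # hypothetical sensor 2: strong support for fault F2
m, K = dempster_combine(m1, m2)
# K = 0.82: the two reports conflict heavily, and the combined masses split
# 0.5/0.5 -- the behavior that reliability-weighted averaging mitigates.
```

Reliability-weighted schemes like the paper's replace the raw reports with a weighted average mass function before combining, so an unreliable sensor contributes less to K.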
Structural reliability analysis under evidence theory using the active learning kriging model
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
Meredith, Pamela; Ownsworth, Tamara; Strong, Jenny
2008-03-01
It is now well established that pain is a multidimensional phenomenon, affected by a gamut of psychosocial and biological variables. According to diathesis-stress models of chronic pain, some individuals are more vulnerable to developing disability following acute pain because they possess particular psychosocial vulnerabilities which interact with physical pathology to impact negatively upon outcome. Attachment theory, a theory of social and personality development, has been proposed as a comprehensive developmental model of pain, implicating individual adult attachment pattern in the ontogenesis and maintenance of chronic pain. The present paper reviews and critically appraises studies which link adult attachment theory with chronic pain. Together, these papers offer support for the role of insecure attachment as a diathesis (or vulnerability) for problematic adjustment to pain. The Attachment-Diathesis Model of Chronic Pain developed from this body of literature, combines adult attachment theory with the diathesis-stress approach to chronic pain. The evidence presented in this review, and the associated model, advances our understanding of the developmental origins of chronic pain conditions, with potential application in guiding early pain intervention and prevention efforts, as well as tailoring interventions to suit specific patient needs.
Energy Technology Data Exchange (ETDEWEB)
Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
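As a toy illustration of the sampling-based strategy described above (not the authors' implementation), the sketch below propagates a one-dimensional evidence structure through a cheap stand-in model and bounds the belief and plausibility that the output exceeds a threshold. The intervals, BPAs, model, and threshold are all invented for illustration.

```python
import random

def f(x):
    return x * x  # cheap stand-in for an expensive simulation model

# Evidence structure on the input: focal elements are intervals with
# basic probability assignments (BPAs) summing to 1.
focal_elements = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
threshold = 1.0
random.seed(0)

bel = pl = 0.0
for (lo, hi), bpa in focal_elements:
    # Sampling inside the interval approximates the min and max of f there.
    ys = [f(random.uniform(lo, hi)) for _ in range(1000)]
    if min(ys) > threshold:   # whole interval maps above: certainly in event
        bel += bpa
    if max(ys) > threshold:   # part of interval maps above: possibly in event
        pl += bpa
# Here bel = 0.2 (only the (1.5, 3.0) interval certainly exceeds 1)
# and pl = 0.5 (the (0.5, 2.0) interval might exceed 1 as well).
```

Sampling only approximates the per-interval extrema, which is why the presentation's more sophisticated strategies matter when each model evaluation is expensive.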
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
Nissen, Poul
2011-01-01
In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence-based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning, which includes recognition of the influence of community, school, peers, family and the…
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Creemers, B.P.M.; Kyriakides, L.
2010-01-01
This paper refers to a dynamic perspective of educational effectiveness and improvement stressing the importance of using an evidence-based and theory-driven approach. Specifically, an approach to school improvement based on the dynamic model of educational effectiveness is offered. The recommended
From Theory to Practice: One Agency's Experience with Implementing an Evidence-Based Model.
Murray, Maureen; Culver, Tom; Farmer, Betsy; Jackson, Leslie Ann; Rixon, Brian
2014-07-01
As evidence-based practice becomes integrated into children's mental health services as a means of improving outcomes for children and youth with severe behavioral and emotional problems, therapeutic foster care (TFC), a specialized treatment program for such youth, is one of the few community-based programs considered to be evidence-based. "Together Facing the Challenge" (TFTC), which was developed as a component of a randomized trial of TFC, has been identified as an evidence-based model. We describe the experiences reported by one of the agencies that participated in our study and how it has incorporated TFTC into its ongoing practice. The agency's staff highlight key implementation strategies, challenges faced, and lessons learned as they moved toward full implementation of TFTC throughout their agency.
Directory of Open Access Journals (Sweden)
Dongxiao Niu
2012-01-01
Full Text Available Because clean energy and traditional energy have different advantages and disadvantages, it is of great significance to evaluate the comprehensive benefits of hybrid power systems. Based on a thorough analysis of the important characteristics of hybrid power systems, an index system covering security, economic benefit, environmental benefit, and social benefit is established in this paper. Owing to its advantages in processing abundant uncertain and fuzzy information, the vague set is used to determine the decision matrix. The vague decision matrix is converted to a real one by the vague combination rule, the uncertain degrees of the different indexes are determined by grey incidence analysis, and the mass functions of the different comment sets for each index are then obtained. Information is fused in accordance with the Dempster-Shafer (D-S) combination rule, and the evaluation result is obtained from the vague set and D-S evidence theory. A simulation of a hybrid power system including thermal power, wind power, and photovoltaic power in China is provided to demonstrate the effectiveness and potential of the proposed design scheme. It can be clearly seen that the uncertainties in decision making can be dramatically decreased compared with existing methods in the literature. The actual implementation results illustrate that the proposed index system and evaluation model based on the vague set and D-S evidence theory are effective and practical for evaluating the comprehensive benefit of hybrid power systems.
The work-averse cyber attacker model : theory and evidence from two million attack signatures
Allodi, L.; Massacci, F.; Williams, J.
The typical cyber attacker is assumed to be all powerful and to exploit all possible vulnerabilities. In this paper we present, and empirically validate, a novel and more realistic attacker model. The intuition of our model is that an attacker will optimally choose whether to act and weaponize a new
GARCH Option Valuation: Theory and Evidence
DEFF Research Database (Denmark)
Christoffersen, Peter; Jacobs, Kris; Ornthanalai, Chayawat
We survey the theory and empirical evidence on GARCH option valuation models. Our treatment includes the range of functional forms available for the volatility dynamic, multifactor models, nonnormal shock distributions, as well as the style of pricing kernels typically used. Various strategies for empirical implementation are laid out, and we also discuss the links between GARCH and stochastic volatility models. In the appendix we provide Matlab computer code for option pricing via Monte Carlo simulation for nonaffine models as well as Fourier inversion for affine models.
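To make the Monte Carlo side of the appendix concrete, here is a minimal pricer for a European call under a simple risk-neutralized GARCH(1,1) with normal shocks. This is an illustrative sketch, not the authors' Matlab code: the specification and every parameter value below are assumptions chosen for the example.

```python
import math
import random

def garch_mc_call(S0, K, r, T_days, omega, alpha, beta, h0, n_paths=20000):
    """Price a European call by simulating daily GARCH(1,1) returns."""
    random.seed(1)
    payoff_sum = 0.0
    for _ in range(n_paths):
        S, h = S0, h0
        for _ in range(T_days):
            z = random.gauss(0.0, 1.0)
            # Risk-neutral one-day log return: drift r - h/2 plus shock,
            # so that the discounted price is a martingale.
            S *= math.exp(r - 0.5 * h + math.sqrt(h) * z)
            h = omega + alpha * h * z * z + beta * h  # GARCH(1,1) update
        payoff_sum += max(S - K, 0.0)
    return math.exp(-r * T_days) * payoff_sum / n_paths

# Made-up parameters: 30-day at-the-money call, zero rate.
price = garch_mc_call(S0=100, K=100, r=0.0, T_days=30,
                      omega=1e-6, alpha=0.05, beta=0.9, h0=1.5e-4)
```

For affine specifications such as Heston-Nandi, the survey's Fourier-inversion route replaces simulation entirely; the simulation approach above is the fallback that works for nonaffine dynamics.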
Directory of Open Access Journals (Sweden)
Lingjie Sun
2016-08-01
Full Text Available The power transformer is one of the most critical and expensive components for the stable operation of the power system. Hence, obtaining the health condition of a transformer is of great importance for power utilities. Multi-attribute decision-making (MADM), due to its ability to solve multi-source information problems, has become a quite effective tool for evaluating the health condition of transformers. Currently, the analytic hierarchy process (AHP) and Dempster–Shafer theory are two popular methods for solving MADM problems; however, these techniques rarely address the one-sidedness of a single weighting method or the exclusiveness hypothesis of Dempster–Shafer theory. To overcome these limitations, this paper introduces a novel decision-making model, which integrates the merits of fuzzy set theory, game theory, and a modified evidence combination extended by D numbers, to evaluate the health condition of transformers. A four-level framework, which includes three factors and seventeen sub-factors, is put forward to facilitate the evaluation model. The model proceeds as follows: First, fuzzy set theory is employed to obtain the original basic probability assignments for all indices. Second, the subjective and objective weights of the indices, calculated by fuzzy AHP and entropy weighting respectively, are integrated to generate comprehensive weights based on game theory. Finally, based on the above two steps, the modified evidence combination extended by D numbers, which avoids the limitation of the exclusiveness hypothesis in the application of Dempster–Shafer theory, is used to obtain the final assessment results. Case studies are given to demonstrate the proposed modeling process. The results show the effectiveness and engineering practicability of the model in transformer condition assessment.
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module
Liu, Yuwei; Sheng, Hong; Mundorf, Norbert; Redding, Colleen; Ye, Yinjiao
2017-12-18
With increasing urbanization in China, many cities are facing serious environmental problems due to continuous and substantial increase in automobile transportation. It is becoming imperative to examine effective ways to reduce individual automobile use to facilitate sustainable transportation behavior. Empirical, theory-based research on sustainable transportation in China is limited. In this research, we propose an integrated model based on the norm activation model and the theory of planned behavior by combining normative and rational factors to predict individuals' intention to reduce car use. Data from a survey of 600 car drivers in China's three metropolitan areas was used to test the proposed model and hypotheses. Results showed that three variables, perceived norm of car-transport reduction, attitude towards reduction, and perceived behavior control over car-transport reduction, significantly affected the intention to reduce car-transport. Personal norms mediated the relationship between awareness of consequences of car-transport, ascription of responsibility of car-transport, perceived subjective norm for car-transport reduction, and intention to reduce car-transport. The results of this research not only contribute to theory development in the area of sustainable transportation behavior, but also provide a theoretical frame of reference for relevant policy-makers in urban transport management.
Computational mate choice: theory and empirical evidence.
Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo
2012-06-01
The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify which are the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules), and review the normative models of mate sampling tactics associated to them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. Since they have a robust biological basis, we think they may represent a useful theoretical tool for
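A minimal accumulator of the sequential-sampling kind the review discusses can be sketched as follows: a decision variable integrates noisy evidence about a prospective mate until it crosses an acceptance or a rejection threshold. Drift, noise, and threshold values are illustrative, not taken from any cited model.

```python
import random

def sequential_choice(drift, noise=1.0, threshold=10.0, max_steps=10000):
    """Accumulate noisy evidence until a decision boundary is crossed.

    drift > 0 biases the decision variable toward 'accept'; the noise
    term makes both the outcome and the decision time stochastic.
    """
    random.seed(2)
    dv, t = 0.0, 0
    while abs(dv) < threshold and t < max_steps:
        dv += drift + random.gauss(0.0, noise)  # one noisy evidence sample
        t += 1
    return ("accept" if dv >= threshold else "reject"), t

decision, latency = sequential_choice(drift=0.3)
```

Because the same mechanism yields both a choice and a latency, models of this class blur the review's dichotomy between absolute and comparative decision rules: the threshold plays the role of an absolute criterion, while the drift can encode a comparison among sampled mates.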
Critical evidence for the prediction error theory in associative learning.
Terao, Kanta; Matsumoto, Yukihisa; Mizunami, Makoto
2015-03-10
In associative learning in mammals, it is widely accepted that the discrepancy, or error, between actual and predicted reward determines whether learning occurs. Complete evidence for the prediction error theory, however, has not been obtained in any learning systems: Prediction error theory stems from the finding of a blocking phenomenon, but blocking can also be accounted for by other theories, such as the attentional theory. We demonstrated blocking in classical conditioning in crickets and obtained evidence to reject the attentional theory. To obtain further evidence supporting the prediction error theory and rejecting alternative theories, we constructed a neural model to match the prediction error theory, by modifying our previous model of learning in crickets, and we tested a prediction from the model: the model predicts that pharmacological intervention of octopaminergic transmission during appetitive conditioning impairs learning but not formation of reward prediction itself, and it thus predicts no learning in subsequent training. We observed such an "auto-blocking", which could be accounted for by the prediction error theory but not by other competitive theories to account for blocking. This study unambiguously demonstrates validity of the prediction error theory in associative learning.
Learned Helplessness: Theory and Evidence
Maier, Steven F.; Seligman, Martin E. P.
1976-01-01
The authors believe that three phenomena are all instances of "learned helplessness," instances in which an organism has learned that outcomes are uncontrollable by its responses and is seriously debilitated by this knowledge. This article explores the evidence for the phenomena of learned helplessness and discusses a variety of theoretical…
Directory of Open Access Journals (Sweden)
Kehinde Anthony Mogaji
2018-06-01
Full Text Available The application of a GIS-based Dempster-Shafer data-driven model, named the evidential belief function (EBF) methodology, to groundwater potential conditioning factors (GPCFs) derived from geophysical and hydrogeological data sets for assessing groundwater potentiality is presented in this study. The proposed method's efficacy in managing the degree of uncertainty in spatial predictive models motivated this research. The method's procedural approach entails, firstly, the construction of a database containing groundwater data records (bore well location inventory, hydrogeological data records, etc.) and geophysical measurement data. From the database, different factors influencing groundwater occurrence, namely aquifer layer thickness, aquifer layer resistivity, overburden material resistivity, overburden material thickness, aquifer hydraulic conductivity and aquifer transmissivity, were extracted and prepared. Further, the bore well location inventories were partitioned randomly in a ratio of 70% (19 wells) for model training and 30% (9 wells) for model testing. The synthesis of the GPCFs via the DS-EBF model algorithms produced the groundwater productivity potential index (GPPI) map, which demarcated the area into low-medium, medium, medium-high and high potential zones. The analyzed percentage degrees of uncertainty for the predicted low potential zone classes and medium/high potential zone classes are >10% and <10%, respectively. The DS theory model-based GPPI map's validation through the ROC approach established a prediction rate accuracy of 88.8%. Successively, the determined transverse resistance (TR) values, in the range of 1280 to 30,000 Ω m² for the area's geoelectrically delineated aquifer units of the predicted potential zones, quantitatively confirm the DS theory modeling prediction results through Dar-Zarrouk parameter analysis. These research results have expanded the capability of the DS-EBF model in predictive
Markowitz, Sarah M; Arent, Shawn M
2010-10-01
This study examined the relationship between exertion level and affect using the framework of opponent-process theory and the dual-mode model, with the Activation-Deactivation Adjective Checklist and the State Anxiety Inventory, among 14 active and 14 sedentary participants doing 20 min of treadmill exercise at speeds 5% below, 5% above, and at lactate threshold (LT). We found significant effects of time, condition, Time × Condition, and Time × Group, but no group, Group × Condition, or Time × Group × Condition effects. The 5% above LT condition produced a worsening of affect in-task compared with all other conditions, whereas across conditions participants experienced in-task increases in energy and tension and in-task decreases in tiredness and calmness relative to baseline. Posttask, participants experienced mood improvement (decreased tension and anxiety, and increased calmness) across conditions, with a 30-min delay in the above-LT condition. These results partially support the dual-mode model and a modified opponent-process theory.
Employee Screening : Theory and Evidence
Fali Huang; Peter Cappelli
2007-01-01
Arguably the fundamental problem faced by employers is how to elicit effort from employees. Most models suggest that employers meet this challenge by monitoring employees carefully to prevent shirking. But there is another option that relies on heterogeneity across employees, and that is to screen job candidates to find workers with a stronger work ethic who require less monitoring. This should be especially useful in work systems where monitoring by supervisors is more difficult, such as tea...
Measuring uncertainty within the theory of evidence
Salicone, Simona
2018-01-01
This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...
Some empirical evidence for ecological dissonance theory.
Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E
2000-04-01
Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions which were tested in 10 studies. This summary of the studies suggests operationally defined measures of ecological dissonance may correlate with workers' satisfaction with their jobs, involvement with their jobs, alienation from their work, and to a lesser extent, workers' conflict resolution behavior and communication style.
Local computations in Dempster-Shafer theory of evidence
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim
2012-01-01
Roč. 53, č. 8 (2012), s. 1155-1167 ISSN 0888-613X Grant - others:GA ČR(CZ) GAP403/12/2175 Program:GA Institutional support: RVO:67985556 Keywords : Discrete belief functions * Dempster-Shafer theory * conditional independence * decomposable model Subject RIV: IN - Informatics, Computer Science Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf
Holman, Gordon D.
1989-01-01
The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.
Stock portfolio selection using Dempster–Shafer evidence theory
Directory of Open Access Journals (Sweden)
Gour Sundar Mitra Thakur
2018-04-01
Full Text Available Markowitz’s return–risk model for stock portfolio selection is based on the historical return data of assets. In addition to the effect of historical returns, there are many other critical factors which directly or indirectly influence the stock market. We use the fuzzy Delphi method to identify the critical factors initially. Factors with lower correlation coefficients are then retained for further consideration. The critical factors and historical data are used with Dempster–Shafer evidence theory to rank the stocks. Then, a portfolio selection model that prefers stocks with higher rank is proposed. Illustration is done using stocks listed on the Bombay Stock Exchange (BSE). Simulation is done by Ant Colony Optimization. The performance of the outcome is found satisfactory when compared with recent performance of the assets. Keywords: Stock portfolio selection, Ranking, Dempster–Shafer evidence theory, Ant Colony Optimization, Fuzzy Delphi method
Self Modeling: Expanding the Theories of Learning
Dowrick, Peter W.
2012-01-01
Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…
Jonas Olson's Evidence for Moral Error Theory
Evers, Daan
2016-01-01
Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although
Maccia, Elizabeth S.; and others
An annotated bibliography of 20 items and a discussion of its significance was presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…
Alternative banking: theory and evidence from Europe
Directory of Open Access Journals (Sweden)
Kurt Von Mettenheim
2012-12-01
Full Text Available Since financial liberalization in the 1980s, non-profit maximizing, stakeholder-oriented banks have outperformed private banks in Europe. This article draws on empirical research, banking theory and theories of the firm to explain this apparent anomaly for neo-liberal policy and contemporary market-based banking theory. The realization of competitive advantages by alternative banks (savings banks, cooperative banks and development banks has significant implications for conceptions of bank change, regulation and political economy.
Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory
Directory of Open Access Journals (Sweden)
Yafei Song
2015-01-01
Full Text Available Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the aim of supporting rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of the probability distribution are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
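The probability-estimation step this abstract describes, turning a belief function into a probability distribution usable for decision making, can be illustrated with the classical pignistic transform from Dempster-Shafer theory. The paper's IF-based estimators are more general; this is only a baseline sketch, and the focal elements and masses below are hypothetical:

```python
def pignistic(masses):
    """Pignistic transform: BetP(x) = sum over focal sets A containing x of m(A)/|A|.

    `masses` maps frozensets (focal elements) to mass values; m(emptyset) is
    assumed to be 0 and the masses are assumed to sum to 1.
    """
    betp = {}
    for focal, m in masses.items():
        for x in focal:
            # each focal set's mass is split equally among its elements
            betp[x] = betp.get(x, 0.0) + m / len(focal)
    return betp

# hypothetical body of evidence over the frame {a, b, c}
m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
print(pignistic(m))
```

Because every focal set's mass is redistributed among its members, the result is a proper probability distribution over the frame of discernment.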
Book Review: Market Liquidity: Theory, Evidence, and Policy
DEFF Research Database (Denmark)
Boscan, Luis
2014-01-01
Review of: Market Liquidity: Theory, Evidence, and Policy / by Thierry Foucault, Marco Pagano and Ailsa Röell. Oxford University Press, April 2013.
Aligning Grammatical Theories and Language Processing Models
Lewis, Shevaun; Phillips, Colin
2015-01-01
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…
Lectures on algebraic model theory
Hart, Bradd
2001-01-01
In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integration…
Warped models in string theory
International Nuclear Information System (INIS)
Acharya, B.S.; Benini, F.; Valandro, R.
2006-12-01
Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)
Combat Risk and Pay: Theory and Some Evidence
2011-10-01
Curtis J. Simon. Institute for Defense Analyses, IDA Paper P-4774, October 2011. …(1776) theory of compensating differences, and Rosen (1986) devised what has become the standard neoclassical economic theory relating wages to the…
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Uncertainty quantification using evidence theory in multidisciplinary design optimization
International Nuclear Information System (INIS)
Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh
2004-01-01
Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research, an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically, the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research, surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems.
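The belief and plausibility measures used here are the standard ones of evidence theory: for a mass function m, Bel(A) sums the mass of focal sets contained in A, while Pl(A) sums the mass of focal sets intersecting A, bracketing the probability of A from below and above. A minimal sketch, with hypothetical focal elements standing in for design-analysis outcomes:

```python
def belief(masses, A):
    """Bel(A): total mass of focal sets B that are subsets of A."""
    return sum(m for B, m in masses.items() if B <= A)

def plausibility(masses, A):
    """Pl(A): total mass of focal sets B that intersect A."""
    return sum(m for B, m in masses.items() if B & A)

# hypothetical body of evidence over outcomes {x1, x2, x3}
m = {frozenset({"x1"}): 0.4,
     frozenset({"x1", "x2"}): 0.4,
     frozenset({"x2", "x3"}): 0.2}
A = frozenset({"x1", "x2"})
print(belief(m, A), plausibility(m, A))  # 0.8 1.0
```

The gap Pl(A) - Bel(A) reflects the epistemic (rather than aleatory) portion of the uncertainty, which is why these measures are discontinuous in the design variables and motivate the surrogate-model approach described above.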
Physics Without Causality — Theory and Evidence
Shoup, Richard
2006-10-01
The principle of cause and effect is deeply rooted in human experience, so much so that it is routinely and tacitly assumed throughout science, even by scientists working in areas where time symmetry is theoretically ingrained, as it is in both classical and quantum physics. Experiments are said to cause their results, not the other way around. In this informal paper, we argue that this assumption should be replaced with a more general notion of mutual influence — bi-directional relations or constraints on joint values of two or more variables. From an analysis based on quantum entropy, it is proposed that quantum measurement is a unitary three-interaction, with no collapse, no fundamental randomness, and no barrier to backward influence. Experimental results suggesting retrocausality are seen frequently in well-controlled laboratory experiments in parapsychology and elsewhere, especially where a random element is included. Certain common characteristics of these experiments give the appearance of contradicting well-established physical laws, thus providing an opportunity for deeper understanding and important clues that must be addressed by any explanatory theory. We discuss how retrocausal effects and other anomalous phenomena can be explained without major injury to existing physical theory. A modified quantum formalism can give new insights into the nature of quantum measurement, randomness, entanglement, causality, and time.
Children Balance Theories and Evidence in Exploration, Explanation, and Learning
Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J. P.; Friel, Daniel; Schulz, Laura
2012-01-01
We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who…
Collateral and the limits of debt capacity: theory and evidence
Giambona, E.; Mello, A.S.; Riddiough, T.
2012-01-01
This paper considers how collateral is used to finance a going concern, and demonstrates with theory and evidence that there are effective limits to debt capacity and the kinds of claims that are issued to deploy that debt capacity. The theory shows that firms with (unobservably) better quality
1982-08-01
…accomplish the task, (2) the instrumentality of task performance for job outcomes, and (3) the instrumentality of outcomes for need satisfaction. We…in this discussion: effort, performance, outcomes, and needs. In order to present briefly the conventional approach to the Vroom models, another…Presumably, this is the final event in the sequence of effort, performance, outcome, and need satisfaction. The actual research reported in expectancy…
Adaptive vs. eductive learning : Theory and evidence
Bao, T.; Duffy, J.
2014-01-01
Adaptive learning and eductive learning are two widely used ways of modeling learning behavior in macroeconomics. Both approaches yield restrictions on model parameters under which agents are able to learn a rational expectation equilibrium (REE) but these restrictions do not always overlap with one
Inequality, redistribution and growth : Theory and evidence
Haile, D.
2005-01-01
From a macro-perspective, the thesis provides a political economic model that analyses the joint determination of inequality, corruption, taxation, education and economic growth in a dynamic environment. It demonstrates how redistributive taxation is affected by the distribution of wealth and
Theory- and evidence-based Intervention
DEFF Research Database (Denmark)
Nissen, Poul
2011-01-01
of the influence of community, school, peers, family and the functional and structural domains of personality at the behavioural, phenomenological, intra-psychic and biophysical level in a dialectical process. One important aspect of the theoretical basis for presentation of this model is that the child…
Minisuperspace models in histories theory
International Nuclear Information System (INIS)
Anastopoulos, Charis; Savvidou, Ntina
2005-01-01
We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context
Faster-X evolution: Theory and evidence from Drosophila.
Charlesworth, Brian; Campos, José L; Jackson, Benjamin C
2018-02-12
A faster rate of adaptive evolution of X-linked genes compared with autosomal genes can be caused by the fixation of recessive or partially recessive advantageous mutations, due to the full expression of X-linked mutations in hemizygous males. Other processes, including recombination rate and mutation rate differences between X chromosomes and autosomes, may also cause faster evolution of X-linked genes. We review population genetics theory concerning the expected relative values of variability and rates of evolution of X-linked and autosomal DNA sequences. The theoretical predictions are compared with data from population genomic studies of several species of Drosophila. We conclude that there is evidence for adaptive faster-X evolution of several classes of functionally significant nucleotides. We also find evidence for potential differences in mutation rates between X-linked and autosomal genes, due to differences in mutational bias towards GC to AT mutations. Many aspects of the data are consistent with the male hemizygosity model, although not all possible confounding factors can be excluded. © 2018 John Wiley & Sons Ltd.
Evidence Combination From an Evolutionary Game Theory Perspective.
Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu
2016-09-01
Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
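Dempster's rule of combination, and the counter-intuitive behavior under high conflict that motivates this paper, can be sketched as follows. The example is Zadeh's classic paradox, not the paper's ECR, and the mass assignments are the usual textbook ones:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: m(A) is proportional to the sum over B ∩ C = A of
    m1(B)·m2(C), renormalised by 1 - K, where K is the total conflicting mass."""
    combined, conflict = {}, 0.0
    for (B, mb), (C, mc) in product(m1.items(), m2.items()):
        inter = B & C
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass assigned to the empty intersection
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Zadeh's highly conflicting example: the two sources almost entirely disagree
m1 = {frozenset({"a"}): 0.99, frozenset({"b"}): 0.01}
m2 = {frozenset({"c"}): 0.99, frozenset({"b"}): 0.01}
print(dempster_combine(m1, m2))  # full mass on {'b'} -- counter-intuitive
```

With 99.99% of the product mass in conflict, renormalisation assigns complete certainty to a proposition both sources considered nearly impossible; this is exactly the kind of result that alternative combination schemes such as the ECR aim to suppress.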
Constructing a New Theory from Old Ideas and New Evidence
Rhodes, Marjorie; Wellman, Henry
2013-01-01
A central tenet of constructivist models of conceptual development is that children's initial conceptual level constrains how they make sense of new evidence and thus whether exposure to evidence will prompt conceptual change. Yet little experimental evidence directly examines this claim for the case of sustained, fundamental conceptual…
Foundations of compositional model theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim
2011-01-01
Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf
Children balance theories and evidence in exploration, explanation, and learning
Bonawitz, E.B.; van Schijndel, T.J.P.; Friel, D.; Schulz, L.
2012-01-01
We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children,
A theory of evidence for undeclared nuclear activities
International Nuclear Information System (INIS)
King, J.L.
1995-01-01
The IAEA has recently explored techniques to augment and improve its existing safeguards information systems as part of Program 93 + 2 in order to address the detection of undeclared activities. Effective utilization of information on undeclared activities requires a formulation of the relationship between the information being gathered and the resulting safeguards assurance. The process of safeguards is represented as the gathering of evidence to provide assurance that no undeclared activities take place. It is shown that the analysis of this process can be represented by a theory grounded in the Dempster-Shafer theory of evidence and the concept of possibility. This paper presents the underlying evidence theory required to support a new information system tool for the analysis of information with respect to undeclared activities. The Dempster-Shafer theory serves as the calculus for the combination of diverse sources of evidence, and when applied to safeguards information, provides a basis for interpreting the result of safeguards indicators and measurements -- safeguards assurance
TIM Series: Theory, Evidence and the Pragmatic Manager
Directory of Open Access Journals (Sweden)
Steven Muegge
2008-08-01
Full Text Available On July 2, 2008, Steven Muegge from Carleton University delivered a presentation entitled "Theory, Evidence and the Pragmatic Manager". This section provides the key messages from the lecture. The scope of this lecture spanned several topics, including management decision making, forecasting and its limitations, the psychology of expertise, and the management of innovation.
Five roles for using theory and evidence in the design and testing of behavior change interventions.
Bartholomew, L Kay; Mullen, Patricia Dolan
2011-01-01
The prevailing wisdom in the field of health-related behavior change is that well-designed and effective interventions are guided by theory. Using the framework of intervention mapping, we describe and provide examples of how investigators can effectively select and use theory to design, test, and report interventions. We propose five roles for theory and evidence about theories: a) identification of behavior and determinants of behavior related to a specified health problem (i.e., the logic model of the problem); b) explication of a causal model that includes theoretical constructs for producing change in the behavior of interest (i.e., the logic model of change); c) selection of intervention methods and delivery of practical applications to achieve changes in health behavior; d) evaluation of the resulting intervention including theoretical mediating variables; and e) reporting of the active ingredients of the intervention together with the evaluation results. In problem-driven applied behavioral or social science, researchers use one or multiple theories, empiric evidence, and new research, both to assess a problem and to solve or prevent a problem. Furthermore, the theories for description of the problem may differ from the theories for its solution. In an applied approach, the main focus is on solving problems regarding health behavior change and improvement of health outcomes, and the criteria for success are formulated in terms of the problem rather than the theory. Resulting contributions to theory development may be quite useful, but they are peripheral to the problem-solving process.
Superfield theory and supermatrix model
International Nuclear Information System (INIS)
Park, Jeong-Hyuck
2003-01-01
We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, consistent with T-duality. (author)
Models in cooperative game theory
Branzei, Rodica; Tijs, Stef
2008-01-01
This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinite many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is highly enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
Field theory and the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Dudas, E [Orsay, LPT (France)
2014-07-01
This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.
Lattice models and conformal field theories
International Nuclear Information System (INIS)
Saleur, H.
1988-01-01
Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated to a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.
Dark energy observational evidence and theoretical models
Novosyadlyj, B; Shtanov, Yu; Zhuk, A
2013-01-01
The book elucidates the current state of the dark energy problem and presents the results of the authors, who work in this area. It describes the observational evidence for the existence of dark energy, the methods and results of constraining its parameters, the modeling of dark energy by scalar fields, space-times with extra spatial dimensions, especially Kaluza-Klein models, and braneworld models with a single extra dimension, as well as the problems of positive definiteness of gravitational energy in General Relativity, energy conditions and the consequences of their violation in the presence of dark energy. This monograph is intended for science professionals, educators and graduate students specializing in general relativity, cosmology, field theory and particle physics.
Halo modelling in chameleon theories
Energy Technology Data Exchange (ETDEWEB)
Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)
2014-03-01
We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
Halo modelling in chameleon theories
International Nuclear Information System (INIS)
Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu
2014-01-01
We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations
Stochastic models: theory and simulation.
Energy Technology Data Exchange (ETDEWEB)
Field, Richard V., Jr.
2008-03-01
Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
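As a toy instance of the report's theme of generating independent samples of a stochastic model for Monte Carlo use, here is a minimal sketch. A discrete Gaussian random walk stands in for the far more general processes and random fields the report covers:

```python
import random

def gaussian_random_walk(n_steps, sigma=1.0, seed=None):
    """Generate one sample path of a discrete Gaussian random walk:
    X_0 = 0, X_{k+1} = X_k + N(0, sigma^2). Seeding makes runs reproducible,
    which matters when samples feed deterministic simulation codes."""
    rng = random.Random(seed)
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(0.0, sigma))
    return path

# three independent sample paths, as inputs for a downstream Monte Carlo study
paths = [gaussian_random_walk(100, seed=k) for k in range(3)]
```

Each path is one realization of the process; a Monte Carlo study would generate many such realizations and push each through the deterministic simulation code of interest.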
Quiver gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Yagi, Junya
2015-01-01
We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
Qigong in Cancer Care: Theory, Evidence-Base, and Practice
Directory of Open Access Journals (Sweden)
Penelope Klein
2017-01-01
Full Text Available Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to wide-spread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle, mind/body exercise integral within Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL, fatigue, immune function and cortisol levels, and cognition for individuals with cancer. There is indirect, scientific evidence suggesting that qigong practice may positively influence cancer prevention and survival. No one Qigong exercise regimen has been established as superior. Effective protocols do have common elements: slow mindful exercise, easy to learn, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Wide-spread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and work force capacity.
Event-related potential evidence for the processing efficiency theory.
Murray, N P; Janelle, C M
2007-01-15
The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.
A Model for Evidence Accumulation in the Lexical Decision Task
Wagenmakers, Eric-Jan; Steyvers, Mark; Raaijmakers, Jeroen G. W.; Shiffrin, Richard M.; van Rijn, Hedderik; Zeelenberg, Rene
2004-01-01
We present a new model for lexical decision, REM-LD, that is based on REM theory (e.g., Shiffrin & Steyvers, 1997). REM-LD uses a principled (i.e., Bayes' rule) decision process that simultaneously considers the diagnosticity of the evidence for the 'WORD' response and the 'NONWORD' response. The model calculates the odds ratio that the presented…
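The Bayes'-rule odds-ratio decision described in this abstract can be sketched in a few lines. This is a schematic illustration only: the likelihood values are hypothetical, and REM-LD itself derives them from memory-trace matching rather than taking them as given inputs.

```python
# Schematic Bayes'-rule decision: posterior odds of 'WORD' vs 'NONWORD'
# from independent evidence features. Likelihoods are made-up values
# for illustration, not REM-LD's actual memory-based likelihoods.

def odds_word(likelihoods_word, likelihoods_nonword, prior_word=0.5):
    """Posterior odds P(WORD | evidence) / P(NONWORD | evidence)."""
    odds = prior_word / (1.0 - prior_word)
    for lw, ln in zip(likelihoods_word, likelihoods_nonword):
        odds *= lw / ln  # each feature multiplies the odds by its diagnosticity
    return odds

odds = odds_word([0.8, 0.7, 0.9], [0.4, 0.5, 0.6])  # > 1, so respond 'WORD'
```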
New Pathways between Group Theory and Model Theory
Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz
2017-01-01
This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...
Galaxy Alignments: Theory, Modelling & Simulations
Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais
2015-11-01
The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu
2018-05-15
Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account. Copyright © 2018 Elsevier Inc. All rights reserved.
Bureaucratic Minimal Squawk Behavior: Theory and Evidence from Regulatory Agencies
Clare Leaver
2009-01-01
This paper argues that bureaucrats are susceptible to `minimal squawk` behavior. I develop a simple model in which a desire to avoid criticism can prompt otherwise public-spirited bureaucrats to behave inefficiently. Decisions are taken to keep interest groups quiet and mistakes out of the public eye. The policy implications of this behavior are at odds with the received view that agencies should be structured to minimise the threat of `capture`. I test between theories of bureaucratic beha...
Export Growth and Factor Market Competition: Theory and Some Evidence
J. Emami Namini (Julian); G. Facchini (Giovanni); R.A. Lopez (Ricardo)
2011-01-01
Empirical evidence suggests that sectoral export growth decreases exporters' survival probability, whereas this is not true for non-exporters. Models with firm heterogeneity in total factor productivity (TFP) predict the opposite. To solve this puzzle, we develop a two-factor framework
Evidence accumulation as a model for lexical selection.
Anders, R; Riès, S; van Maanen, L; Alario, F X
2015-11-01
We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports), that are largely resultant of an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related or combined with conventional psycholinguistic theory and their simulatory instantiations (generally, neural network models). Then with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory, and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
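The evidence-accumulation paradigm discussed here can be illustrated with a minimal independent-accumulator race, in which each lexical candidate accumulates noisy evidence toward a threshold. The drift rates, threshold, and noise level below are hypothetical stand-ins for lexical activations, not parameters from the paper.

```python
import random

def race(drifts, threshold=1.0, dt=0.01, noise=0.1, seed=0):
    """Race among independent noisy accumulators: the first accumulator
    to reach the threshold determines the selected candidate."""
    rng = random.Random(seed)
    x = [0.0] * len(drifts)
    t = 0.0
    while True:
        t += dt
        for i, drift in enumerate(drifts):
            # Euler step of a diffusion: deterministic drift + Gaussian noise
            x[i] += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5
            if x[i] >= threshold:
                return i, t  # winning candidate and its selection time

winner, rt = race([1.2, 0.6, 0.4])  # candidate 0 has the strongest support
```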
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available, a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Targeting the Real Exchange Rate; Theory and Evidence
Carlos A. Végh Gramont; Guillermo Calvo; Carmen Reinhart
1994-01-01
This paper presents a theoretical and empirical analysis of policies aimed at setting a more depreciated level of the real exchange rate. An intertemporal optimizing model suggests that, in the absence of changes in fiscal policy, a more depreciated level of the real exchange can only be attained temporarily. This can be achieved by means of higher inflation and/or higher real interest rates, depending on the degree of capital mobility. Evidence for Brazil, Chile, and Colombia supports the mo...
Crisis in Context Theory: An Ecological Model
Myer, Rick A.; Moore, Holly B.
2006-01-01
This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…
Constraint theory multidimensional mathematical model management
Friedman, George J
2017-01-01
Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...
Staircase Models from Affine Toda Field Theory
Dorey, P; Dorey, Patrick; Ravanini, Francesco
1993-01-01
We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.
Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions
International Nuclear Information System (INIS)
Shah, Harsheel; Hosder, Serhat; Winter, Tyler
2015-01-01
The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
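The Dempster–Shafer machinery underlying DSTE can be illustrated with the basic rule of combination. The frame, focal elements, and mass values below are a toy example, not taken from the paper:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments (masses over frozenset focal elements)."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Normalize by 1 - K, redistributing the conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Toy frame {A, B}: two sources with partially conflicting evidence
m1 = {frozenset('A'): 0.6, frozenset('AB'): 0.4}
m2 = {frozenset('B'): 0.3, frozenset('AB'): 0.7}
combined = dempster_combine(m1, m2)
```

The normalization step is exactly where highly conflicting evidence causes counterintuitive results, which motivates the reliability-weighted averaging approaches discussed elsewhere in this collection.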
Short-run Exchange-Rate Dynamics: Theory and Evidence
DEFF Research Database (Denmark)
Carlson, John A.; Dahl, Christian Møller; Osler, Carol L.
Recent research has revealed a wealth of information about the microeconomics of currency markets and thus the determination of exchange rates at short horizons. This information is valuable to us as scientists since, like evidence of macroeconomic regularities, it can provide critical guidance … of currency markets, it accurately reflects the constraints and objectives faced by the major participants, and it fits key stylized facts concerning returns and order flow. With respect to macroeconomics, the model is consistent with most of the major puzzles that have emerged under floating rates.
Reconstructing bidimensional scalar field theory models
International Nuclear Information System (INIS)
Flores, Gabriel H.; Svaiter, N.F.
2001-07-01
In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials, the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)
Motives and chances of firm diversification: theory and empirical evidence
International Nuclear Information System (INIS)
Briglauer, W.
2001-11-01
It is beyond controversy that the majority of the largest companies in the industrialized countries pursue, to a certain extent, product diversification strategies. Building on this finding, the present work first deals with alternative theoretical and empirical definitions of corporate diversification. The theoretical part then elaborates an industrial-economics framework for categorizing motives of firm diversification. Despite some inevitable degree of arbitrariness, a relatively comprehensive and sufficient categorization can be presented, and with regard to the relevant economic literature most explanations of product diversification can be classified appropriately. Observing diversification activities, one would prima facie infer a positive relationship between product diversification and firm performance, but both theory and empirical evidence yield ambiguous results. The empirical part provides a list of existing studies, classified according to the theoretical categorization. In an overview, some stylised facts are filtered and discussed consecutively. Most notably, it was found that related diversification strategies significantly outperform strategies of unrelated diversification. At the end of the empirical section, econometric methods are applied to agricultural and industrial-economics (telecommunications-market) data sets. For the agricultural studies a significantly positive relationship between product diversification and firm performance was found. In contrast, no significant results were obtained for the telecommunications markets. (author)
Advances in cognitive theory and therapy: the generic cognitive model.
Beck, Aaron T; Haigh, Emily A P
2014-01-01
For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.
A course on basic model theory
Sarbadhikari, Haimanti
2017-01-01
This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute the first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.
Gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Witten, E.
1989-01-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)
Cluster model in reaction theory
International Nuclear Information System (INIS)
Adhikari, S.K.
1979-01-01
A recent work by Rosenberg on cluster states in reaction theory is reexamined and generalized to include energies above the threshold for breakup into four composite fragments. The problem of elastic scattering between two interacting composite fragments is reduced to an equivalent two-particle problem with an effective potential to be determined by extremum principles. For energies above the threshold for breakup into three or four composite fragments, effective few-particle potentials are introduced and the problem is reduced to effective three- and four-particle problems. The equivalent three-particle equation contains effective two- and three-particle potentials. The effective potential in the equivalent four-particle equation has two-, three-, and four-body connected parts and a piece which has two independent two-body connected parts. In the equivalent three-particle problem we show how to include the effect of a weak three-body potential perturbatively. In the equivalent four-body problem an approximate simple calculational scheme is given when one neglects the four-particle potential, the effect of which is presumably very small
Economic Modelling in Institutional Economic Theory
Directory of Open Access Journals (Sweden)
Wadim Strielkowski
2017-06-01
Full Text Available Our paper is centered around the formation of theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of this institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the basics of the institutional economics. We propose scientific ideas concerning the new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurements of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalizing our results and maximising the practical outcome of our paper. Our results and findings might be useful for the researchers and stakeholders searching for the systematic and comprehensive description of institutional level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.
Randomized Item Response Theory Models
Fox, Gerardus J.A.
2005-01-01
The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
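The classical Warner design behind the RR technique admits a one-line unbiased estimator, which the sketch below illustrates. The design probability p is illustrative, and the sketch assumes the standard Warner model, not the latent-variable item response extension developed in the paper.

```python
def rr_estimate(yes_fraction, p=0.7):
    """Warner randomized response: with probability p the respondent
    answers the sensitive question, otherwise its complement, so
    P(yes) = p*pi + (1 - p)*(1 - pi). Invert to recover pi."""
    if p == 0.5:
        raise ValueError("p = 0.5 carries no information about pi")
    return (yes_fraction - (1 - p)) / (2 * p - 1)

# If the true proportion is 0.2 and p = 0.7, the expected 'yes'
# fraction is 0.7*0.2 + 0.3*0.8 = 0.38; the estimator recovers
# approximately 0.2 from it:
pi_hat = rr_estimate(0.38)
```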
Graphical Model Theory for Wireless Sensor Networks
International Nuclear Information System (INIS)
Davis, William B.
2002-01-01
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
Topological quantum theories and integrable models
International Nuclear Information System (INIS)
Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.
1991-01-01
The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit
Numerical evidence of chiral magnetic effect in lattice gauge theory
International Nuclear Information System (INIS)
Buividovich, P. V.; Chernodub, M. N.; Luschevskaya, E. V.; Polikarpov, M. I.
2009-01-01
The chiral magnetic effect is the generation of electric current of quarks along an external magnetic field in the background of topologically nontrivial gluon fields. There is recent evidence that this effect is observed by the STAR Collaboration in heavy-ion collisions at the Relativistic Heavy Ion Collider. In our paper we study qualitative signatures of the chiral magnetic effect using quenched lattice simulations. We find indications that the electric current is indeed enhanced in the direction of the magnetic field both in equilibrium configurations of the quantum gluon fields and in a smooth gluon background with nonzero topological charge. In the confinement phase the magnetic field enhances the local fluctuations of both the electric charge and chiral charge densities. In the deconfinement phase the effects of the magnetic field become smaller, possibly due to thermal screening. Using a simple model of a fireball we obtain a good agreement between our data and experimental results of STAR Collaboration.
Security Theorems via Model Theory
Directory of Open Access Journals (Sweden)
Joshua Guttman
2009-11-01
Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications, for all xs. (phi implies for some ys. psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces for all xs. (phi implies for some ys. psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
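For a concrete instance of a vacation model, the well-known decomposition result for the M/M/1 queue with multiple server vacations expresses the mean wait as the ordinary M/M/1 wait plus the mean residual vacation time. The sketch below assumes that standard multiple-vacation model; the parameter values are illustrative:

```python
def mm1_vacation_wait(lam, mu, ev, ev2):
    """Mean queueing delay in an M/M/1 queue with multiple vacations,
    via the classical decomposition result:
        W = rho / (mu - lam) + E[V^2] / (2 E[V])
    where the second term is the mean residual vacation time."""
    rho = lam / mu
    if rho >= 1.0:
        raise ValueError("unstable queue: require lam < mu")
    w_mm1 = rho / (mu - lam)          # ordinary M/M/1 waiting time
    return w_mm1 + ev2 / (2.0 * ev)   # + residual vacation penalty

# Arrival rate 1, service rate 2, deterministic vacations of length 1:
# 0.5 from the M/M/1 wait plus 0.5 of residual vacation time.
w = mm1_vacation_wait(1.0, 2.0, ev=1.0, ev2=1.0)
```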
Quantum field theory and the standard model
Schwartz, Matthew D
2014-01-01
Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...
International Nuclear Information System (INIS)
Schlingemann, D.
1996-10-01
Several two dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate between different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon model and the (φ⁴)₂-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which sufficient conditions must a pair of vacuum states fulfill such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory which includes, for example, the P(φ)₂-models. We identify a large class of vacuum states, including the vacua of the P(φ)₂-models, the Yukawa₂-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)
J. Brug (Hans); A. Oenema (Anke); A. Ferreira (Isabel)
2005-01-01
BACKGROUND: The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. DISCUSSION: Since behavior theory is a reflection of the compiled evidence of behavior research, theory is
Civil-Military Relations and Strategy: Theory and Evidence
National Research Council Canada - National Science Library
Kimminau, Jon
2001-01-01
... between civilian and military strategy. There are a number of propositions about such differences that lie at the heart of theories of state and group behavior at international and domestic levels...
Introducing Evidence Through Research "Push": Using Theory and Qualitative Methods.
Morden, Andrew; Ong, Bie Nio; Brooks, Lauren; Jinks, Clare; Porcheret, Mark; Edwards, John J; Dziedzic, Krysia S
2015-11-01
A multitude of factors can influence the uptake and implementation of complex interventions in health care. A plethora of theories and frameworks recognize the need to establish relationships, understand organizational dynamics, address context and contingency, and engage key decision makers. Less attention is paid to how theories that emphasize relational contexts can actually be deployed to guide the implementation of an intervention. The purpose of the article is to demonstrate the potential role of qualitative research aligned with theory to inform complex interventions. We detail a study underpinned by theory and qualitative research that (a) ensured key actors made sense of the complex intervention at the earliest stage of adoption and (b) aided initial engagement with the intervention. We conclude that using theoretical approaches aligned with qualitative research can provide insights into the context and dynamics of health care settings that in turn can be used to aid intervention implementation. © The Author(s) 2015.
Key Elasticities in Job Search Theory : International Evidence
Addison, John T.; Centeno, Mário; Portugal, Pedro
2004-01-01
This paper exploits the informational value of search theory, after Lancaster and Chesher (1983), in conjunction with survey data on the unemployed to calculate key reservation wage and duration elasticities for most EU-15 nations.
Post, Brady; Buchmueller, Tom; Ryan, Andrew M
2017-08-01
Hospital-physician vertical integration is on the rise. While increased efficiencies may be possible, emerging research raises concerns about anticompetitive behavior, spending increases, and uncertain effects on quality. In this review, we bring together several of the key theories of vertical integration that exist in the neoclassical and institutional economics literatures and apply these theories to the hospital-physician relationship. We also conduct a literature review of the effects of vertical integration on prices, spending, and quality in the growing body of evidence (n = 15) to evaluate which of these frameworks has the strongest empirical support. We find some support for vertical foreclosure as a framework for explaining the observed results. We suggest a conceptual model and identify directions for future research. Based on our analysis, we conclude that vertical integration poses a threat to the affordability of health services and merits special attention from policymakers and antitrust authorities.
Organizational dimensions of relationship-centered care. Theory, evidence, and practice.
Safran, Dana Gelb; Miller, William; Beckman, Howard
2006-01-01
Four domains of relationship have been highlighted as the cornerstones of relationship-centered health care. Of these, clinician-patient relationships have been most thoroughly studied, with a rich empirical literature illuminating significant linkages between clinician-patient relationship quality and a wide range of outcomes. This paper explores the realm of clinician-colleague relationships, which we define to include the full array of relationships among clinicians, staff, and administrators in health care organizations. Building on a stream of relevant theories and empirical literature that have emerged over the past decade, we synthesize available evidence on the role of organizational culture and relationships in shaping outcomes, and posit a model of relationship-centered organizations. We conclude that turning attention to relationship-centered theory and practice in health care holds promise for advancing care to a new level, with breakthroughs in quality of care, quality of life for those who provide it, and organizational performance.
Introduction to zeolite theory and modelling
Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.
2001-01-01
A review. Some of the recent advances in zeolite theory and modeling are presented. In particular, the current status of computational chemistry in Bronsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the
Prospect Theory in the Heterogeneous Agent Model
Czech Academy of Sciences Publication Activity Database
Polach, J.; Kukačka, Jiří
(2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords : Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf
Recursive renormalization group theory based subgrid modeling
Zhou, YE
1991-01-01
Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Diagrammatic group theory in quark models
International Nuclear Information System (INIS)
Canning, G.P.
1977-05-01
A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.) [de
Spahn, Joanne M; Reeves, Rebecca S; Keim, Kathryn S; Laquatra, Ida; Kellogg, Molly; Jortberg, Bonnie; Clark, Nicole A
2010-06-01
Behavior change theories and models, validated within the field of dietetics, offer systematic explanations for nutrition-related behavior change. They are integral to the nutrition care process, guiding nutrition assessment, intervention, and outcome evaluation. The American Dietetic Association Evidence Analysis Library Nutrition Counseling Workgroup conducted a systematic review of peer-reviewed literature related to behavior change theories and strategies used in nutrition counseling. Two hundred fourteen articles were reviewed between July 2007 and March 2008, and 87 studies met the inclusion criteria. The workgroup systematically evaluated these articles and formulated conclusion statements and grades based upon the available evidence. Strong evidence exists to support the use of a combination of behavioral theory and cognitive behavioral theory, the foundation for cognitive behavioral therapy (CBT), in facilitating modification of targeted dietary habits, weight, and cardiovascular and diabetes risk factors. Evidence is particularly strong in patients with type 2 diabetes receiving intensive, intermediate-duration (6 to 12 months) CBT, and long-term (>12 months duration) CBT targeting prevention or delay in onset of type 2 diabetes and hypertension. Few studies have assessed the application of the transtheoretical model on nutrition-related behavior change. Little research was available documenting the effectiveness of nutrition counseling utilizing social cognitive theory. Motivational interviewing was shown to be a highly effective counseling strategy, particularly when combined with CBT. Strong evidence substantiates the effectiveness of self-monitoring and meal replacements and/or structured meal plans. Compelling evidence exists to demonstrate that financial reward strategies are not effective. Goal setting, problem solving, and social support are effective strategies, but additional research is needed in more diverse populations. Routine documentation
Brug, Hans; Oenema, Anke; Ferreira, Isabel
2005-01-01
Abstract Background The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. Discussion Since behavior theory is a reflection of the compiled evidence of behavior research, theory is the only foothold we have for the development of behavioral nutrition and physical activity interventions. Application of theory should improve the effectiveness of interventions. However, some of the the...
Load theory behind the wheel: an experimental application of a cognitive model to simulated driving
Murphy, Gillian
2017-01-01
Load Theory is a prominent model of selective attention first proposed over twenty years ago. Load Theory is supported by a great many experimental and neuroimaging studies. There is, however, little evidence that Load Theory can be applied to real-world attention, though it has great practical potential. Driving, as an everyday task where failures of attention can have profound consequences, stands to benefit from the understanding of selective attention that Load Theory provides. The aim of ...
Dissecting Practical Intelligence Theory: Its Claims and Evidence.
Gottfredson, Linda S.
2003-01-01
The two key theoretical propositions of "Practical Intelligence in Everyday Life" are made plausible only if one ignores considerable evidence contradicting them. The six key empirical claims rest primarily on the illusion of evidence enhanced by selective reporting of results. (SLD)
Semantic Modelling of Digital Forensic Evidence
Kahvedžić, Damir; Kechadi, Tahar
The reporting of digital investigation results is traditionally carried out in prose, and a large investigation may require successive communication of findings between different parties. Popular forensic suites aid in the reporting process by storing provenance and positional data but do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application independent and automatically organised. The encoded descriptions aim to help the investigation in the task of report writing and evidence communication and can be used in addition to existing evidence management techniques.
Evidence for an expectancy-based theory of avoidance behaviour.
Declercq, Mieke; De Houwer, Jan; Baeyens, Frank
2008-01-01
In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli.
A dynamical theory for the Rishon model
International Nuclear Information System (INIS)
Harari, H.; Seiberg, N.
1980-09-01
We propose a composite model for quarks and leptons based on an exact SU(3)_C x SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C x SU(2)_L x SU(2)_R x U(1)_(B-L) gauge theory emerges at the composite level. The theory is ''natural'', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several ''technicolor'' mechanisms are automatically present. (Author)
Polyacetylene and relativistic field-theory models
International Nuclear Information System (INIS)
Bishop, A.R.; Campbell, D.K.; Fesser, K.
1981-01-01
Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the φ⁴ model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed
van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie
2017-09-01
This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
Condition Evaluation of Storage Equipment Based on Improved D-S Evidence Theory
Directory of Open Access Journals (Sweden)
Zhang Xiao-yu
2017-01-01
Full Text Available Assessment and prediction of storage equipment’s condition is always a difficult aspect of PHM technology. Current condition evaluation of equipment lacks defined state grades, and a single test value cannot reflect changes in the equipment’s state. To solve this problem, this paper proposes an evaluation method based on improved D-S evidence theory. Firstly, the analytic hierarchy process (AHP) is used to establish a hierarchical structure model of the equipment and to divide the qualified state into 4 grades. Then the test data are compared with the last test value, the historical test mean value, and the standard value; a triangular fuzzy membership function is used to calculate the index membership degree, and D-S evidence theory fuses the information from multiple sources to achieve real-time state assessment of such equipment. Finally, the model is applied to a servo mechanism. The result shows that this method has a good performance in condition evaluation for the storage equipment
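The multi-source fusion step in this record, like the sensor-fusion paper in this collection, rests on Dempster's rule of combination. A minimal sketch in Python, where the mass assignments over two fault hypotheses F1 and F2 are hypothetical illustration values, not data from either paper:

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozenset
    hypotheses to mass) with Dempster's rule, normalizing out conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # non-empty intersection: mass supports it
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # empty intersection: conflicting mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}

# Hypothetical reports from two sensors over faults F1 and F2
m1 = {frozenset({"F1"}): 0.8, frozenset({"F1", "F2"}): 0.2}
m2 = {frozenset({"F1"}): 0.6, frozenset({"F2"}): 0.1,
      frozenset({"F1", "F2"}): 0.3}
fused = dempster_combine(m1, m2)  # normalized fused masses sum to 1
```

The weighted-averaging refinements the abstracts describe modify the input masses before this combination step; the rule itself is unchanged.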
Taxation of petroleum products: theory and empirical evidence
International Nuclear Information System (INIS)
Gupta, S.; Mahler, W.
1995-01-01
The domestic taxation of petroleum products is an important source of revenue in most countries. However, there is a wide variation of tax rates on petroleum products across countries, which cannot be explained by economic theory alone. This paper surveys different considerations advanced for taxing petroleum and presents petroleum tax rate data in 120 countries. (author)
On the Role of Theory and Evidence in Macroeconomics
DEFF Research Database (Denmark)
Juselius, Katarina
This paper, which is prepared for the Inaugural Conference of the Institute for New Economic Thinking in King's College, Cambridge, 8-11 April 2010, questions the preeminence of theory over empirics in economics and argues that empirical econometrics needs to be given a more important...
Working memory: theories, models, and controversies.
Baddeley, Alan
2012-01-01
I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.
Effective field theory and the quark model
International Nuclear Information System (INIS)
Durand, Loyal; Ha, Phuoc; Jaczko, Gregory
2001-01-01
We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections
Assessing landslide susceptibility by applying fuzzy sets, possibility evidence-based theories
Directory of Open Access Journals (Sweden)
Ibsen Chivatá Cárdenas
2008-01-01
Full Text Available A landslide susceptibility model was developed for the city of Manizales, Colombia; landslides have been the city’s main environmental problem. Fuzzy sets and possibility and evidence-based theories were used to construct the model due to the set of circumstances and uncertainty involved in the modelling; uncertainty particularly concerned the lack of representative data and the need for systematically coordinating subjective information. Susceptibility and the uncertainty were estimated via data processing; the model contained data concerning mass vulnerability and uncertainty. Output data was expressed on a map defined by linguistic categories or uncertain labels as having low, medium, high and very high susceptibility; this was considered appropriate for representing susceptibility. A fuzzy spectrum was developed for classifying susceptibility levels according to perception and expert opinion. The model showed levels of susceptibility in the study area, ranging from low to high susceptibility (medium susceptibility being more frequent). This article shows the details concerning systematic data processing by presenting theories and tools regarding uncertainty. The concept of fuzzy parameters is introduced; this is useful in modelling phenomena regarding uncertainty, complexity and nonlinear performance, showing that susceptibility modelling can be feasible. The paper also shows the great convenience of incorporating uncertainty into modelling and decision-making. However, quantifying susceptibility is not suitable when modelling identified uncertainty because incorporating model output information cannot be reduced into exact or real numerical quantities when the nature of the variables is particularly uncertain. The latter concept is applicable to risk assessment.
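The linguistic-category classification this record describes can be illustrated with triangular membership functions. The category breakpoints and the susceptibility score below are hypothetical, not values from the Manizales study:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic categories on a normalized susceptibility score in [0, 1]
categories = {
    "low":       (-0.1, 0.1, 0.4),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.6, 0.8, 0.95),
    "very high": (0.85, 1.05, 1.2),
}

score = 0.55  # hypothetical model output for one map cell
degrees = {label: tri(score, *abc) for label, abc in categories.items()}
best = max(degrees, key=degrees.get)  # linguistic label with highest membership
```

Overlapping supports let a cell belong partly to two labels at once, which is how this kind of model carries uncertainty through to the map instead of forcing an exact numerical susceptibility.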
Directory of Open Access Journals (Sweden)
Carol A. Gordon
2009-06-01
Full Text Available Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice. Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of
Topos models for physics and topos theory
International Nuclear Information System (INIS)
Wolters, Sander
2014-01-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos
Prospects for advanced RF theory and modeling
International Nuclear Information System (INIS)
Batchelor, D. B.
1999-01-01
This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics
A Membrane Model from Implicit Elasticity Theory
Freed, A. D.; Liao, J.; Einstein, D. R.
2014-01-01
A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
Strategic behavior and marriage payments: theory and evidence from Senegal.
Gaspart, Frederic; Platteau, Jean-Philippe
2010-01-01
This article proposes an original theory of marriage payments based on insights gained from firsthand information collected in the Senegal River valley. This theory postulates that decisions about the bride-price, which are made by the bride's father, take into account the likely effects of the amount set on the risk of ill-treatment of the wife and the risk of marriage failure. Based on a sequential game with three players (the bride's father, the husband, and the wife) and a matching process, it leads to a number of important predictions that are tested against Senegalese data relating to bride-prices and various characteristics of women. The empirical results confirm that parents behave strategically by keeping bride-prices down so as to reduce the risk of marriage failure for their daughters. Other interesting effects on marriage payments and the probability of separation are also highlighted, stressing the role of the bride's bargaining power in her own family.
Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne
2010-04-08
Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour
Directory of Open Access Journals (Sweden)
Maclennan Graeme
2010-04-01
Full Text Available Abstract Background Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Methods Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Results Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. Summary The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for
Attribution models and the Cooperative Game Theory
Cano Berlanga, Sebastian; Vilella, Cori
2017-01-01
The current paper studies the attribution model used by Google Analytics. Specifically, we use cooperative game theory to propose a fair distribution of the revenues among the considered channels, in order to facilitate cooperation and to guarantee stability. We define a transferable-utility convex cooperative game from the observed frequencies and we use the Shapley value to allocate the revenues among the different channels. Furthermore, we evaluate the impact of an advertising...
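The Shapley value mentioned in the abstract can be computed directly from coalition worths. The sketch below is illustrative only: the channel names and revenue figures are invented, not taken from the paper. It averages each channel's marginal contribution over all orderings in which channels can join a coalition.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, v):
    """Average each player's marginal contribution v(S | {i}) - v(S)
    over all n! orderings of the players."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            joined = coalition | {p}
            phi[p] += v[joined] - v[coalition]
            coalition = joined
    n_fact = factorial(len(players))
    return {p: total / n_fact for p, total in phi.items()}

# Invented coalition revenues: v[S] = revenue credited when exactly the
# channels in S appear on converting paths (this example game is convex).
channels = ("search", "display", "email")
v = {frozenset(): 0.0,
     frozenset({"search"}): 60.0,
     frozenset({"display"}): 20.0,
     frozenset({"email"}): 10.0,
     frozenset({"search", "display"}): 90.0,
     frozenset({"search", "email"}): 75.0,
     frozenset({"display", "email"}): 35.0,
     frozenset(channels): 110.0}

print(shapley_values(channels, v))
```

By the efficiency property, the allocations sum to the grand-coalition revenue (110 here), and for a convex game the Shapley allocation lies in the core, which is the stability the abstract refers to.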
MODELS AND THE DYNAMICS OF THEORIES
Directory of Open Access Journals (Sweden)
Paulo Abrantes
2007-12-01
Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the roles that models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.
Conceptual Models and Theory-Embedded Principles on Effective Schooling.
Scheerens, Jaap
1997-01-01
Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…
Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic
DEFF Research Database (Denmark)
Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe
2008-01-01
Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results...
Global Sourcing of Heterogeneous Firms: Theory and Evidence
DEFF Research Database (Denmark)
Kohler, Wilhelm; Smolka, Marcel
2015-01-01
The share of international trade within firm boundaries varies greatly across countries. This column presents new evidence on how the productivity of a firm affects the choice between vertical integration and outsourcing, as well as between foreign and domestic sourcing. The productivity effects...
Blue Ocean versus Competitive Strategy: Theory and Evidence
A.E. Burke (Andrew); A.J. van Stel (André); A.R. Thurik (Roy)
2009-01-01
Blue ocean strategy seeks to turn strategic management on its head by replacing ‘competitive advantage’ with ‘value innovation’ as the primary goal, where firms must create consumer demand and exploit untapped markets. Empirical analysis has been focused on case study evidence and so
Finite Unification: Theory, Models and Predictions
Heinemeyer, S; Zoupanos, G
2011-01-01
All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensionless couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore, developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework, phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...
Current Account Adjustment: Some New Theory and Evidence
Jiandong Ju; Shang-Jin Wei
2007-01-01
This paper aims to provide a theory of current account adjustment that generalizes the textbook version of the intertemporal approach to current account and places domestic labor market institutions at the center stage. In general, in response to a shock, an economy adjusts through a combination of a change in the composition of goods trade (i.e., intra-temporal trade channel) and a change in the current account (i.e., intertemporal trade channel). The more rigid the labor market, the slower ...
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
Women in the Workplace and Management Practices: Theory and Evidence
Kato, Takao; Kodama, Naomi
2017-01-01
We review recent studies on management practices and their consequences for women in the workplace. First, the High Performance Work System (HPWS) is associated with greater gender diversity in the workplace, while there is little evidence that the HPWS reduces the gender pay gap. Second, work-life balance practices with limited face-to-face interactions with coworkers may hamper women’s career advancement. Third, individual incentives linking pay to objective performance may enhance gender div...
Greenslade, Kathryn J.; Coggins, Truman E.
2016-01-01
This study presents an independent replication and extension of psychometric evidence supporting the "Theory of Mind Inventory" ("ToMI"). Parents of 20 children with ASD (ages 4;1–6;7 [years;months]) and 20 with typical development (ages 3;1–6;5) rated their child's theory of mind abilities in everyday situations. Other parent report…
What is the Role of Legal Systems in Financial Intermediation? Theory and Evidence
Bottazzi, L.; Da Rin, M.; Hellmann, T.
2008-01-01
We develop a theory and empirical test of how the legal system affects the relationship between venture capitalists and entrepreneurs. The theory uses a double moral hazard framework to show how optimal contracts and investor actions depend on the quality of the legal system. The empirical evidence
International Nuclear Information System (INIS)
Randjbar-Daemi, S.
1987-01-01
The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix
Directory of Open Access Journals (Sweden)
Glidewell Elizabeth
2007-08-01
Full Text Available Abstract Background Psychological models can be used to understand and predict behaviour in a wide range of settings. However, they have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. The aim of this study was to explore the usefulness of a range of psychological theories to predict health professional behaviour relating to management of upper respiratory tract infections (URTIs) without antibiotics. Methods Psychological measures were collected by postal questionnaire survey from a random sample of general practitioners (GPs) in Scotland. The outcome measures were clinical behaviour (using antibiotic prescription rates as a proxy indicator), behavioural simulation (scenario-based decisions to manage URTI with or without antibiotics) and behavioural intention (general intention to manage URTI without antibiotics). Explanatory variables were the constructs within the following theories: Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model (SM), and knowledge (a non-theoretical construct). For each outcome measure, multiple regression analysis was used to examine the predictive value of each theoretical model individually. Following this 'theory level' analysis, a 'cross theory' analysis was conducted to investigate the combined predictive value of all significant individual constructs across theories. Results All theories were tested, but only significant results are presented. When predicting behaviour, at the theory level, OLT explained 6% of the variance and, in a cross-theory analysis, OLT 'evidence of habitual behaviour' also explained 6%. When predicting behavioural simulation, at the theory level, the proportion of variance explained was: TPB, 31%; SCT, 26%; II, 6%; OLT, 24%. GPs who reported having already decided to change their management to
An Econometric Validation of Malthusian Theory: Evidence in Nigeria
Directory of Open Access Journals (Sweden)
Musa Abdullahi Sakanko
2018-01-01
Full Text Available A rising population is an asset, provided the skills of the workforce are used to the maximum extent; if not appropriately channelized, it can be a liability for a nation. A skilled and hardworking population can emerge as a foundation for a country’s development. This study examines the validity of Malthusian Theory in Nigeria using time-series data from 1960 to 2016, employing the ARDL bounds test technique. The results show that in the long run population growth and food production move proportionately, while in the short run population growth has a depleting effect on food production, thus validating the incidence of the Malthusian effect in the Nigerian economy in the short run. The researcher recommends that the government devise plans to further intensify family planning and birth-control measures, compulsory Western education, and the revitalization of the agricultural sector. DOI: 10.150408/sjie.v7i1.6461
Economic theory and evidence on smoking behavior of adults.
Sloan, Frank A; Wang, Yang
2008-11-01
To describe: (i) three alternative conceptual frameworks used by economists to study addictive behaviors: rational, imperfectly rational and irrational addiction; (ii) empirical economic evidence on each framework and specific channels to explain adult smoking matched to the frameworks; and (iii) policy implications for each framework. A systematic review and appraisal of important theoretical and empirical economic studies on smoking. There is some empirical support for each framework. For rational and imperfectly rational addiction there is some evidence that anticipated future cigarette prices influence current cigarette consumption, and quitting costs are high for smokers. Smokers are more risk-tolerant in the financial domain than are others and tend to attach a lower value to being in good health. Findings on differences in rates of time preference by smoking status are mixed; however, short-term rates are higher than long-term rates for both smokers and non-smokers, a stylized fact consistent with hyperbolic discounting. The economic literature lends no empirical support to the view that mature adults smoke because they underestimate the probability of harm to health from smoking. In support of the irrationality framework, smokers tend to be more impulsive than others in domains not related directly to smoking, implying that they may be sensitive to cues that trigger smoking. Much promising economic research uses the imperfectly rational addiction framework, but empirical research based on this framework is still in its infancy.
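The stylized fact noted above, short-term discount rates exceeding long-term rates, is exactly what quasi-hyperbolic (beta-delta) discounting produces. A minimal sketch, with illustrative parameter values that are not taken from the review:

```python
def discount_factor(t, beta=0.7, delta=0.97):
    """Quasi-hyperbolic (beta-delta) discount factor for a delay of t periods."""
    return 1.0 if t == 0 else beta * delta ** t

def implied_rate(t, beta=0.7, delta=0.97):
    """Constant per-period rate r that would produce the same discounting
    over horizon t, i.e. (1 + r) ** -t == discount_factor(t)."""
    return discount_factor(t, beta, delta) ** (-1.0 / t) - 1.0

# The immediate beta penalty is spread over more periods as t grows,
# so the implied per-period rate declines with the horizon.
for t in (1, 5, 10, 30):
    print(t, round(implied_rate(t), 4))
```

The implied rate falls monotonically toward 1/delta - 1, reproducing the pattern that short-term rates are higher than long-term rates for smokers and non-smokers alike.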
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity, and parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether evidence of the multiverse can be found.
Quantum integrable models of field theory
International Nuclear Information System (INIS)
Faddeev, L.D.
1979-01-01
Fundamental features of the classical inverse-problem method have been formulated in a form convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum inverse-problem method. Examples are considered of interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetism are examined briefly. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be accommodated, to a large extent, within the framework of the quantum inverse-problem method. Unsolved questions are enumerated and prospects for applying the inverse-problem method are outlined.
Theory and Model for Martensitic Transformations
DEFF Research Database (Denmark)
Lindgård, Per-Anker; Mouritsen, Ole G.
1986-01-01
Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.
Profit Sharing and Reciprocity: Theory and Survey Evidence
Cornelissen, Thomas; Heywood, John S.; Jirjahn, Uwe
2010-01-01
The 1/n problem potentially limits the effectiveness of profit sharing in motivating workers. While the economic literature suggests that reciprocity can mitigate this problem, it remains silent on the optimal degree of reciprocity. We present a representative model demonstrating that reciprocity may increase productive effort but may also increase unproductive effort such as socializing on the job. The model implies that reciprocity increases profit up to a point but decreases profit beyond ...
Directory of Open Access Journals (Sweden)
Ferreira Isabel
2005-04-01
Full Text Available Abstract Background The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. Discussion Since behavior theory is a reflection of the compiled evidence of behavior research, theory is the only foothold we have for the development of behavioral nutrition and physical activity interventions. Application of theory should improve the effectiveness of interventions. However, some of the theories we use lack a strong empirical foundation, and the available theories are not always used in the most effective way. Furthermore, many of the commonly-used theories provide at best information on what needs to be changed to promote healthy behavior, but not on how changes can be induced. Finally, many theories explain behavioral intentions or motivation rather well, but are less well-suited to explaining or predicting actual behavior or behavior change. For more effective interventions, behavior change theory needs to be further developed in stronger research designs and such change-theory should especially focus on how to promote action rather than mere motivation. Since voluntary behavior change requires motivation, ability as well as the opportunity to change, further development of behavior change theory should incorporate environmental change strategies. Conclusion Intervention Mapping may help to further improve the application of theories in nutrition and physical activity behavior change.
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-07
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
Magnetic flux tube models in superstring theory
Russo, Jorge G
1996-01-01
Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the 'Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0 = √(2α′). In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 and q > R/2α′ there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR = 2n, n integer). At the special points qR = 2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...
Group theory for unified model building
International Nuclear Information System (INIS)
Slansky, R.
1981-01-01
The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w × U(1)_w × SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group: the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E_6, SO(10), SU(6), F_4, SO(9), SO(5), SO(8), SO(7), SU(4), E_7, E_8, SU(8), SO(14), SO(18), SO(22), and, for completeness, SU(3) are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
Statistical polarization in greenhouse gas emissions: Theory and evidence.
Remuzgo, Lorena; Trueba, Carmen
2017-11-01
The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in the statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed to study the statistical polarization concept from a multivariate view. In this paper, we apply this approach to study the evolution of this phenomenon in the global distribution of the main GHGs. The empirical analysis has been carried out for the time period 1990-2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
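The interregional/intraregional split the authors build on can be illustrated with the univariate Theil index, whose between-group and within-group components decompose the total exactly. The group data below are invented for illustration; the paper itself uses multivariate generalised entropy measures.

```python
from math import log

def theil_between_within(groups):
    """Theil T index of a grouped distribution, split into its
    between-group (interregional) and within-group (intraregional)
    components; the two add up to the pooled Theil index."""
    pooled = [x for g in groups for x in g]
    n = len(pooled)
    mu = sum(pooled) / n
    between = 0.0
    within = 0.0
    for g in groups:
        mug = sum(g) / len(g)
        share = len(g) / n            # population share of the group
        between += share * (mug / mu) * log(mug / mu)
        theil_g = sum((x / mug) * log(x / mug) for x in g) / len(g)
        within += share * (mug / mu) * theil_g
    return between, within

# Hypothetical per-capita emissions for three country groups (illustrative only).
groups = [[2.0, 3.0, 2.5], [8.0, 10.0], [15.0, 18.0, 21.0]]
b, w = theil_between_within(groups)
print("between:", b, "within:", w, "total:", b + w)
```

A falling between-group share over time would be the univariate analogue of the decrease in statistical polarization discussed in the abstract.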
A matrix model from string field theory
Directory of Open Access Journals (Sweden)
Syoji Zeze
2016-09-01
Full Text Available We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N. An increase of the potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive (and likely unsustainable) public policy.
Resconi, Germano; Klir, George J.; Pessa, Eliano
Recognizing that syntactic and semantic structures of classical logic are not sufficient to understand the meaning of quantum phenomena, we propose in this paper a new interpretation of quantum mechanics based on evidence theory. The connection between these two theories is obtained through a new language, quantum set theory, built on a suggestion by J. Bell. Further, we give a modal logic interpretation of quantum mechanics and quantum set theory by using Kripke's semantics of modal logic based on the concept of possible worlds. This is grounded on previous work of a number of researchers (Resconi, Klir, Harmanec) who showed how to represent evidence theory and other uncertainty theories in terms of modal logic. Moreover, we also propose a reformulation of the many-worlds interpretation of quantum mechanics in terms of Kripke's semantics. We thus show how three different theories — quantum mechanics, evidence theory, and modal logic — are interrelated. This opens, on one hand, the way to new applications of quantum mechanics within domains different from the traditional ones, and, on the other hand, the possibility of building new generalizations of quantum mechanics itself.
International Migration with Heterogeneous Agents: Theory and Evidence
DEFF Research Database (Denmark)
Schröder, Philipp J.H.; Brücker, Herbert
Temporary migration, though empirically relevant, is often ignored in formal models. This paper proposes a migration model with heterogeneous agents and persistent cross-country income differentials that features temporary migration. In equilibrium there exists a positive relation between the stock of migrants and the income differential, while the net migration flow becomes zero. Consequently, existing empirical migration models, estimating net migration flows instead of stocks, may be misspecified. This suspicion appears to be confirmed by our investigation of the cointegration relationships of German migration stocks and flows since 1967. We find that (i) panel-unit root tests reject the hypothesis that migration flows and the explanatory variables are integrated of the same order, while migration stocks and the explanatory variables are all I(1) variables, and (ii) the hypothesis...
Directory of Open Access Journals (Sweden)
Qian Ding
2014-10-01
Full Text Available Multi-sensor information fusion based on Dempster-Shafer (D-S) evidence theory is applied in a building fire alarm system to realize early detection and alarm. By using multiple sensors to monitor fire-process parameters such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. The D-S evidence theory is then applied to fuse the information from the multiple sensors with the specific fire model, making the fire alarm more accurate and timely. The proposed method can effectively avoid failures in the monitoring data, deal robustly with conflicting evidence from the multiple sensors and significantly improve the reliability of fire warning.
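The fusion step that this record (and several others in this listing) relies on is Dempster's rule of combination. The sketch below is a minimal generic illustration of that rule, not the paper's implementation; the sensor names, the two-element frame of discernment and the mass values are invented for the example.

```python
# Minimal sketch of Dempster's rule of combination for two sensor reports.
# Focal elements are frozensets over an illustrative frame {fire, no_fire};
# the mass assignments below are invented, not taken from the paper.

def dempster_combine(m1, m2):
    """Combine two mass functions and return the normalized fused masses."""
    combined = {}
    conflict = 0.0  # K: total mass the two sources assign to disjoint sets
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Normalize the non-conflicting mass by 1 - K
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

fire, no_fire = frozenset({"fire"}), frozenset({"no_fire"})
theta = fire | no_fire  # the full frame represents ignorance

smoke_sensor = {fire: 0.7, theta: 0.3}
temp_sensor = {fire: 0.6, no_fire: 0.1, theta: 0.3}

fused = dempster_combine(smoke_sensor, temp_sensor)
print(fused)
```

With these illustrative masses the conflict K is 0.07, and normalization concentrates most of the fused mass on the fire hypothesis; the weighted-averaging variants discussed above modify the inputs before this combination step when K is large.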
Wakefield, Jerome C
2007-01-01
Bowlby (1973), applying attachment theory to Freud's case of Little Hans, hypothesized that Hans's anxiety was a manifestation of anxious attachment. However, Bowlby's evidence was modest: Hans was threatened by his mother with abandonment, expressed fear of abandonment prior to symptom onset, and was separated from his mother for a short time a year before. Bowlby's hypothesis is reassessed in light of a systematic review of the case record as well as new evidence from recently derestricted interviews with Hans's father and Hans in the Freud Archives. Bowlby's hypothesis is supported by multiple additional lines of evidence regarding both triggers of separation anxiety preceding the phobia (e.g., a funeral, sibling rivalry, moving, getting his own bedroom) and background factors influencing his working model of attachment (mother's psychopathology, intense marital conflict, multiple suicides in mother's family) that would make him more vulnerable to such anxiety. Bowlby's hypothesis is also placed within the context of subsequent developments in attachment theory.
On low rank classical groups in string theory, gauge theory and matrix models
International Nuclear Information System (INIS)
Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun
2004-01-01
We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature.
Mobile Money, Trade Credit and Economic Development : Theory and Evidence
Beck, T.H.L.; Pamuk, H.; Uras, R.B.; Ramrattan, R.
2015-01-01
Using a novel enterprise survey from Kenya (FinAccess Business), we document a strong positive association between the use of mobile money as a method to pay suppliers and access to trade credit. We develop a dynamic general equilibrium model with heterogeneous entrepreneurs, imperfect credit
International Migration with Heterogeneous Agents: Theory and Evidence
DEFF Research Database (Denmark)
Schröder, Philipp J.H.; Brücker, Herbert
Two puzzling facts of international migration are that only a small share of a sending country's population emigrates and that net migration rates tend to cease over time. This paper addresses these issues in a migration model with heterogeneous agents that features temporary migration...
Financial Structure and Macroeconomic Volatility : Theory and Evidence
Huizinga, H.P.; Zhu, D.
2006-01-01
This paper presents a simple model capturing differences between debt and equity finance to examine how financial structure matters for macroeconomic volatility. Debt finance is relatively cheap in the sense that debt holders need to verify relatively few profitability states, but debt finance may
Allocation of Students in Public Schools: Theory and New Evidence
Cohen-Zada, Danny; Gradstein, Mark; Reuven, Ehud
2013-01-01
The allocation of educational resources to students of different socio-economic backgrounds has important policy implications since it affects individual educational outcomes as well as the future distribution of human capital. In this paper, we present a theoretical model showing that local school administrators have an incentive to allocate…
Chern-Simons Theory, Matrix Models, and Topological Strings
International Nuclear Information System (INIS)
Walcher, J
2006-01-01
This book is a find. Marino meets the challenge of filling, in fewer than 200 pages, the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals, including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may
Delagran, Louise; Vihstadt, Corrie; Evans, Roni
2015-09-01
Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.
Statistical polarization in greenhouse gas emissions: Theory and evidence
International Nuclear Information System (INIS)
Remuzgo, Lorena; Trueba, Carmen
2017-01-01
The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in the statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed to study the statistical polarization concept from a multivariate view. In this paper, we apply this approach to study the evolution of this phenomenon in the global distribution of the main GHGs. The empirical analysis has been carried out for the time period 1990–2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed. - Highlights: • We study the evolution of global polarization in GHG emissions. • We consider the four main GHGs: CO2, CH4, N2O and F-gases. • We use the multidimensional polarization indices (Gigliarano and Mosler, 2009). • We consider an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). • Most of the polarization indices showed a slightly increasing pattern.
Singing for respiratory health: theory, evidence and challenges.
Gick, Mary L; Nicol, Jennifer J
2016-09-01
The premise that singing is a health-promoting activity for people with the respiratory conditions of chronic obstructive pulmonary disease (COPD) and asthma is a growing area of interest being investigated by researchers from various disciplines. The preliminary evidence, a theoretical framework and identification of methodological challenges are discussed in this perspective article with an eye to recommendations for further research to advance knowledge. After a brief summary of main research findings on singing in healthy people to provide background context, research is reviewed on singing in people with COPD and asthma. Studies include published research and as yet unpublished work by the authors. Methodological challenges arising from the reviewed studies are identified, such as attrition from singing or control groups based on weak or strong beliefs, respectively, about singing's effectiveness. Potential solutions for these problems are considered, with further recommendations made for other singing research.
Social capital: theory, evidence, and implications for oral health.
Rouxel, Patrick L; Heilmann, Anja; Aida, Jun; Tsakos, Georgios; Watt, Richard G
2015-04-01
In the last two decades, there has been increasing application of the concept of social capital in various fields of public health, including oral health. However, social capital is a contested concept with debates on its definition, measurement, and application. This study provides an overview of the concept of social capital, highlights the various pathways linking social capital to health, and discusses the potential implication of this concept for health policy. An extensive and diverse international literature has examined the relationship between social capital and a range of general health outcomes across the life course. A more limited but expanding literature has also demonstrated the potential influence of social capital on oral health. Much of the evidence in relation to oral health is limited by methodological shortcomings mainly related to the measurement of social capital, cross-sectional study designs, and inadequate controls for confounding factors. Further research using stronger methodological designs should explore the role of social capital in oral health and assess its potential application in the development of oral health improvement interventions.
Application of Chaos Theory to Psychological Models
Blackerby, Rae Fortunato
This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
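The nonlinear feedback equations the dissertation describes, where each iteration's value feeds back into the next and behavior either converges or fluctuates depending on parameter values, can be illustrated with the logistic map, a standard minimal chaotic system. This is a generic textbook example, not the dissertation's moral-behavior model; the parameter values are chosen only to show the two regimes.

```python
# Illustrative sketch of a nonlinear feedback loop: the logistic map.
# Each output is fed back as the next input; the control parameter r
# determines whether trajectories converge to one value or fluctuate.

def iterate(r, x0, n):
    """Apply the logistic map x -> r*x*(1-x) for n steps, starting at x0."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)  # feedback: the new value depends on the old
    return x

# Convergent regime: for r = 2.5 trajectories settle on the fixed
# point 1 - 1/r = 0.6, regardless of the starting value.
print(iterate(2.5, 0.2, 500))

# Chaotic regime: for r = 3.9 two trajectories starting almost at the
# same point diverge, showing sensitive dependence on initial conditions.
print(abs(iterate(3.9, 0.2, 50) - iterate(3.9, 0.2000001, 50)))
```

The two regimes mirror the dissertation's observation that simulated moral-behavior trajectories sometimes converge to one value and sometimes diverge, depending on the factor values.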
THE AGGREGATE IMPLICATIONS OF MACHINE REPLACEMENT: THEORY AND EVIDENCE
John Haltiwanger; Russell Cooper
1992-01-01
The authors study an economy in which producers incur resource costs to replace depreciated machines. The process of costly replacement and depreciation creates endogenous fluctuations in productivity, employment, and output of a single producer. The authors explore the spillover effects of machine replacement on other sectors of the economy and provide conditions for synchronized machine replacement by multiple independent producers. The implications of their model are generally consistent w...
Stock portfolio selection using Dempster–Shafer evidence theory
Mitra Thakur, Gour Sundar; Bhattacharyya, Rupak; Sarkar (Mondal), Seema
2016-01-01
Markowitz’s return–risk model for stock portfolio selection is based on the historical return data of assets. In addition to the effect of historical return, there are many other critical factors which directly or indirectly influence the stock market. We use the fuzzy Delphi method to identify the critical factors initially. Factors with lower correlation coefficients are then retained for further consideration. The critical factors and historical data are used to apply Dempster–Shafe...
Gender Discrimination and Growth: Theory and Evidence from India
Berta Esteve-Volart
2004-01-01
Gender inequality is an acute and persistent problem, especially in developing countries. This paper argues that gender discrimination is an inefficient practice. We model gender discrimination as the complete exclusion of females from the labor market or as the exclusion of females from managerial positions. The distortions in the allocation of talent between managerial and unskilled positions, and in human capital investment, are analyzed. It is found that both types of discrimination lower...
Intangible Capital and Corporate Cash Holdings: Theory and Evidence
Dalida Kadyrzhanova; Antonio Falato; Jae Sim
2012-01-01
The rise in intangible capital is a fundamental driver of the secular trend in US corporate cash holdings over the last decades. We construct a new measure of intangible capital and show that intangible capital is the most important firm-level determinant of corporate cash holdings. Our measure accounts for almost as much of the secular increase in cash since the 1980s as all other standard determinants together. We then develop a new model of corporate cash holdings that introduces intangibl...
Elson, Edward
2009-01-01
A theory of control of cellular proliferation and differentiation in the early development of metazoan systems, postulating a system of electrical controls "parallel" to the processes of molecular biochemistry, is presented. It is argued that the processes of molecular biochemistry alone cannot explain how a developing organism defies a stochastic universe. The demonstration of current flow (charge transfer) along the long axis of DNA through the base-pairs (the "pi-way") in vitro raises the question of whether nature may employ such current flows for biological purposes. Such currents might be too small to be accessible to direct measurement in vivo, but conduction has been measured in vitro, and the methods might well be extended to living systems. This has not been done because there is no reasonable model which could stimulate experimentation. We suggest several related, but detachable or independent, models for the biological utility of charge transfer, whose scope admittedly outruns current concepts of thinking about organization, growth, and development in eukaryotic, metazoan systems. The ideas are related to explanations proposed to explain the effects demonstrated on tumors and normal tissues described in Article I (this issue). Microscopic and mesoscopic potential fields and currents are well known at sub-cellular, cellular, and organ systems levels. Not only are such phenomena associated with internal cellular membranes in bioenergetics and information flow, but remarkable long-range fields over tissue interfaces and organs appear to play a role in embryonic development (Nuccitelli, 1992). The origin of the fields remains unclear and is the subject of active investigation. We are proposing that similar processes could play a vital role at a "sub-microscopic level," at the level of the chromosomes themselves, and could play a role in organizing and directing fundamental processes of growth and development, in parallel with the more discernible fields and
PARFUME Theory and Model basis Report
Energy Technology Data Exchange (ETDEWEB)
Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
Stevens, Tyler; Conwell, Darwin L; Zuccaro, Gregory
2004-11-01
In the past several decades, four prominent theories of chronic pancreatitis pathogenesis have emerged: the toxic-metabolic theory, the oxidative stress hypothesis, the stone and duct obstruction theory, and the necrosis-fibrosis hypothesis. Although these traditional theories are formulated based on compelling scientific observations, substantial contradictory data also exist for each. Furthermore, the basic premises of some of these theories are directly contradictory. Because of the recent scientific progress in the underlying genetic, cellular, and molecular pathophysiology, there have been substantial advances in the understanding of chronic pancreatitis pathogenesis. This paper will provide an evidence-based review and critique of the traditional pathogenic theories, followed by a discussion of the new advances in pancreatic fibrogenesis. Moreover, we will discuss plausible pathogenic sequences applied to each of the known etiologies.
Stochastic linear programming models, theory, and computation
Kall, Peter
2011-01-01
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
How Often Is the Misfit of Item Response Theory Models Practically Significant?
Sinharay, Sandip; Haberman, Shelby J.
2014-01-01
Standard 3.9 of the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 1999) demands evidence of model fit when item response theory (IRT) models are fitted to test data. Hambleton and Han (2005) and Sinharay (2005) recommended the assessment of practical significance of misfit of IRT models, but…
Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy
2017-02-23
This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). Causal
The Economic Importance of Financial Literacy: Theory and Evidence.
Lusardi, Annamaria; Mitchell, Olivia S
2014-03-01
This paper undertakes an assessment of a rapidly growing body of economic research on financial literacy. We start with an overview of theoretical research which casts financial knowledge as a form of investment in human capital. Endogenizing financial knowledge has important implications for welfare as well as policies intended to enhance levels of financial knowledge in the larger population. Next, we draw on recent surveys to establish how much (or how little) people know and identify the least financially savvy population subgroups. This is followed by an examination of the impact of financial literacy on economic decision-making in the United States and elsewhere. While the literature is still young, conclusions may be drawn about the effects and consequences of financial illiteracy and what works to remedy these gaps. A final section offers thoughts on what remains to be learned if researchers are to better inform theoretical and empirical models as well as public policy.
A Synthetic Fusion Rule for Salient Region Detection under the Framework of DS-Evidence Theory
Directory of Open Access Journals (Sweden)
Naeem Ayoub
2018-05-01
Full Text Available Saliency detection is one of the most valuable research topics in computer vision. It focuses on the detection of the most significant objects/regions in images and reduces the computational time cost of getting the desired information from salient regions. Local saliency detection or common pattern discovery schemes were actively used by researchers to overcome saliency detection problems. In this paper, we propose a bottom-up saliency fusion method that takes into consideration the DS-Evidence (Dempster–Shafer, DS) theory. Firstly, we calculate saliency maps from different algorithms based on pixel-level, patch-level and region-level methods. Secondly, we fuse the pixels based on the foreground and background information under the framework of DS-Evidence theory (evidence theory allows one to combine evidence from different sources and arrive at a degree of belief that takes into account all the available evidence). Formulating image saliency detection within DS-Evidence theory gives us better results for saliency prediction. Experiments are conducted on four publicly available datasets (MSRA, ECSSD, DUT-OMRON and PASCAL-S). Our saliency detection method performs well and shows prominent results compared to the state-of-the-art algorithms.
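Beyond combining sources, DS-Evidence theory brackets each hypothesis with a belief/plausibility interval, which is the "degree of belief" the abstract refers to. The sketch below is a generic illustration of those two quantities; the foreground/background labels and mass values are invented for the example, not taken from the paper.

```python
# Sketch of belief and plausibility for a mass function in DS theory.
# Belief sums mass committed to subsets of a hypothesis; plausibility
# sums mass not contradicting it. Masses below are invented examples.

def belief(m, hypothesis):
    """Total mass of focal elements fully contained in the hypothesis."""
    return sum(w for s, w in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Total mass of focal elements that intersect the hypothesis."""
    return sum(w for s, w in m.items() if s & hypothesis)

fg, bg = frozenset({"foreground"}), frozenset({"background"})
m = {fg: 0.5, bg: 0.2, fg | bg: 0.3}  # 0.3 left on the full frame (ignorance)

print(belief(m, fg), plausibility(m, fg))
```

For a pixel with these masses, the foreground hypothesis gets belief 0.5 and plausibility 0.8; the gap between the two reflects the unresolved ignorance that a fusion scheme can narrow by combining more sources.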
Taking Root: a grounded theory on evidence-based nursing implementation in China.
Cheng, L; Broome, M E; Feng, S; Hu, Y
2018-06-01
Evidence-based nursing is widely recognized as the critical foundation for quality care. This study aimed to develop a middle-range theory on the process of evidence-based nursing implementation in the Chinese context. A grounded theory study using unstructured in-depth individual interviews was conducted with 56 participants who were involved in 24 evidence-based nursing implementation projects in Mainland China from September 2015 to September 2016. A middle-range grounded theory of 'Taking Root' was developed. The theory describes the evidence implementation process consisting of four components (driving forces, process, outcome, sustainment/regression), three approaches (top-down, bottom-up and outside-in), four implementation strategies (patient-centred, nurses at the heart of change, reaching agreement, collaboration) and two patterns (transformational and adaptive implementation). Certain perspectives may not have been captured, as the retrospective nature of the interviewing technique did not allow for 'real-time' assessment of the actual implementation process. The transferability of the findings requires further exploration as few participants with negative experiences were recruited. This is the first study that explored evidence-based implementation process, strategies, approaches and patterns in the Chinese nursing practice context to inform international nursing and health policymaking. The theory of Taking Root described various approaches to evidence implementation and how the implementation can be transformational for the nurses and the setting in which they work. Nursing educators, managers and researchers should work together to improve nurses' readiness for evidence implementation. Healthcare systems need to optimize internal mechanisms and external collaborations to promote nursing practice in line with evidence and achieve clinical outcomes and sustainability.
The pipe model theory half a century on: a review.
Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick
2018-01-23
More than a half century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the moment of its publication: (1) What determines tree form and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass in an individual plant, in a stand or at even larger scales be estimated? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount rather than the whole of it. The facts that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might cause the commonly
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2017-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Theory and modelling of nanocarbon phase stability.
Energy Technology Data Exchange (ETDEWEB)
Barnard, A. S.
2006-01-01
The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2015-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Game Theory and its Relationship with Linear Programming Models ...
African Journals Online (AJOL)
Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and the linear programming problem are closely related subjects, since any computing method devised for ...
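The relationship the abstract alludes to is the minimax theorem: the value of a finite zero-sum game is the optimum of a pair of dual linear programs. For a 2x2 game without a saddle point, the LP collapses to a closed form. The following sketch is a textbook illustration, not code from the paper:

```python
# Value of a 2x2 zero-sum game via the closed-form solution of its LP.
# Payoffs are to the row player; the game is assumed to have no saddle point.
def solve_2x2(a, b, c, d):
    """Row player's optimal mixed strategy weight p (on row 1) and game
    value v for payoff matrix [[a, b], [c, d]]."""
    denom = (a - b) + (d - c)
    p = (d - c) / denom            # probability of playing row 1
    v = (a * d - b * c) / denom    # value of the game
    return p, v

# Example: matching pennies, payoff matrix [[1, -1], [-1, 1]]
p, v = solve_2x2(1, -1, -1, 1)
print(p, v)  # 0.5 0.0
```

For larger games the same computation is an LP (maximize v subject to mixed-strategy constraints), which is where general-purpose LP solvers come in.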
Hosotani model in closed string theory
International Nuclear Information System (INIS)
Shiraishi, Kiyoshi.
1988-11-01
The Hosotani mechanism in the closed string theory with current algebra symmetry is described by the (old covariant) operator method. We compare the gauge symmetry breaking mechanism in a string theory which has SU(2) symmetry with the one in an equivalent compactified closed string theory. We also investigate the difference between the Hosotani mechanism and the Higgs mechanism in closed string theories by calculating a four-point amplitude of 'Higgs' bosons at tree level. (author)
Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang
2017-08-28
Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods do not take the processing of uncertain information into account. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which is able to deal with uncertain information in the process of specific emitter identification. In this paper, radars each generate a body of evidence based on the information they obtain, and the main task is to fuse the multiple bodies of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between pieces of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and produce a reasonable recognition result.
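The fusion step in such methods ultimately rests on Dempster's rule of combination. A minimal sketch of the rule follows; the fault labels and mass assignments are invented for illustration, not taken from the paper:

```python
# Dempster's rule of combination for two mass functions over a common
# frame of discernment. Focal elements are represented as frozensets.
def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for A, w1 in m1.items():
        for B, w2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    k = 1.0 - conflict  # normalization constant
    return {A: w / k for A, w in combined.items()}

# Two hypothetical sensor reports over candidate emitters {E1, E2, E3}
m1 = {frozenset({"E1"}): 0.6, frozenset({"E1", "E2"}): 0.4}
m2 = {frozenset({"E1"}): 0.5, frozenset({"E1", "E2", "E3"}): 0.5}
m = dempster_combine(m1, m2)
print(m[frozenset({"E1"})])  # 0.8 (mass concentrates on E1)
```

The counterintuitive behavior under high conflict mentioned in the abstract arises from the 1/(1-conflict) renormalization, which is what weighted-averaging modifications aim to tame.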
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x{t}, is embedded within the larger set of m+k candidate variables, (x{t},w{t}), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w{t} are relevant.
System Dynamics as Model-Based Theory Building
Schwaninger, Markus; Grösser, Stefan N.
2008-01-01
This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects which have been focused on model-based theory building, specifically the construction of a mi...
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
Irreducible integrable theories form tensor products of conformal models
International Nuclear Information System (INIS)
Mathur, S.D.; Warner, N.P.
1991-01-01
By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)
Standard model and chiral gauge theories on the lattice
International Nuclear Information System (INIS)
Smit, J.
1990-01-01
A review is given of developments in lattice formulations of chiral gauge theories. There is now evidence that the unwanted fermion doublers can be decoupled satisfactorily by giving them masses of the order of the cutoff. (orig.)
Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory
Chandra, Akhilesh
This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and the external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm as a composite system of many degrees of freedom naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred by a change in the nature of relations at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on the stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specifications of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the
Directory of Open Access Journals (Sweden)
Kaner Eileen FS
2008-01-01
Background Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). Methods The design was a 2 × 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Results Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). Conclusion GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected
Hrisos, Susan; Eccles, Martin; Johnston, Marie; Francis, Jill; Kaner, Eileen FS; Steen, Nick; Grimshaw, Jeremy
2008-01-01
Background Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). Methods The design was a 2 × 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Results Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). Conclusion GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected these beliefs. One
International Nuclear Information System (INIS)
Cooper, F.
1996-01-01
We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations
A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators
Bowen, Danny Ray
2012-01-01
The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…
The Current Evidence for Hayek’s Cultural Group Selection Theory
Directory of Open Access Journals (Sweden)
Brad Lowell Stone
2010-12-01
In this article I summarize Friedrich Hayek’s cultural group selection theory and describe the evidence gathered by current cultural group selection theorists within the behavioral and social sciences supporting Hayek’s main assertions. I conclude with a few comments on Hayek and libertarianism.
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
A Leadership Identity Development Model: Applications from a Grounded Theory
Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.
2006-01-01
This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
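The Bayesian learning mechanism the authors invoke can be illustrated in its simplest form: sequentially updating a posterior over two competing causal hypotheses from repeated observations. The hypotheses and likelihood values below are hypothetical, chosen only to show the mechanics:

```python
# Sequential Bayesian updating over two hypothetical causal hypotheses:
#   H1: pressing the switch causes the light (P(light | press) = 0.9)
#   H0: no causal link                       (P(light | press) = 0.1)
def posterior(prior_h1, likelihood_h1, likelihood_h0):
    num = prior_h1 * likelihood_h1
    return num / (num + (1 - prior_h1) * likelihood_h0)

p = 0.5  # uniform prior over the two hypotheses
for observed_light in [True, True, False, True]:
    l1 = 0.9 if observed_light else 0.1  # likelihood under H1
    l0 = 0.1 if observed_light else 0.9  # likelihood under H0
    p = posterior(p, l1, l0)
print(round(p, 3))  # 0.988: evidence favors the causal hypothesis
```

Note how a single disconfirming observation lowers the posterior without discarding the hypothesis, the kind of graded, rational belief revision the "theory theory" attributes to children.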
Theory and modeling of active brazing.
Energy Technology Data Exchange (ETDEWEB)
van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.
2013-09-01
Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension, as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.
Domain Theory, Its Models and Concepts
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt
2014-01-01
Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contribution...
An Evolutionary Game Theory Model of Spontaneous Brain Functioning.
Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano
2017-11-22
Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
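The classical building block that EGN extends is replicator dynamics. A minimal two-strategy sketch (a standard textbook model, not the paper's networked formulation) shows the kind of convergent dynamics involved:

```python
# Minimal discrete-time replicator dynamics for a two-strategy game.
# This is the classical building block that EGN generalizes to networks.
def replicator_step(x, payoff, dt=0.01):
    """x: fraction of the population playing strategy 0;
    payoff: 2x2 payoff matrix."""
    f0 = payoff[0][0] * x + payoff[0][1] * (1 - x)  # fitness of strategy 0
    f1 = payoff[1][0] * x + payoff[1][1] * (1 - x)  # fitness of strategy 1
    avg = x * f0 + (1 - x) * f1                     # mean population fitness
    return x + dt * x * (f0 - avg)

# Coordination game: mixed equilibrium at x = 1/3; starting above it,
# the population converges to all playing strategy 0.
payoff = [[2, 0], [0, 1]]
x = 0.6
for _ in range(2000):
    x = replicator_step(x, payoff)
print(round(x, 3))  # converges toward 1.0
```

In the EGN extension, each node runs a coupled version of such dynamics against its network neighbors, which is what produces the oscillatory activation/inhibition patterns the abstract describes.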
Big bang models in string theory
Energy Technology Data Exchange (ETDEWEB)
Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes Pleinlaan 2, B-1050 Brussels (Belgium)
2006-11-07
These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.
Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E
2018-04-13
To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.
Combining morphometric evidence from multiple registration methods using dempster-shafer theory
Rajagopalan, Vidya; Wyatt, Christopher
2010-03-01
In tensor-based morphometry (TBM) group-wise differences in brain structure are measured using high degree-of-freedom registration and some form of statistical test. However, it is known that TBM results are sensitive to both the registration method and the statistical test used. Given the lack of an objective model of group variation, it is difficult to determine a best registration method for TBM. The use of statistical tests is also problematic given the corrections required for multiple testing and the notorious difficulty of selecting and interpreting significance values. This paper presents an approach to address both of these issues by combining multiple registration methods using Dempster-Shafer evidence theory to produce belief maps of categorical changes between groups. This approach is applied to the comparison of brain morphometry in aging, a typical application of TBM, using the determinant of the Jacobian as a measure of volume change. We show that the Dempster-Shafer combination produces a unique and easy-to-interpret belief map of regional changes between and within groups without the complications associated with hypothesis testing.
International Nuclear Information System (INIS)
Lo, C.K.; Zio, E.
2015-01-01
In nuclear power plant (NPP) probabilistic risk assessment (PRA) practice, a ranking of the contribution of the single Basic Events (BE) to the Core Damage Frequency (CDF) is performed by computing importance measures, such as the Fussell-Vesely (F-V), Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW) indices. Traditionally, these importance indices are calculated as point (e.g., mean) values without accounting for the epistemic uncertainty affecting the parameters (e.g., BE probabilities, failure rates...) of PRA models. On the other hand, such epistemic uncertainty has a significant impact on the evaluation of the importance indices (which are thus not described by a single point value, but rather by a distribution of possible values): this obviously affects the BE ranking and the corresponding safety-related decisions. In this paper, the epistemic uncertainty in the BE probabilities of NPP PRA models is represented by belief/plausibility functions within a Dempster-Shafer Theory of Evidence (DSTE) framework: as a consequence, the corresponding importance indices are also described by Dempster-Shafer structures. Because the focal intervals of the component importance measures overlap and are dependent, it is difficult to rank them. We propose a method called RAWC to rank BE importance while accounting for the uncertainty. However, RAWC can only give an overall picture of the ranking
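The three point-value importance measures named above have simple definitions: with R the base core damage frequency and R(p_A=0), R(p_A=1) the frequencies with basic event A made perfect or certain to fail, F-V = (R - R(p_A=0))/R, RAW = R(p_A=1)/R, and RRW = R/R(p_A=0). A toy sketch with hypothetical numbers:

```python
# Point-value importance measures for basic event A in a toy risk model.
# CDF(p_a) = p_a * P_B + P_C  (hypothetical cut-set expression and numbers,
# for illustration only; real PRA models are far larger).
P_B, P_C = 0.02, 0.001

def cdf(p_a):
    return p_a * P_B + P_C

p_a = 0.01
R = cdf(p_a)
FV = (R - cdf(0.0)) / R   # Fussell-Vesely: fraction of risk due to A
RAW = cdf(1.0) / R        # Risk Achievement Worth: A assumed failed
RRW = R / cdf(0.0)        # Risk Reduction Worth: A assumed perfect
print(round(FV, 3), round(RAW, 2), round(RRW, 2))  # 0.167 17.5 1.2
```

Replacing the scalar p_a with a belief/plausibility interval, as in the DSTE framework above, turns each of these scalars into an interval, which is precisely what makes the ranking ambiguous.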
Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory
International Nuclear Information System (INIS)
Chung, S.; Tye, S.H.
1993-01-01
The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory.
Spatial data modelling and maximum entropy theory
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana; Ocelíková, E.
2005-01-01
Roč. 51, č. 2 (2005), s. 80-83 ISSN 0139-570X Institutional research plan: CEZ:AV0Z10750506 Keywords : spatial data classification * distribution function * error distribution Subject RIV: BD - Theory of Information
Electroweak theory and the Standard Model
CERN. Geneva; Giudice, Gian Francesco
2004-01-01
There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.
Statistical Learning Theory: Models, Concepts, and Results
von Luxburg, Ulrike; Schoelkopf, Bernhard
2008-01-01
Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.
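The central objects of statistical learning theory are generalization bounds. The simplest standard result, for a finite hypothesis class (obtained via Hoeffding's inequality and a union bound; quoted from the general literature, not from this article), reads:

```latex
% With probability at least 1 - \delta over an i.i.d. sample of size n,
% for every f in a finite class \mathcal{F} with loss values in [0,1]:
\[
  R(f) \;\le\; R_{\mathrm{emp}}(f)
  \;+\; \sqrt{\frac{\ln\lvert\mathcal{F}\rvert + \ln(1/\delta)}{2n}}
\]
```

Here R(f) is the true risk and R_emp(f) the empirical risk; the bound makes the field's qualitative message concrete: generalization degrades with the size of the hypothesis class and improves with the sample size.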
Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L
2016-06-10
According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes
Glass Durability Modeling, Activated Complex Theory (ACT)
International Nuclear Information System (INIS)
CAROL, JANTZEN
2005-01-01
atomic ratios is shown to represent the structural effects of the glass on the dissolution and the formation of activated complexes in the glass leached layer. This provides two different methods by which a linear glass durability model can be formulated: one based on the quasi-crystalline mineral species in a glass and one based on cation ratios in the glass; both are related to the activated complexes on the surface by the law of mass action. The former would allow a new Thermodynamic Hydration Energy Model to be developed based on the hydration of the quasi-crystalline mineral species if all the pertinent thermodynamic data were available. Since the pertinent thermodynamic data are not available, the quasi-crystalline mineral species and the activated complexes can be related to cation ratios in the glass by the law of mass action. The cation ratio model can thus be used by waste form producers to formulate durable glasses based on fundamental structural and activated complex theories. Moreover, a glass durability model based on atomic ratios simplifies HLW glass process control in that the measured ratios of only a few waste components and glass formers can be used to predict complex HLW glass performance with a high degree of accuracy, e.g., R² ≈ 0.97
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), and the essentials of Rapid Prototyping, as well as the practical skills needed to apply this understanding in real-world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concepts of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithography theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...
Modeling self on others: An import theory of subjectivity and selfhood.
Prinz, Wolfgang
2017-03-01
This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it then becomes imported from other minds to own minds, whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices gives rise to modeling self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power. Copyright © 2017 Elsevier Inc. All rights reserved.
Supersymmetry and String Theory: Beyond the Standard Model
International Nuclear Information System (INIS)
Rocek, Martin
2007-01-01
When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is discussed. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.
A 'theory of everything'? [Extending the Standard Model
International Nuclear Information System (INIS)
Ross, G.G.
1993-01-01
The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate 'theory of everything'. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)
Neutron Star Models in Alternative Theories of Gravity
Manolidis, Dimitrios
We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two-body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.
Generalized algebra-valued models of set theory
Löwe, B.; Tarafder, S.
2015-01-01
We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.
A QCD Model Using Generalized Yang-Mills Theory
International Nuclear Information System (INIS)
Wang Dianfu; Song Heshan; Kou Lina
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.
A review of organizational buyer behaviour models and theories ...
African Journals Online (AJOL)
Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...
Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.
McPhail, Beverly A
2016-07-01
The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. © The Author(s) 2015.
The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives
Badesa, Calixto
2008-01-01
Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali
Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory
Pei, Di; Yue, Jianhai; Jiao, Jing
2017-10-01
This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to get bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse multi-sensor data to improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
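The core of such D-S fusion is Dempster's rule of combination: multiply the masses of intersecting focal elements and renormalize by the non-conflicting mass. A minimal Python sketch follows; the two-fault frame and the sensor mass values are invented for illustration and do not come from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (frozenset focal element -> mass)
    with Dempster's rule: conjunctive combination normalized by 1 - K,
    where K is the total mass assigned to conflicting (disjoint) pairs."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass lost to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical reports from two sensors over fault hypotheses {F1, F2}
F1, F2 = frozenset({"F1"}), frozenset({"F2"})
both = F1 | F2  # ignorance: mass on the whole frame
m_sensor1 = {F1: 0.8, both: 0.2}
m_sensor2 = {F1: 0.6, F2: 0.3, both: 0.1}
fused = dempster_combine(m_sensor1, m_sensor2)
```

With these numbers the conflicting mass is K = 0.8 × 0.3 = 0.24, so the fused belief in F1 is (0.48 + 0.08 + 0.12) / 0.76, sharper than either sensor alone.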
Non-linear σ-models and string theories
International Nuclear Information System (INIS)
Sen, A.
1986-10-01
The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs
Restoring primacy in amnesic free recall: evidence for the recency theory of primacy.
Dewar, Michaela; Brown, Gordon D A; Della Sala, Sergio
2011-09-01
Primacy and recency effects at immediate recall are thought to reflect the independent functioning of a long-term memory store (primacy) and a short-term memory store (recency). Key evidence for this theory comes from amnesic patients who show severe long-term memory storage deficits, coupled with profoundly attenuated primacy. Here we challenge this dominant dual-store theory of immediate recall by demonstrating that attenuated primacy in amnesic patients can reflect abnormal working memory rehearsal processes. D.A., a patient with severe amnesia, presented with profoundly attenuated primacy when using her preferred atypical noncumulative rehearsal strategy. In contrast, despite her severe amnesia, she showed normal primacy when her rehearsal was matched with that of controls via an externalized cumulative rehearsal schedule. Our data are in keeping with the "recency theory of primacy" and suggest that primacy at immediate recall is dependent upon medial temporal lobe involvement in cumulative rehearsal rather than long-term memory storage.
Greenslade, Kathryn J; Coggins, Truman E
2016-08-01
This study presents an independent replication and extension of psychometric evidence supporting the Theory of Mind Inventory (ToMI). Parents of 20 children with ASD (ages 4;1–6;7 years;months) and 20 with typical development (ages 3;1–6;5) rated their child's theory of mind abilities in everyday situations. Other parent-report and child behavioral assessments included the Social Responsiveness Scale-2, Vineland Adaptive Behavior Scales-2, Peabody Picture Vocabulary Test-4, and Clinical Evaluation of Language Fundamentals-Preschool-2. Results revealed high internal consistency, expected developmental changes in children with typical development, expected group differences between children with and without ASD, and strong correlations with other measures of social and communication abilities. The ToMI demonstrates strong psychometrics, suggesting considerable utility in identifying theory of mind deficits in children with ASD.
Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping
Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E
2013-01-01
Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Interv...
Purchasing power parity theory in three East Asian economies: New evidence
Ahmad, Mahyudin; Marwan, Nur Fakhzan
2012-01-01
Adding to an extensive literature with mixed findings on long-run Purchasing Power Parity (PPP) theory, this paper extends the evidence against the PPP hypothesis in three East Asian economies, namely Indonesia, Malaysia, and Thailand, based on quarterly data spanning forty years (1968:Q1–2008:Q1). The testing of the PPP hypothesis in this study employs two methods, namely the Engle-Granger procedure and the Johansen multivariate cointegration method.
Evidence-Based Theory of Market Manipulation And Application: The Malaysian Case
Heong, Yin Yun
2010-01-01
According to Part IX Division 1 in the Securities Industry Act 1983 of Malaysian law, stock market manipulation is defined as unlawful action taken either directly or indirectly by any person to affect the price of securities of a corporation on a stock market in Malaysia for a purpose which may include the purpose of inducing other persons. Extending the framework of Allen and Gale (1992), the author presents a theory based on the empirical evidence from prosecuted stock market manipulation ca...
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
Full Text Available We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
Theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena
Jin, Yongmei M.; Wang, Yu U.; Ren, Yang
2015-12-01
Pre-martensitic phenomena, also called martensite precursor effects, have been known for decades yet remain outstanding issues. This paper addresses pre-martensitic phenomena from new theoretical and experimental perspectives. A statistical mechanics-based Grüneisen-type phonon theory is developed. On the basis of deformation-dependent incompletely softened low-energy phonons, the theory predicts a lattice instability and a pre-martensitic transition into elastic-phonon domains via 'phonon spinodal decomposition.' The phase transition lifts phonon degeneracy in the cubic crystal and has the nature of a phonon pseudo-Jahn-Teller lattice instability. The theory and the notion of phonon domains consistently explain the ubiquitous pre-martensitic anomalies as natural consequences of incomplete phonon softening. The phonon domains are characterised by broken dynamic symmetry of lattice vibrations and deform through internal phonon relaxation in response to stress (a particular case of Le Chatelier's principle), leading to a previously unexplored domain phenomenon. Experimental evidence of phonon domains is obtained by in situ three-dimensional phonon diffuse scattering and Bragg reflection using high-energy synchrotron X-ray single-crystal diffraction, which reveals an exotic domain phenomenon fundamentally different from the usual ferroelastic domain-switching phenomenon. In light of the theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena, currently existing alternative opinions on martensitic precursor phenomena are revisited.
Quantum Link Models and Quantum Simulation of Gauge Theories
International Nuclear Information System (INIS)
Wiese, U.J.
2015-01-01
This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing and introduces pioneers of quantum computing and quantum simulations of quantum spin systems. The second part is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking ‘Nuclear’ physics and the continuum limit of D-theory models. (nowak)
Substandard model? At last, a good reason to opt for a sexier theory of particle physics
Cho, A
2001-01-01
According to experimenters at Brookhaven, a tiny discrepancy in the magnetism of the muon may signal a crack in the Standard Model. The deviation could be the first piece of hard evidence for a more complete theory called supersymmetry (1 page).
Optimization models using fuzzy sets and possibility theory
Orlovski, S
1987-01-01
Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...
International Nuclear Information System (INIS)
Oliveira, A.C.J.G. de; Andrade Lima, F.R. de
1989-01-01
The present work is an application of perturbation theory (matricial formalism) to a simplified two-channel model, for sensitivity calculations in PWR cores. Expressions for some sensitivity coefficients of thermohydraulic interest were developed from the proposed model. The code CASNUR.FOR was written in FORTRAN to evaluate these sensitivity coefficients. The comparison of results obtained from the matricial formalism of perturbation theory with those obtained directly from the two-channel model makes evident the efficiency and potential of this perturbation method for nuclear reactor core sensitivity calculations. (author) [pt
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
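The Bayesian learning mechanism the authors invoke can be illustrated with a minimal posterior update over competing causal hypotheses. The sketch below is not from the paper: the "blicket detector" scenario echoes Gopnik's experimental paradigm, but the hypothesis names, priors, and likelihood values are invented for illustration.

```python
def bayes_update(priors, likelihoods):
    """Posterior over hypotheses: normalize prior * likelihood of the evidence."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

# Two hypothetical causal hypotheses about a toy "blicket detector":
# H1: block A causally activates the machine; H2: the machine lights at random.
priors = {"H1": 0.5, "H2": 0.5}
# Likelihood of observing "machine lit when A was placed" under each hypothesis
likelihoods = {"H1": 0.9, "H2": 0.5}

# One observation shifts belief modestly; repeated consistent evidence
# drives the posterior toward the causal hypothesis.
posterior = bayes_update(priors, likelihoods)
for _ in range(3):
    posterior = bayes_update(posterior, likelihoods)
```

After four consistent observations the posterior on H1 is (0.9/0.5)⁴ ≈ 10.5 times that on H2, illustrating how a learner can converge on a specific causal hypothesis from graded evidence rather than all-or-none rules.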
Directory of Open Access Journals (Sweden)
Ya Zheng
2017-09-01
Full Text Available Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.
Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun
2017-01-01
Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm. PMID:28974937
The Self-Perception Theory vs. a Dynamic Learning Model
Swank, Otto H.
2006-01-01
Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...
Weighted score-level feature fusion based on Dempster-Shafer evidence theory for action recognition
Zhang, Guoliang; Jia, Songmin; Li, Xiuzhi; Zhang, Xiangyin
2018-01-01
The majority of human action recognition methods use a multifeature fusion strategy to improve classification performance, where the contribution of different features to a specific action has not been paid enough attention. We present an extendible and universal weighted score-level feature fusion method using the Dempster-Shafer (DS) evidence theory based on the bag-of-visual-words pipeline. First, the partially distinctive samples in the training set are selected to construct the validation set. Then, local spatiotemporal features and pose features are extracted from these samples to obtain evidence information. The DS evidence theory and the proposed rule of survival of the fittest are employed to achieve evidence combination and calculate optimal weight vectors for every feature type belonging to each action class. Finally, the recognition results are deduced via the weighted summation strategy. The performance of the established recognition framework is evaluated on the Penn Action dataset and a subset of the joint-annotated human motion database (sub-JHMDB). The experimental results demonstrate that the proposed feature fusion method can adequately exploit the complementarity among multiple features and improve upon most of the state-of-the-art algorithms on the Penn Action and sub-JHMDB datasets.
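Weighting evidence by source reliability, as in this record and the sensor-fusion work above, is commonly done by discounting each mass function before combination. The sketch below shows the classic Shafer discounting operation rather than either paper's exact weighting rule; the frame, masses, and reliability value are hypothetical.

```python
def discount(mass, theta, alpha):
    """Shafer discounting: scale each focal mass by reliability alpha
    (0 <= alpha <= 1) and move the remaining 1 - alpha onto the whole
    frame theta, i.e. convert distrust of the source into ignorance."""
    out = {a: alpha * v for a, v in mass.items() if a != theta}
    out[theta] = 1.0 - alpha + alpha * mass.get(theta, 0.0)
    return out

# Hypothetical frame of two action/fault classes and one confident report
theta = frozenset({"F1", "F2"})
m = {frozenset({"F1"}): 0.9, theta: 0.1}

# A source judged only 50% reliable keeps half of its committed mass;
# the rest becomes mass on the full frame (complete ignorance).
weak = discount(m, theta, 0.5)
```

Discounted mass functions still sum to one, so they can be fed straight into Dempster's rule; an unreliable source then pulls the fused result toward ignorance instead of dominating it.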
Evidence of "Implemented Anticipation" in Mathematising by Beginning Modellers
Stillman, Gloria; Brown, Jill P.
2014-01-01
Data from open modelling sessions for year 10 and 11 students at an extracurricular modelling event and from a year 9 class participating in a programme of structured modelling of real situations were analysed for evidence of Niss's theoretical construct, "implemented anticipation," during mathematisation. Evidence was found for all…
Measurement Models for Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-01-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
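The modelling idea described above can be sketched in a few lines: estimate first-order transition probabilities from a logged action sequence, then score new interaction by its average surprisal (negative log-probability per transition) under the trained chain. The function names and the toy action log are hypothetical, not from the paper.

```python
import math
from collections import defaultdict

def train_markov(sequence):
    """Estimate first-order transition probabilities from an action sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

def avg_surprisal(model, sequence, eps=1e-9):
    """Mean bits of surprise per observed transition under the model;
    lower values indicate more routinized (predictable) interaction."""
    total, steps = 0.0, 0
    for cur, nxt in zip(sequence, sequence[1:]):
        p = model.get(cur, {}).get(nxt, eps)  # unseen transitions get eps
        total += -math.log2(p)
        steps += 1
    return total / steps

# Hypothetical logs: a fully practiced loop vs. a deviation from it
practiced = ["jump", "dash", "attack"] * 20
model = train_markov(practiced)
routine_err = avg_surprisal(model, ["jump", "dash", "attack"] * 5)
novel_err = avg_surprisal(model, ["jump", "attack", "dash", "jump"])
```

Here the practiced loop is perfectly predicted (zero surprisal), while the deviating sequence scores much higher; tracking this error over time is one way to operationalize "the action stops being goal-directed."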
Internal Universes in Models of Homotopy Type Theory
DEFF Research Database (Denmark)
Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.
2018-01-01
We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.
Theory, modeling, and simulation annual report, 1992
Energy Technology Data Exchange (ETDEWEB)
1993-05-01
This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.
Theories of conduct disorder: a causal modelling analysis
Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De
2004-01-01
Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –
Models of Regge behaviour in an asymptotically free theory
International Nuclear Information System (INIS)
Polkinghorne, J.C.
1976-01-01
Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that phi^3 in six dimensions has an essential singularity at l = -1. The extension to gauge theories is discussed. (Auth.)
Teo, Timothy; Tan, Lynde
2012-01-01
This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Anisotropic cosmological models in f (R, T) theory of gravitation
Indian Academy of Sciences (India)
indirect evidence for the late time accelerated expansion of the Universe. ... Bertolami et al [9] proposed a generalization of f(R) theory of gravity ... For the purpose of reference, we set the origin of the time coordinate at the bounce of.
Extended Nambu models: Their relation to gauge theories
Escobar, C. A.; Urrutia, L. F.
2017-05-01
Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
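As a rough illustration of the time-domain (state-space) view described above, the following sketch simulates a two-stage cascade and checks its settling value against the DC gain predicted by the corresponding transfer function. The gains and time constants are invented for illustration and are not taken from the paper's case studies:

```python
def simulate_cascade(k1, tau1, k2, tau2, u, dt=0.01, steps=5000):
    """Euler simulation of two first-order stages in cascade:
       tau1*x1' = -x1 + k1*u,  tau2*x2' = -x2 + k2*x1."""
    x1 = x2 = 0.0
    for _ in range(steps):
        x1 += dt * (-x1 + k1 * u) / tau1
        x2 += dt * (-x2 + k2 * x1) / tau2
    return x2

# The transfer function G(s) = k1*k2 / ((tau1*s + 1)(tau2*s + 1)) predicts a
# steady-state (DC) gain of k1*k2, so a unit step should settle near k1*k2.
out = simulate_cascade(k1=2.0, tau1=0.5, k2=1.5, tau2=0.2, u=1.0)
assert abs(out - 3.0) < 1e-3
```

The agreement between the simulated steady state and the transfer-function prediction is exactly the kind of frequency-domain/time-domain consistency the abstract refers to.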
Klaassen, Ger; Nentjes, Andries; Smith, Mark
2005-01-01
Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon emitting industrialized regions. Two experiments use auctions, the first a
Attachment and the Processing of Social Information across the Life Span: Theory and Evidence
Dykas, Matthew J.; Cassidy, Jude
2011-01-01
Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the…
High-pressure melting curve of KCl: Evidence against lattice-instability theories of melting
International Nuclear Information System (INIS)
Ross, M.; Wolf, G.
1986-01-01
We show that the large curvature in the T-P melting curve of KCl is the result of a reordering of the liquid to a more densely packed arrangement. As a result theories of melting, such as the instability model, which do not take into account the structure of the liquid fail to predict the correct pressure dependence of the melting curve
Polling models : from theory to traffic intersections
Boon, M.A.A.
2011-01-01
The subject of the present monograph is the study of polling models, which are queueing models consisting of multiple queues, cyclically attended by one server. Polling models originated in the late 1950s, but did not receive much attention until the 1980s when an abundance of new applications arose
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
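A minimal sketch of the fluid-analogy idea: theory constructs are treated as "inventories" with inflows, feedback, and leakage, simulated as a discrete-time dynamical system. The two constructs and all coefficients below are illustrative assumptions, not the model published by the authors:

```python
def simulate_sct(days=200, dt=1.0, cue=1.0):
    """Toy fluid-analogy simulation of reciprocal determinism: self-efficacy
    drives behavior, and performed behavior (mastery experience) feeds back
    into self-efficacy; both inventories leak over time."""
    efficacy, behavior = 0.0, 0.0
    history = []
    for _ in range(days):
        # inflow from an external cue plus feedback, minus first-order leakage
        efficacy += dt * (0.3 * cue + 0.1 * behavior - 0.2 * efficacy)
        behavior += dt * (0.2 * efficacy - 0.3 * behavior)
        history.append(behavior)
    return history

h = simulate_sct()
# With these illustrative coefficients the system rises and then settles at a
# steady behavior level instead of growing without bound.
assert h[0] < h[-1] and abs(h[-1] - h[-2]) < 1e-6
```

The point of such a model is that it makes quantitative, testable predictions (here, a specific equilibrium level and approach rate) rather than only the directional claims of the verbal theory.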
Contribution to the study of conformal theories and integrable models
International Nuclear Information System (INIS)
Sochen, N.
1992-05-01
The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models with a symmetry larger than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A program of classification of rational theories is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.
Three level constraints on conformal field theories and string models
International Nuclear Information System (INIS)
Lewellen, D.C.
1989-05-01
Simple tree level constraints for conformal field theories which follow from the requirement of crossing symmetry of four-point amplitudes are presented, and their utility for probing general properties of string models is briefly illustrated and discussed. 9 refs
Nematic elastomers: from a microscopic model to macroscopic elasticity theory.
Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette
2008-05-01
A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.
Soliton excitations in a class of nonlinear field theory models
International Nuclear Information System (INIS)
Makhan'kov, V.G.; Fedyanin, V.K.
1985-01-01
Results of the investigation of nonlinear field theory models with a Lagrangian are described. The theory includes models both with a stable zero vacuum (epsilon = 1) and with a condensate (epsilon = -1, i.e. broken symmetry). Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied. PLS form factors are calculated. The statistical mechanics of solitons is developed and their dynamic structure factors are calculated.
Two-matrix models and c =1 string theory
International Nuclear Information System (INIS)
Bonora, L.; Xiong Chuansheng
1994-05-01
We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W(1+∞) constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W(1+∞) constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)
Planar N = 4 gauge theory and the Hubbard model
International Nuclear Information System (INIS)
Rej, Adam; Serban, Didina; Staudacher, Matthias
2006-01-01
Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model
Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping
Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E
2014-01-01
Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880
Fracture flow modelling. Proof of evidence
International Nuclear Information System (INIS)
Hencher, S.R.
1996-01-01
Proof of Evidence by an expert witness is presented in support of the case by Friends of the Earth (FOE) against the proposed construction by UK Nirex Ltd of an underground Rock Characterisation Facility (RCF) at a site in the Sellafield area. The RCF is part of an investigation by Nirex into a suitable site for an underground repository for the disposal of radioactive waste. The objections were raised at a Planning Inquiry in 1995. The evidence points out that current understanding of the factors which control flow through a network of interconnecting fractures, such as that at the Sellafield site, is at a very early stage of development. Neither are the methods of investigation and analysis required for a post-closure performance assessment (PCPA) for a repository well developed. These issues are being investigated in international underground research laboratories but the proposed RCF is intended to be confirmatory rather than experimental. (23 references). (UK)
Scattering and short-distance properties in field theory models
International Nuclear Information System (INIS)
Iagolnitzer, D.
1987-01-01
The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap, such as typically P(φ) in dimension 2, φ^4 in dimension 3 and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non-renormalizable) theories that might be defined in a similar way via phase-space analysis.
The monster sporadic group and a theory underlying superstring models
International Nuclear Information System (INIS)
Chapline, G.
1996-09-01
The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs
Consumer preference models: fuzzy theory approach
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
Narrative theories as computational models: reader-oriented theory and artificial intelligence
Energy Technology Data Exchange (ETDEWEB)
Galloway, P.
1983-12-01
In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding, as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory shares many common interests and problems, and both studies might benefit from an exchange of ideas. 11 references.
A Dynamic Systems Theory Model of Visual Perception Development
Coté, Carol A.
2015-01-01
This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…
Membrane models and generalized Z2 gauge theories
International Nuclear Information System (INIS)
Lowe, M.J.; Wallace, D.J.
1980-01-01
We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in a 1/n expansion. The potential relationships of these models to generalized Z2 gauge theories are indicated. (orig.)
Theories and Frameworks for Online Education: Seeking an Integrated Model
Picciano, Anthony G.
2017-01-01
This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…
Measurement Models for Reasoned Action Theory.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-03-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
Modeling acquaintance networks based on balance theory
Directory of Open Access Journals (Sweden)
Vukašinović Vida
2014-09-01
Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the networks generated by the IB model turn out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network, because the properties of the IB model more closely matched those of the e-mail URV network than those of the other models.
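A toy sketch of the interaction mechanism described above: tie weights are updated by positive and negative interactions, and partner choice is biased toward existing strong ties. All parameters (positivity rate, baseline weight, network size) are invented for illustration and do not reproduce the published IB model:

```python
import random

def ib_model(n=30, interactions=2000, seed=1):
    """Toy interaction-based network: each interaction between two actors is
    positive or negative and shifts the tie weight; actors with stronger ties
    to i are more likely to be chosen as i's next partner."""
    random.seed(seed)
    w = {}  # tie weights, keyed by frozenset({i, j})
    for _ in range(interactions):
        i = random.randrange(n)
        # choose a partner, biased toward i's existing strong ties
        partners = [(j, w.get(frozenset((i, j)), 0.1)) for j in range(n) if j != i]
        r = random.uniform(0, sum(p for _, p in partners))
        for j, p in partners:
            r -= p
            if r <= 0:
                break
        delta = 1 if random.random() < 0.7 else -1  # positive or negative interaction
        key = frozenset((i, j))
        w[key] = max(0.0, w.get(key, 0.0) + delta)  # ties cannot go negative
    return w

ties = ib_model()
# Preferential partner choice concentrates interactions on existing ties,
# so far fewer than all n*(n-1)/2 possible pairs ever interact.
assert 1 <= len(ties) <= 30 * 29 // 2
```

The reinforcement loop (strong ties attract further interaction) is what produces the sparse, clustered structure the evaluation in the abstract reports.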
Modeling in applied sciences a kinetic theory approach
Pulvirenti, Mario
2000-01-01
Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develops the foundations of kinetic models and discusses the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...
Baldrige Theory into Practice: A Generic Model
Arif, Mohammed
2007-01-01
Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…
A Reflection on Research, Theory, Evidence-based Practice, and Quality Improvement
Directory of Open Access Journals (Sweden)
Eesa Mohammadi
2016-04-01
While each process is associated with its unique characteristics, overlaps are likely to appear between each of the two processes. For instance, in the EBP process, if one discovers (theory) that evidence is inadequate to implement a certain intervention, it highlights the need for research on that specific subject. Similarly, QI may lead to the identification of new questions, which could be used for research purposes. All the discussed processes, as well as their scientific and professional dimensions, are essential to nursing disciplines in healthcare systems.
Optimal transportation networks models and theory
Bernot, Marc; Morel, Jean-Michel
2009-01-01
The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.
Selection Bias in Educational Transition Models: Theory and Empirical Evidence
DEFF Research Database (Denmark)
Holm, Anders; Jæger, Mads
variables. This paper, first, explains theoretically how selection on unobserved variables leads to waning coefficients and, second, illustrates empirically how selection leads to biased estimates of the effect of family background on educational transitions. Our empirical analysis using data from...
Intervention mapping: a process for developing theory- and evidence-based health education programs.
Bartholomew, L K; Parcel, G S; Kok, G
1998-10-01
The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.
International Nuclear Information System (INIS)
Hsu, T.C.T.
1989-01-01
This thesis describes work on a large-U Hubbard model theory for high temperature superconductors. After an introduction to recent developments in the field, the author reviews experimental results. At the same time he introduces the holon-spinon model and comments on its successes and shortcomings. Using this heuristic model he then describes a holon pairing theory of superconductivity and lists some experimental evidence for this interlayer coupling theory. The latter part of the thesis is devoted to projected fermion mean field theories. They are introduced by applying this theory and some recently developed computational techniques to anisotropic antiferromagnets. This scheme is shown to give quantitatively good results for the two dimensional square lattice Heisenberg AFM. The results have definite implications for a spinon theory of quantum antiferromagnets. Finally he studies flux phases and other variational prescriptions for obtaining low lying states of the Hubbard model
The Relevance of Using Mathematical Models in Macroeconomic Policies Theory
Directory of Open Access Journals (Sweden)
Nora Mihail
2006-11-01
Full Text Available The article presents an overview of the principal mathematical models – starting with the work of Theil, Hansen and Tinbergen – and their results, used to analyse and design macroeconomic policies. In the modeling field, changes are very fast, both in the theoretical aspects of modeling the many problems of macroeconomic policies and in the practical elaboration of different policy models. The article points out the problems of static and dynamic theory used in macro-policy modeling.
Fire and Heat Spreading Model Based on Cellular Automata Theory
Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.
2018-05-01
The distinctive feature of the proposed fire and heat spreading model in premises is the reduction of the computational complexity due to the use of the theory of cellular automata with probability rules of behavior. The possibilities and prospects of using this model in practice are noted. The proposed model has a simple mechanism of integration with agent-based evacuation models. The joint use of these models could improve floor plans and reduce the time of evacuation from premises during fires.
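The probabilistic cellular-automaton idea behind such fire-spread models can be sketched in a few lines. The grid size, ignition probability, and one-step burnout rule below are illustrative assumptions, not the authors' calibrated model:

```python
import random

# Illustrative sketch: a 2-D cellular automaton in which a burning cell
# ignites each intact (TREE) neighbour with probability p_ignite and
# burns out (EMPTY) after one step.
EMPTY, TREE, FIRE = 0, 1, 2

def step(grid, p_ignite=0.6, rng=random):
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == FIRE:
                new[i][j] = EMPTY  # burns out after one step
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] == TREE:
                        if rng.random() < p_ignite:
                            new[ni][nj] = FIRE
    return new

rng = random.Random(0)
grid = [[TREE] * 9 for _ in range(9)]
grid[4][4] = FIRE          # ignition point in the centre
for _ in range(6):
    grid = step(grid, rng=rng)
burned = sum(row.count(EMPTY) for row in grid)
```

Because each cell's update touches only its four neighbours, the per-step cost is linear in the number of cells, which is the computational-complexity advantage the abstract refers to.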
Matrix model as a mirror of Chern-Simons theory
International Nuclear Information System (INIS)
Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun
2004-01-01
Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)
Meredith, Pamela Joy
2013-04-01
Theoretical and empirical evidence suggests that adult attachment and pain-related variables are predictably and consistently linked, and that understanding these links may guide pain intervention and prevention efforts. In general, insecure attachment has been portrayed as a risk factor, and secure attachment as a protective factor, for people with chronic pain conditions. In an effort to better understand the relationships among attachment and pain variables, these links have been investigated in pain-free samples using induced-pain techniques. The present paper reviews the available research linking adult attachment and laboratory-induced pain. While the diverse nature of the studies precludes definitive conclusions, together these papers offer support for associations between insecure attachment and a more negative pain experience. The evidence presented in this review highlights areas for further empirical attention, as well as providing some guidance for clinicians who may wish to employ preventive approaches and other interventions informed by attachment theory.
The Role of Adolescent Development in Social Networking Site Use: Theory and Evidence
Directory of Open Access Journals (Sweden)
Drew P. Cingel
2014-03-01
Full Text Available Using survey data collected from 260 children, adolescents, and young adults between the ages of 9 and 26, this paper offers evidence for a relationship between social networking site use and Imaginary Audience, a developmental variable in which adolescents believe others are thinking about them at all times. Specifically, after controlling for a number of variables, results indicate a significant, positive relationship between social networking site use and Imaginary Audience ideation. Additionally, results indicate a positive relationship between Imaginary Audience ideation and Facebook customization practices. Together, these findings provide evidence, based on Vygotskian developmental theory, for a general consideration of the role that currently available tools, in this case social networking sites, can have on development. Thus, findings implicate both the role of development on social networking site use, as well as the role of social networking site use on development. Overall, these findings have important implications for the study of media and human development, which are discussed in detail.
Contemporary Cognitive Behavior Therapy: A Review of Theory, History, and Evidence.
Thoma, Nathan; Pilecki, Brian; McKay, Dean
2015-09-01
Cognitive behavior therapy (CBT) has come to be a widely practiced psychotherapy throughout the world. The present article reviews theory, history, and evidence for CBT. It is meant as an effort to summarize the forms and scope of CBT to date for the uninitiated. Elements of CBT such as cognitive therapy, behavior therapy, and so-called "third wave" CBT, such as dialectical behavior therapy (DBT) and acceptance and commitment therapy (ACT) are covered. The evidence for the efficacy of CBT for various disorders is reviewed, including depression, anxiety disorders, personality disorders, eating disorders, substance abuse, schizophrenia, chronic pain, insomnia, and child/adolescent disorders. The relative efficacy of medication and CBT, or their combination, is also briefly considered. Future directions for research and treatment development are proposed.
Mixed models theory and applications with R
Demidenko, Eugene
2013-01-01
Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g
Nonconvex Model of Material Growth: Mathematical Theory
Ganghoffer, J. F.; Plotnikov, P. I.; Sokolowski, J.
2018-06-01
The model of volumetric material growth is introduced in the framework of finite elasticity. The new results obtained for the model are presented with complete proofs. The state variables include the deformations, temperature and the growth factor matrix function. The existence of global in time solutions for the quasistatic deformations boundary value problem coupled with the energy balance and the evolution of the growth factor is shown. The mathematical results can be applied to a wide class of growth models in mechanics and biology.
International Nuclear Information System (INIS)
Ingale, S. V.; Datta, D.
2010-01-01
The consequences of an accidental release of radioactivity from a nuclear power plant are assessed in terms of the exposure or dose to members of the public, and the assessment of risk is routed through this dose computation. Dose computation depends on the basic dose assessment model and the exposure pathways; one such pathway is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
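The belief/plausibility bounds mentioned in the abstract come from Dempster-Shafer evidence theory; a minimal sketch (with a hypothetical three-element risk frame, not the paper's dose model) is:

```python
# Illustrative Dempster-Shafer sketch: a basic mass assignment maps
# focal sets (frozensets) to masses summing to 1; belief and
# plausibility then bound the "probability" of any event.
def belief(mass, event):
    # Bel(A): total mass of focal sets wholly contained in A
    return sum(m for s, m in mass.items() if s <= event)

def plausibility(mass, event):
    # Pl(A): total mass of focal sets that intersect A
    return sum(m for s, m in mass.items() if s & event)

# Hypothetical frame for an imprecise dose parameter.
frame = frozenset({"low", "med", "high"})
mass = {frozenset({"low"}): 0.5,
        frozenset({"low", "med"}): 0.3,
        frame: 0.2}
A = frozenset({"low", "med"})
bel, pl = belief(mass, A), plausibility(mass, A)  # 0.8 and 1.0
```

The interval [Bel(A), Pl(A)] is the bound on the risk that the abstract describes: belief counts only evidence that definitely supports A, plausibility everything not definitely against it.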
Solid mechanics theory, modeling, and problems
Bertram, Albrecht
2015-01-01
This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.
Modeling workplace bullying using catastrophe theory.
Escartin, J; Ceja, L; Navarro, J; Zapf, D
2013-10-01
Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
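The information-criterion comparison that drives the study's conclusion can be illustrated with hypothetical numbers; the residual sums of squares and parameter counts below are invented for illustration, not the study's results:

```python
import math

# For least-squares fits, a common AIC form is 2k + n*ln(RSS/n);
# the model with the lower value is preferred.
def aic(rss, n, k):
    return 2 * k + n * math.log(rss / n)

n = 100
aic_linear = aic(rss=52.0, n=n, k=3)  # hypothetical linear model, 3 parameters
aic_cusp = aic(rss=50.5, n=n, k=6)    # more parameters, only slightly better fit
prefer_linear = aic_linear < aic_cusp
```

The point the example makes is the one in the abstract: a more flexible model (here, the cusp with more parameters) is only preferred if its fit improves enough to pay the complexity penalty.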
Spatial interaction models facility location using game theory
D'Amato, Egidio; Pardalos, Panos
2017-01-01
Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus an approach to facility location theory through game theoretical tools highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding the facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.
Electrorheological fluids modeling and mathematical theory
Růžička, Michael
2000-01-01
This is the first book to present a model, based on rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDS systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and with a nontrivial nonlinear r.h.s. and the first ever results for parabolic systems with a non-standard growth conditions are given for the first time. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.
Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C; Escoffery, Cam T; Herrmann, Alison K; Thatcher, Esther; Hartman, Marieke A; Fernandez, Maria E
2017-02-01
Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.
Evidence for the epistemic view of quantum states: A toy theory
International Nuclear Information System (INIS)
Spekkens, Robert W.
2007-01-01
We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. Many quantum phenomena are found to have analogues within this toy theory. These include the noncommutativity of measurements, interference, the multiplicity of convex decompositions of a mixed state, the impossibility of discriminating nonorthogonal states, the impossibility of a universal state inverter, the distinction between bipartite and tripartite entanglement, the monogamy of pure entanglement, no cloning, no broadcasting, remote steering, teleportation, entanglement swapping, dense coding, mutually unbiased bases, and many others. The diversity and quality of these analogies is taken as evidence for the view that quantum states are states of incomplete knowledge rather than states of reality. A consideration of the phenomena that the toy theory fails to reproduce, notably, violations of Bell inequalities and the existence of a Kochen-Specker theorem, provides clues for how to proceed with this research program
Density functional theory and multiscale materials modeling
Indian Academy of Sciences (India)
One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.
Toda theories, W-algebras, and minimal models
International Nuclear Information System (INIS)
Mansfield, P.; Spence, B.
1991-01-01
We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.
Computational hemodynamics theory, modelling and applications
Tu, Jiyuan; Wong, Kelvin Kian Loong
2015-01-01
This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...
Fuzzy Stochastic Optimization Theory, Models and Applications
Wang, Shuming
2012-01-01
Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies. The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...
Nonlinear model predictive control theory and algorithms
Grüne, Lars
2017-01-01
This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories
International Nuclear Information System (INIS)
Schiappa, Ricardo; Wyllard, Niclas
2010-01-01
We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4, N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.
Lenses on reading an introduction to theories and models
Tracey, Diane H
2017-01-01
Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a
Perturbation theory instead of large scale shell model calculations
International Nuclear Information System (INIS)
Feldmeier, H.; Mankos, P.
1977-01-01
Results of large-scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory, which provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation-theory treatment in an SU(3) basis including 2ħω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.)
Scaling theory of depinning in the Sneppen model
International Nuclear Information System (INIS)
Maslov, S.; Paczuski, M.
1994-01-01
We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥(d) and ν_⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor.
Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations
DEFF Research Database (Denmark)
Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing
2007-01-01
Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
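As a sketch of the kind of G^E model such FST-derived parameters would feed, the two-parameter (modified) Margules expressions for the activity coefficients can be coded directly; the A12 and A21 values here are illustrative, not fitted results from the paper:

```python
import math

# Two-parameter Margules model for a binary mixture:
#   ln g1 = x2^2 * (A12 + 2*(A21 - A12)*x1)
#   ln g2 = x1^2 * (A21 + 2*(A12 - A21)*x2)
# A12 and A21 are the infinite-dilution limits of ln g1 and ln g2.
def margules_gammas(x1, A12, A21):
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Illustrative parameter values (hypothetical, not from the paper):
g1, g2 = margules_gammas(0.3, A12=1.2, A21=0.8)
g1_inf, _ = margules_gammas(0.0, 1.2, 0.8)  # infinite-dilution limit: e^A12
```

The infinite-dilution check (ln γ1 → A12 as x1 → 0) is a quick consistency test on any implementation of the model.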
The Use of Modelling for Theory Building in Qualitative Analysis
Briggs, Ann R. J.
2007-01-01
The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…
Goodness-of-Fit Assessment of Item Response Theory Models
Maydeu-Olivares, Alberto
2013-01-01
The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate p-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…
Collectivism and coping: current theories, evidence, and measurements of collective coping.
Kuo, Ben C H
2013-01-01
A burgeoning body of cultural coping research has begun to identify the prevalence and the functional importance of collective coping behaviors among culturally diverse populations in North America and internationally. These emerging findings are highly significant as they evidence culture's impacts on the stress-coping process via collectivistic values and orientation. They provide a critical counterpoint to the prevailing Western, individualistic stress and coping paradigm. However, current research and understanding about collective coping appear to be piecemeal and not well integrated. To address this issue, this review attempts to comprehensively survey, summarize, and evaluate existing research related to collective coping and its implications for coping research with culturally diverse populations from multiple domains. Specifically, this paper reviews relevant research and knowledge on collective coping in terms of: (a) operational definitions; (b) theories; (c) empirical evidence based on studies of specific cultural groups and broad cultural values/dimensions; (d) measurements; and (e) implications for future cultural coping research. Overall, collective coping behaviors are conceived as a product of the communal/relational norms and values of a cultural group across studies. They also encompass a wide array of stress responses ranging from value-driven to interpersonally based to culturally conditioned emotional/cognitive to religion- and spirituality-grounded coping strategies. In addition, this review highlights: (a) the relevance and the potential of cultural coping theories to guide future collective coping research; (b) growing evidence for the prominence of collective coping behaviors particularly among Asian nationals, Asian Americans/Canadians and African Americans/Canadians; (c) preference for collective coping behaviors as a function of collectivism and interdependent cultural value and orientation; and (d) six cultural coping scales. This
Adenzato, Mauro; Todisco, Patrizia; Ardito, Rita B
2012-01-01
The findings of the few studies that have to date investigated the way in which individuals with Anorexia Nervosa (AN) navigate their social environment are somewhat contradictory. We undertook this study to shed new light on the social-cognitive profile of patients with AN, analysing Theory of Mind and emotional functioning. Starting from previous evidence on the role of the amygdala in the neurobiology of AN and in the social cognition, we hypothesise preserved Theory of Mind and impaired emotional functioning in patients with AN. Thirty women diagnosed with AN and thirty-two women matched for education and age were involved in the study. Theory of Mind and emotional functioning were assessed with a set of validated experimental tasks. A measure of perceived social support was also used to test the correlations between this dimension and the social-cognitive profile of AN patients. The performance of patients with AN is significantly worse than that of healthy controls on tasks assessing emotional functioning, whereas patients' performance is comparable to that of healthy controls on the Theory of Mind task. Correlation analyses showed no relationship between scores on any of the social-cognition tasks and either age of onset or duration of illness. A correlation between social support and emotional functioning was found. This latter result seems to suggest a potential role of social support in the treatment and recovery of AN. The pattern of results followed the experimental hypothesis. They may be useful to help us better understand the social-cognitive profile of patients with AN and to contribute to the development of effective interventions based on the ways in which patients with AN actually perceive their social environment.
Optimal velocity difference model for a car-following theory
International Nuclear Information System (INIS)
Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.
2011-01-01
In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model. The linear stability condition of the new model is obtained by using linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of the negative velocity that occurs at small sensitivity coefficient λ in the full velocity difference model by adjusting the coefficient of the optimal velocity difference, which shows that collisions can disappear in the improved model. Highlights: A new optimal velocity difference car-following model is proposed. The effects of the optimal velocity difference on the stability of traffic flow are explored. The starting and braking processes are carried out through simulation. The optimal velocity difference can avoid the disadvantage of negative velocity.
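A minimal Euler-step simulation of the baseline full velocity difference model that the Letter builds on can illustrate the setup; the optimal velocity function and all parameter values below are illustrative assumptions, not the Letter's calibration:

```python
import math

# Full velocity difference (FVD) car-following sketch:
#   dv_i/dt = a*(V(dx_i) - v_i) + lam*(v_{i+1} - v_i)
# where V() is an optimal velocity function of the headway dx_i.
def V(dx, vmax=2.0, hc=4.0):
    # Illustrative optimal velocity: 0 at zero headway, saturating at vmax.
    return 0.5 * vmax * (math.tanh(dx - hc) + math.tanh(hc))

def step(xs, vs, a=1.0, lam=0.5, dt=0.1):
    # Car i follows car i+1; the lead car (last index) keeps its speed.
    n = len(xs)
    acc = [0.0] * n
    for i in range(n - 1):
        dx = xs[i + 1] - xs[i]
        dv = vs[i + 1] - vs[i]
        acc[i] = a * (V(dx) - vs[i]) + lam * dv
    vs = [v + az * dt for v, az in zip(vs, acc)]
    xs = [x + v * dt for x, v in zip(xs, vs)]
    return xs, vs

# Three cars starting from rest behind a lead car moving at speed 1.
xs, vs = [0.0, 3.0, 6.0], [0.0, 0.0, 1.0]
for _ in range(200):
    xs, vs = step(xs, vs)
```

The velocity-difference term (λ·Δv) is what damps the unrealistic decelerations of the plain optimal velocity model; the Letter's contribution is to replace/augment it with an optimal velocity difference term whose coefficient can be tuned to eliminate negative velocities.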
Directory of Open Access Journals (Sweden)
Nasser Al-Horais
2012-11-01
Full Text Available The Minimalist Program is a major line of inquiry that has been developing inside Generative Grammar since the early nineties, when it was proposed by Chomsky (1993, 1995). At that time, Chomsky (1998: 5) presented the Minimalist Program as a program, not as a theory, but today the Minimalist Program lays out a very specific view of the basis of syntactic grammar that, when compared to other formalisms, is often taken to look very much like a theory. The prime concern of this paper, however, is to provide a comprehensive and accessible introduction to the minimalist approach to the theory of grammar. In this regard, the paper discusses some new ideas articulated recently by Chomsky that have led to several fundamental improvements in syntactic theory, such as changing the function of movement and the Extended Projection Principle (EPP) feature, or proposing new theories such as Phases and Feature Inheritance. In order to evidence the significance of these fundamental improvements, the current paper provides a minimalist analysis to account for agreement and word-order asymmetry in Standard Arabic. This fresh minimalist account meets the challenges (to the basic tenets of syntactic theory occurred
Evidence for the multiverse in the standard model and beyond
International Nuclear Information System (INIS)
Hall, Lawrence J.; Nomura, Yasunori
2008-01-01
In any theory it is unnatural if the observed values of parameters lie very close to special values that determine the existence of complex structures necessary for observers. A naturalness probability P is introduced to numerically evaluate the degree of unnaturalness. If P is very small in all known theories, corresponding to a high degree of fine-tuning, then there is an observer naturalness problem. In addition to the well-known case of the cosmological constant, we argue that nuclear stability and electroweak symmetry breaking represent significant observer naturalness problems. The naturalness probability associated with nuclear stability depends on the theory of flavor, but for all known theories is conservatively estimated as P_nuc ∼ (10^-3 – 10^-2), and for simple theories of electroweak symmetry breaking P_EWSB ∼ (10^-2 – 10^-1). This pattern of unnaturalness in three different arenas, cosmology, nuclear physics, and electroweak symmetry breaking, provides evidence for the multiverse, since each problem may be easily solved by environmental selection. In the nuclear case the problem is largely solved even if the multiverse distribution for the relevant parameters is relatively flat. With somewhat strongly varying distributions, it is possible to understand both the close proximity to neutron stability and the values of m_e and m_d − m_u in terms of the electromagnetic mass difference between the proton and neutron, δ_EM ≅ 1 ± 0.5 MeV. It is reasonable that multiverse distributions are strong functions of Lagrangian parameters, since they depend not only on the landscape of vacua, but also on the population mechanism, "integrating out" other parameters, and on a density of observers factor. In any theory with mass scale M that is the origin of electroweak symmetry breaking, strongly varying multiverse distributions typically lead either to a little hierarchy v/M ≅ (10^-2 – 10^-1), or to a large hierarchy v^2/M^2 suppressed by an extra loop factor, as well as by the
Francis, Jill J; Stockton, Charlotte; Eccles, Martin P; Johnston, Marie; Cuthbertson, Brian H; Grimshaw, Jeremy M; Hyde, Chris; Tinmouth, Alan; Stanworth, Simon J
2009-11-01
Many theories of behaviour are potentially relevant to predictive and intervention studies, but most studies investigate a narrow range of theories. Michie et al. (2005) agreed on 12 'theoretical domains' from 33 theories that explain behaviour change. They developed a 'Theoretical Domains Interview' (TDI) for identifying relevant domains for specific clinical behaviours, but the framework has not been used for selecting theories for predictive studies. It was used here to investigate clinicians' transfusion behaviour in intensive care units (ICUs). Evidence suggests that red blood cell transfusion could be reduced for some patients without reducing quality of care. (1) To identify the domains relevant to transfusion practice in ICUs and neonatal intensive care units (NICUs), using the TDI. (2) To use the identified domains to select appropriate theories for a study predicting transfusion behaviour. An adapted TDI about managing a patient with borderline haemoglobin by watching and waiting instead of transfusing red blood cells was used to conduct semi-structured, one-to-one interviews with 18 intensive care consultants and neonatologists across the UK. Relevant theoretical domains were: knowledge, beliefs about capabilities, beliefs about consequences, social influences, and behavioural regulation. Further analysis at the construct level resulted in selection of seven theoretical approaches relevant to this context: the Knowledge-Attitude-Behaviour Model, Theory of Planned Behaviour, Social Cognitive Theory, Operant Learning Theory, Control Theory, Normative Model of Work Team Effectiveness, and Action Planning Approaches. This study illustrated the use of the TDI to identify relevant domains in a complex area of inpatient care. This approach is potentially valuable for selecting theories relevant to predictive studies and resulted in greater breadth of potential explanations than would be achieved if a single theoretical model had been adopted.
International Nuclear Information System (INIS)
Klaassen, Ger; Nentjes, Andries; Smith, Mark
2005-01-01
Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon-emitting industrialized regions. Two experiments use auctions, the first a single-bid auction and the second a Walrasian auction. The third relies on bilateral, sequential trading. The paper finds that, in line with the standard theory, both auctions and bilateral, sequential trading capture a significant part (88% to 99%) of the potential cost savings of emission trading. As expected from trade theory, all experiments show that the market price converges (although not fully) to the market equilibrium price. In contrast to the theory, the results also suggest that not every country might gain from trading. In both the bilateral trading experiment and the Walrasian auction, one country actually is worse off with trade. In particular, bilateral, sequential trading leads to a distribution of gains significantly different from the competitive market outcome. This is due to speculative behavior, imperfect foresight, and market power.
Measuring Convergence using Dynamic Equilibrium Models: Evidence from Chinese Provinces
DEFF Research Database (Denmark)
Pan, Lei; Posch, Olaf; van der Wel, Michel
We propose a model to study economic convergence in the tradition of neoclassical growth theory. We employ a novel stochastic set-up of the Solow (1956) model with shocks to both capital and labor. Our novel approach identifies the speed of convergence directly from estimating the parameters which...
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2018-03-01
Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
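The overall reliability estimation the abstract mentions can be made concrete with a minimal sketch. Below is a one-facet (persons crossed with items) G-study in Python: variance components are estimated from the usual expected-mean-square identities and combined into the relative generalizability coefficient. The 5x4 score matrix is a hypothetical example, not data from the article.

```python
import numpy as np

# One-facet crossed G-study (persons x items): estimate variance components
# from mean squares and form the relative generalizability coefficient
# E(rho^2) = var_p / (var_p + var_res / n_i). Scores below are illustrative.
X = np.array([[5, 4, 5, 4],
              [3, 2, 3, 3],
              [4, 4, 5, 5],
              [2, 1, 2, 2],
              [5, 5, 4, 5]], dtype=float)
n_p, n_i = X.shape

grand = X.mean()
ms_p = n_i * ((X.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)
ss_res = ((X - X.mean(axis=1, keepdims=True)
             - X.mean(axis=0, keepdims=True) + grand) ** 2).sum()
ms_res = ss_res / ((n_p - 1) * (n_i - 1))

var_p = max((ms_p - ms_res) / n_i, 0.0)  # person (universe-score) variance
var_res = ms_res                         # residual (interaction + error) variance
g_coef = var_p / (var_p + var_res / n_i)
print(f"G coefficient = {g_coef:.3f}")
```

The same expected-mean-square logic extends to multi-facet designs, which is where dedicated G-theory packages become useful.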
Hannah, David R.; Venkatachary, Ranga
2010-01-01
In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the RiskMetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
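A minimal peaks-over-threshold sketch (Python with SciPy) may make the two-stage idea concrete. It is not the paper's model: the threshold here is a plain 95th-percentile quantile rather than a wavelet-derived one, and the losses are simulated heavy-tailed data standing in for ISE/BUX returns.

```python
import numpy as np
from scipy.stats import genpareto

# Simulated heavy-tailed daily losses (positive = loss), an assumed stand-in
# for real index returns.
rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=2000)

# Peaks-over-threshold: fit a generalized Pareto distribution (GPD) to
# exceedances above a high threshold u (simple quantile here; the paper
# derives the threshold from a wavelet decomposition instead).
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0.0)

# POT value-at-risk at confidence level p:
#   VaR_p = u + (beta/xi) * ((n/N_u * (1-p))**(-xi) - 1)
n, n_u, p = len(losses), len(exceedances), 0.99
var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"threshold u = {u:.3f}, 99% VaR = {var_p:.3f}")
```

Backtesting such a model then amounts to counting violations (days the realized loss exceeds VaR) and applying tail-loss tests, as the abstract describes.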
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How to capture and preserve digital evidence securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
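The Bayesian causal learning described above can be sketched with a toy "blicket detector" style update: two hypotheses about which object activates a machine, scored against a few observed trials via Bayes' rule. All numbers (uniform priors, the 0.9/0.1 noise rates, the trials themselves) are illustrative assumptions, not from the article.

```python
# Toy Bayesian causal inference in the spirit of the "theory theory":
# two causal hypotheses updated on trial-by-trial evidence.
priors = {"A causes": 0.5, "B causes": 0.5}

def likelihood(hypothesis, objects, activated):
    """P(observation | hypothesis): machine should activate iff the
    hypothesized cause is present, with a small noise rate."""
    cause = hypothesis[0]          # "A" or "B"
    predicted = cause in objects
    return 0.9 if predicted == activated else 0.1

# Each trial: (objects placed on the machine, did the machine activate?)
trials = [({"A", "B"}, True), ({"A"}, True), ({"B"}, False)]

posterior = dict(priors)
for objects, activated in trials:
    for h in posterior:
        posterior[h] *= likelihood(h, objects, activated)
    z = sum(posterior.values())
    posterior = {h: p_h / z for h, p_h in posterior.items()}

print(posterior)  # belief shifts strongly toward "A causes"
```

After the unambiguous second and third trials, the posterior concentrates on the hypothesis that object A is the cause, mirroring how children are said to infer causal structure from statistical information.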
M-Theory Model-Building and Proton Stability
Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.
1998-01-01
We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \\times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
M-theory model-building and proton stability
International Nuclear Information System (INIS)
Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens
1997-09-01
The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
Algebraic computability and enumeration models recursion theory and descriptive complexity
Nourani, Cyrus F
2016-01-01
This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...
Theory to practice: the humanbecoming leading-following model.
Ursel, Karen L
2015-01-01
Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory-to-practice link. The author perceived that, with the implementation of Situation-Background-Assessment-Recommendation (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project that would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges, with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities in which a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.
Theory of Time beyond the standard model
International Nuclear Information System (INIS)
Poliakov, Eugene S.
2008-01-01
A frame of non-uniform time is discussed. A concept of 'flow of time' is presented. A principle of time relativity, in analogy with the Galilean principle of relativity, is set out. An equivalence principle is stated: the outcome of non-uniform time in an inertial frame of reference is equivalent to the outcome of a fictitious gravity force external to the frame of reference. Thus it is the flow of time that causes gravity rather than mass. The latter is compared to experimental data, achieving precision of up to 0.0003%. It is shown that the law of energy conservation is inapplicable to frames of non-uniform time. A theoretical model of a physical entity (point mass, photon) travelling in a field of non-uniform time is considered. A generalized law that allows the flow of time to replace classical energy conservation is introduced on the basis of the experiment of Pound and Rebka. It is shown that a linear dependence of the flow of time on the spatial coordinate conforms to the inverse-square law of universal gravitation and Keplerian mechanics. Momentum is shown to still be conserved.
Standard Model theory calculations and experimental tests
International Nuclear Information System (INIS)
Cacciari, M.; Hamel de Monchenault, G.
2015-01-01
To present knowledge, all the physics at the Large Hadron Collider (LHC) can be described in the framework of the Standard Model (SM) of particle physics. Indeed the newly discovered Higgs boson with a mass close to 125 GeV seems to confirm the predictions of the SM. Thus, besides looking for direct manifestations of the physics beyond the SM, one of the primary missions of the LHC is to perform ever more stringent tests of the SM. This requires not only improved theoretical developments to produce testable predictions and provide experiments with reliable event generators, but also sophisticated analyses techniques to overcome the formidable experimental environment of the LHC and perform precision measurements. In the first section, we describe the state of the art of the theoretical tools and event generators that are used to provide predictions for the production cross sections of the processes of interest. In section 2, inclusive cross section measurements with jets, leptons and vector bosons are presented. Examples of differential cross sections, charge asymmetries and the study of lepton pairs are proposed in section 3. Finally, in section 4, we report studies on the multiple production of gauge bosons and constraints on anomalous gauge couplings
Models with oscillator terms in noncommutative quantum field theory
International Nuclear Information System (INIS)
Kronberger, E.
2010-01-01
The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs, we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should better start from a theory where those terms are already built in. Fortunately, there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it also contains some unphysical terms linear in the gauge field. Advantageously, we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as the gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations showing the expected behavior. The next step is to show renormalizability of the model, and some hints in this direction are also given. (author)
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
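The foundational quantity here, Shannon entropy, is straightforward to compute directly. The sketch below contrasts a sharply tuned response distribution with a flat (disorganized) one over four hypothetical stimulus categories; the numbers are illustrative stand-ins for the tuning deficits the article discusses, not data from it.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply tuned neural response concentrates probability on one category
# (low entropy); a disrupted, disorganized response spreads it evenly
# (maximal entropy for four categories).
tuned = [0.85, 0.05, 0.05, 0.05]
disrupted = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(tuned))      # ≈ 0.85 bits
print(shannon_entropy(disrupted))  # 2.0 bits (the maximum for 4 outcomes)
```

On this view, a loss of tuning shows up quantitatively as an entropy increase, which is the kind of metric the article proposes for evaluating the integrity of neural goal functions.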
Reservoir theory, groundwater transit time distributions, and lumped parameter models
International Nuclear Information System (INIS)
Etcheverry, D.; Perrochet, P.
1999-01-01
The relation between groundwater residence times and transit times is given by the reservoir theory. It allows one to calculate theoretical transit time distributions in a deterministic way, analytically or on numerical models. Two analytical solutions validate the piston-flow and the exponential model for simple conceptual flow systems. A numerical solution of a hypothetical regional groundwater flow shows that lumped parameter models could be applied in some cases to large-scale, heterogeneous aquifers. (author)
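The exponential (well-mixed reservoir) lumped-parameter model validated above has a simple closed form worth sketching: the transit time density is g(t) = exp(-t/T)/T, with mean transit time T = V/Q (mobile volume over flow rate). The volume and flow figures below are assumptions for illustration only.

```python
import numpy as np

# Exponential lumped-parameter model of a well-mixed groundwater reservoir.
# Transit time density: g(t) = exp(-t/T) / T, mean transit time T = V/Q.
V = 5.0e6  # mobile water volume [m^3] (assumed)
Q = 1.0e5  # steady recharge = discharge [m^3/yr] (assumed)
T = V / Q  # mean transit time [yr]

def younger_than(a, T=T):
    """Cumulative transit time distribution: fraction of outflow with
    age less than a, F(a) = 1 - exp(-a/T)."""
    return 1.0 - np.exp(-a / T)

print(T)               # 50.0 yr
print(younger_than(T)) # ≈ 0.632: most outflow is younger than the mean age
```

The skew of the exponential density (many young flow paths, a long old tail) is exactly what distinguishes it from the piston-flow model, where all water exits at age T.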
Theory of compressive modeling and simulation
Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith
2013-05-01
Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from computing power and turnaround time. We suggest a third approach, which we call (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. CM&S based on MFE can generalize LCNN to 2nd order as a nonlinear augmented LCNN. For example, during sunset, we can avoid a reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon: with CM&S we can use a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with a filter into 2 vector components (8-10 μm and 10-12 μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then we up-shifted consistently, according to the de-mixed source map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, which suffers less blur owing to scattering by dusty smoke and enjoys apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in the reflectivity, and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computing, in the sense of super-saving CS: measuring one and getting one's neighborhood free.
Consistent constraints on the Standard Model Effective Field Theory
International Nuclear Information System (INIS)
Berthier, Laure; Trott, Michael
2016-01-01
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEPI and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include as an illustrative example.
Effective potential in Lorentz-breaking field theory models
Energy Technology Data Exchange (ETDEWEB)
Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)
2017-12-15
We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)
Lenses on Reading An Introduction to Theories and Models
Tracey, Diane H
2012-01-01
This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition
Effective potential in Lorentz-breaking field theory models
International Nuclear Information System (INIS)
Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.
2017-01-01
We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)
A signal detection-item response theory model for evaluating neuropsychological measures.
Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G
2018-02-05
Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
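The two signal detection constructs the SD-IRT models inherit, memory discrimination and response bias, reduce under the standard equal-variance Gaussian model to d' and c computed from hit and false-alarm rates. The sketch below uses illustrative counts, not data from the Marine Resiliency Study.

```python
from scipy.stats import norm

# Equal-variance Gaussian signal detection for a recognition memory test:
# d' (discrimination) and c (response bias) from hit/false-alarm rates.
hits, misses = 45, 5   # responses to old (studied) items (illustrative)
fas, crs = 12, 38      # responses to new (lure) items (illustrative)

hit_rate = hits / (hits + misses)  # 0.90
fa_rate = fas / (fas + crs)        # 0.24

# d' = z(hit rate) - z(false-alarm rate); c = -(z(H) + z(FA)) / 2
d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```

What the SD-IRT approach adds on top of this classical scoring is item-level modeling: difficulty and examinee ability enter through the item response theory side, so measurement error can vary across the ability range instead of being assumed constant.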
Educating Occupational Therapists in the Use of Theory and Evidence to Enhance Supervision Practice
Directory of Open Access Journals (Sweden)
Melanie J. Roberts
2017-10-01
This paper describes the implementation of a unique learning experience aimed at enhancing the quality of supervision practice in occupational therapy at the Gold Coast Hospital and Health Service. The package was designed by experienced occupational therapy educators based on adult, blended, and flipped learning approaches, with content developed following administration of a standardized tool and semi-structured interviews. The learning package focused particularly on the logistics of supervision and the use of occupational therapy theory and evidence within supervision. The training for supervising therapists included a workshop and pre- and post-workshop learning activities. This collaborative research approach to designing and implementing a learning package, as well as the specific content of the ongoing education opportunities, could also be transferred to other services.
Theory of own mind in autism: Evidence of a specific deficit in self-awareness?
Williams, David
2010-09-01
Assuming that self-awareness is not a unitary phenomenon, and that one can be aware of different aspects of self at any one time, it follows that selective impairments in self-awareness can occur. This article explores the idea that autism involves a particular deficit in awareness of the 'psychological self', or 'theory of own mind'. This hypothesised deficit renders individuals with autism spectrum disorder (ASD) at least as impaired at recognising their own mental states as at recognising mental states in other people. This deficit, it is argued, stands in contrast to an apparently typical awareness of the 'physical self' amongst people with autism. Theoretical implications of the empirical evidence are discussed.
Directory of Open Access Journals (Sweden)
Francesca Zazzara
2013-11-01
On June 3rd 2013, in Turin, Italy, the Swiss industrialist Schmidheiny was sentenced to 18 years imprisonment for intentional disaster over 3,000 asbestos-linked tumours in Italian workers at the cement multinational Eternit. The indiscriminate use of asbestos, however, continues worldwide. Although many studies have shown that asbestos is associated with an increased risk of mortality and morbidity, denial theories were spread over time, showing how the logic of profit governs the production of asbestos. We examined, first, the history of the epidemiological evidence of asbestos-related risks and, second, the main sources of exposure in Italy and in the world: occupational, non-occupational, and post-disaster exposure (as occurred after the L'Aquila earthquake in April 2009). The theme of inequality and social justice is ever more alarming in the fight against asbestos and its lobbies.
Inhibitory processes and cognitive flexibility: evidence for the theory of attentional inertia
Directory of Open Access Journals (Sweden)
Isabel Introzzi
2015-07-01
Full Text Available The aim of this study was to discriminate the differential contribution of different inhibitory processes (perceptual, cognitive, and behavioral inhibition) to the switching cost effect associated with alternating cognitive tasks. A correlational design was used. Several experimental paradigms (e.g., stop signal, visual search, Sternberg's experimental paradigm, and the Simon paradigm) were adapted and included in a computerized program called TAC (Introzzi & Canet Juric, 2014) for the assessment of the different cognitive processes. The final sample consisted of 45 adults (18-50 years). Perceptual and behavioral inhibition show moderate and low correlations with attentional cost, cognitive inhibition shows no relation with flexibility, and only perceptual inhibition predicts switching cost effects, suggesting that different inhibitory processes contribute differentially to switch cost. This can be interpreted as evidence for the main argument of Attentional Inertia Theory, which postulates that inhibition plays an essential role in the ability to flexibly switch between tasks and/or representations.
Chemolli, Emanuela; Gagné, Marylène
2014-06-01
Self-determination theory (SDT) proposes a multidimensional conceptualization of motivation in which the different regulations are said to fall along a continuum of self-determination. The continuum has been used as a basis for using a relative autonomy index as a means to create motivational scores. Rasch analysis was used to verify the continuum structure of the Multidimensional Work Motivation Scale and of the Academic Motivation Scale. We discuss the concept of continuum against SDT's conceptualization of motivation and argue against the use of the relative autonomy index on the grounds that evidence for a continuum structure underlying the regulations is weak and because the index is statistically problematic. We suggest exploiting the full richness of SDT's multidimensional conceptualization of motivation through the use of alternative scoring methods when investigating motivational dynamics across life domains.
Kumar, S Santhosh; Shankaranarayanan, S
2017-11-17
In a bipartite set-up, the vacuum state of a free Bosonic scalar field is entangled in real space and satisfies the area law: entanglement entropy scales linearly with the area of the boundary between the two partitions. In this work, we show that the area law is violated in a two-spatial-dimensional model Hamiltonian with dynamical critical exponent z = 3. The model physically corresponds to next-to-next-to-next-nearest-neighbour coupling terms on a lattice. The result reported here is the first such violation of the area law in Bosonic systems in higher dimensions and signals evidence of a quantum phase transition. We provide evidence for the quantum phase transition both numerically and analytically using quantum information tools such as entanglement spectra, quantum fidelity, and the gap in the energy spectrum. We attribute the transition to the accumulation of a large number of angular zero modes around the critical point, which catalyses the change in the ground-state wave function due to the next-to-next-to-next-nearest-neighbour coupling. Lastly, using a Hubbard-Stratonovich transformation, we show that the effective Bosonic Hamiltonian can be obtained from an interacting fermionic theory, and we discuss possible implications for condensed matter systems.
Integrable models in 1+1 dimensional quantum field theory
International Nuclear Information System (INIS)
Faddeev, Ludvig.
1982-09-01
The goal of this lecture is to present a unifying view of the exactly soluble models. There are several reasons arguing in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering, and some 1+1 d models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relations (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPR is given, with promising generalizations to FCR.
A model of PCF in guarded type theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about element...... adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy...... of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally...
An introduction to queueing theory modeling and analysis in applications
Bhat, U Narayan
2015-01-01
This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on the identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications, with appropriate references for advanced topics. • Applications in manufacturing and in computer and communication systems. • A chapter on ...
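The steady-state measures of the simplest model treated in such a course, the M/M/1 queue, have closed forms and can be checked in a few lines (standard textbook formulas; the rates below are made-up illustrative numbers):

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with Poisson arrivals at
    rate lam and exponential service at rate mu (stable only if lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable queue: need lam < mu")
    rho = lam / mu                # server utilization
    L = rho / (1.0 - rho)         # mean number in system
    W = 1.0 / (mu - lam)          # mean time in system
    Lq = rho * rho / (1.0 - rho)  # mean queue length (excluding the job in service)
    Wq = Lq / lam                 # mean waiting time (Little's law on the queue)
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Illustrative rates: 2 arrivals and 5 service completions per minute.
m = mm1_metrics(lam=2.0, mu=5.0)
```

Little's law, L = lam * W, holds identically for these formulas and makes a convenient sanity check on any implementation.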
Adapting evidence-based interventions using a common theory, practices, and principles.
Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D
2014-01-01
Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.
A new non-specificity measure in evidence theory based on belief intervals
Institute of Scientific and Technical Information of China (English)
Yang Yi; Han Deqiang; Jean Dezert
2016-01-01
In the theory of belief functions, the measure of uncertainty is an important concept, used for representing types of uncertainty incorporated in bodies of evidence such as discord and non-specificity. For the non-specificity part, some traditional measures take as reference the Hartley measure from classical set theory; other traditional measures use a simple, heuristic function of the mass assignments and the cardinalities of the focal elements jointly. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision; it therefore has a more intuitive physical meaning. It can be proved that the new measure can be rewritten in a general form for non-specificity, and it is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses, and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
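The idea of measuring imprecision by belief-interval lengths can be sketched in a few lines. The averaging over singletons below is a simplified illustration in the spirit of interval-based non-specificity, not the paper's exact definition:

```python
def belief(mass, subset):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(m for focal, m in mass.items() if focal <= subset)

def plausibility(mass, subset):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(m for focal, m in mass.items() if focal & subset)

def mean_interval_length(mass, frame):
    """Average length of the belief intervals [Bel({x}), Pl({x})] over the
    singletons of the frame -- a toy imprecision score, not the paper's measure."""
    lengths = [plausibility(mass, frozenset([x])) - belief(mass, frozenset([x]))
               for x in frame]
    return sum(lengths) / len(lengths)

frame = {"a", "b", "c"}
# A body of evidence: some mass committed to singletons, the rest
# assigned to total ignorance (the whole frame).
mass = {frozenset(["a"]): 0.5,
        frozenset(["b"]): 0.2,
        frozenset(frame): 0.3}
score = mean_interval_length(mass, frame)
```

A vacuous mass function (all mass on the whole frame) maximizes this score, while a Bayesian one (all mass on singletons) drives it to zero, matching the intuition that interval length tracks imprecision.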
Location Decisions of U.S. Polluting Plants. Theory, Empirical Evidence, and Consequences
International Nuclear Information System (INIS)
Shadbegian, R.; Wolverton, A.
2010-01-01
Economists have long been interested in explaining the spatial distribution of economic activity, focusing on what factors motivate profit-maximizing firms when they choose to open a new plant or expand an existing facility. We begin our paper with a general discussion of the theory of plant location, including the role of taxes and agglomeration economies. However, our paper focuses on the theory, evidence, and implications of the role of environmental regulations in plant location decisions. On its face, environmental regulation would not necessarily be expected to alter location decisions, since we would expect Federal regulation to affect all locations in the United States essentially equally. It turns out, however, that this is not always the case, as some geographic areas are subject to greater stringency. Another source of variation is differences across states in the way they implement and enforce compliance with Federal regulation. In light of these spatial differences in the costs of complying with environmental regulations, we discuss three main questions in this survey: Do environmental regulations affect the location decisions of polluting plants? Do states compete for polluting plants through differences in environmental regulation? And, do firms locate polluting plants disproportionately near poor and minority neighborhoods?
Mothersill, Omar; Tangney, Noreen; Morris, Derek W; McCarthy, Hazel; Frodl, Thomas; Gill, Michael; Corvin, Aiden; Donohoe, Gary
2017-06-01
Resting-state functional magnetic resonance imaging (rs-fMRI) has repeatedly shown evidence of altered functional connectivity of large-scale networks in schizophrenia. The relationship between these connectivity changes and behaviour (e.g. symptoms, neuropsychological performance) remains unclear. Functional connectivity in 27 patients with schizophrenia or schizoaffective disorder and 25 age- and gender-matched healthy controls was examined using rs-fMRI. Based on seed regions from previous studies, we examined functional connectivity of the default, cognitive control, affective, and attention networks. Effects of symptom severity and theory of mind performance on functional connectivity were also examined. Patients showed increased connectivity between key nodes of the default network, including the precuneus and medial prefrontal cortex, compared to controls. Symptom severity and theory of mind performance were both associated with altered connectivity of default regions within the patient group. Extending these findings by examining the effects of emerging social cognition treatments on both default connectivity and theory of mind performance is now an important goal for research. Copyright © 2016 Elsevier B.V. All rights reserved.
Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping.
Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E
2014-04-01
Fear arousal, vividly showing people the negative health consequences of life-endangering behaviors, is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize the chances of effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. © 2013 The Authors. International Journal of Psychology published by John Wiley & Sons Ltd on behalf of the International Union of Psychological Science.
Poisson-Boltzmann theory of charged colloids: limits of the cell model for salty suspensions
International Nuclear Information System (INIS)
Denton, A R
2010-01-01
Thermodynamic properties of charge-stabilized colloidal suspensions and polyelectrolyte solutions are commonly modelled by implementing the mean-field Poisson-Boltzmann (PB) theory within a cell model. This approach models a bulk system by a single macroion, together with counterions and salt ions, confined to a symmetrically shaped, electroneutral cell. While easing numerical solution of the nonlinear PB equation, the cell model neglects microion-induced interactions and correlations between macroions, precluding modelling of macroion ordering phenomena. An alternative approach, which avoids the artificial constraints of cell geometry, exploits the mapping of a macroion-microion mixture onto a one-component model of pseudo-macroions governed by effective interparticle interactions. In practice, effective-interaction models are usually based on linear-screening approximations, which can accurately describe strong nonlinear screening only by incorporating an effective (renormalized) macroion charge. Combining charge renormalization and linearized PB theories, in both the cell model and an effective-interaction (cell-free) model, we compute osmotic pressures of highly charged colloids and monovalent microions, in Donnan equilibrium with a salt reservoir, over a range of concentrations. By comparing predictions with primitive model simulation data for salt-free suspensions, and with predictions from nonlinear PB theory for salty suspensions, we chart the limits of both the cell model and linear-screening approximations in modelling bulk thermodynamic properties. Up to moderately strong electrostatic couplings, the cell model proves accurate for predicting osmotic pressures of deionized (counterion-dominated) suspensions. With increasing salt concentration, however, the relative contribution of macroion interactions to the osmotic pressure grows, leading predictions from the cell and effective-interaction models to deviate. No evidence is found for a liquid
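The linear-screening effective-interaction picture described above is commonly realized as a screened-Coulomb (Yukawa) pair potential between pseudo-macroions. A minimal sketch of the standard DLVO-like form, in thermal (kT) units; the numerical parameters are illustrative assumptions, not values from the paper:

```python
import math

def inverse_debye_length(l_bjerrum, n_micro):
    """kappa = sqrt(4*pi*l_B*n), with n the total number density of
    monovalent microions (counterions plus salt ions)."""
    return math.sqrt(4.0 * math.pi * l_bjerrum * n_micro)

def yukawa_pair(r, z_eff, a, l_bjerrum, kappa):
    """Effective pair potential (in units of kT) between two macroions of
    radius a carrying renormalized charge z_eff, within linear screening:
    the standard screened-Coulomb (DLVO-like) form."""
    geom = math.exp(kappa * a) / (1.0 + kappa * a)
    return z_eff ** 2 * l_bjerrum * geom ** 2 * math.exp(-kappa * r) / r

# Illustrative numbers (lengths in nm, densities in nm^-3):
lB = 0.72                                  # Bjerrum length of water at room temperature
kappa = inverse_debye_length(lB, 1.2e-3)   # screening from all monovalent microions
u = yukawa_pair(r=120.0, z_eff=500, a=50.0, l_bjerrum=lB, kappa=kappa)
```

The role of charge renormalization in the abstract is visible here: strong nonlinear screening is absorbed into `z_eff` so that the linear (Yukawa) form remains usable.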
Traffic Games: Modeling Freeway Traffic with Game Theory.
Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R
2016-01-01
We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
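A minimal game-theoretic description of one driver-driver interaction of this kind is a chicken-like merging game. The payoffs below are hypothetical, and the equilibrium computation is a generic sketch rather than the paper's model:

```python
# Two drivers simultaneously decide whether to yield ("Y") or push ("P")
# at a merge. Payoffs are hypothetical time savings; the structure is a
# chicken-like game: mutual pushing is the worst outcome for both.
PAYOFF = {
    ("Y", "Y"): (2, 2),   # both yield: mild delay for both
    ("Y", "P"): (1, 3),   # yielder loses a little, pusher merges fast
    ("P", "Y"): (3, 1),
    ("P", "P"): (0, 0),   # both push: near-collision, everyone brakes
}

def best_response(opponent_action, player):
    """Best reply of `player` (0 = row, 1 = column) to a fixed opposing action."""
    def payoff(a):
        profile = (a, opponent_action) if player == 0 else (opponent_action, a)
        return PAYOFF[profile][player]
    return max(("Y", "P"), key=payoff)

def is_nash(profile):
    """A pure profile is a Nash equilibrium if each action best-replies to the other."""
    return (best_response(profile[1], 0) == profile[0]
            and best_response(profile[0], 1) == profile[1])

equilibria = [p for p in PAYOFF if is_nash(p)]
```

As in the chicken game, the pure equilibria are the two asymmetric profiles: exactly one driver pushes while the other yields.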
Comparison of potential models through heavy quark effective theory
International Nuclear Information System (INIS)
Amundson, J.F.
1995-01-01
I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations, in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions.
International Nuclear Information System (INIS)
Guendelman, E.
2004-01-01
Full Text: The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: (1) 4-D theories describing gravity and matter fields, (2) parametrization invariant theories of extended objects, and (3) higher dimensional theories including gravity and matter fields. In case (1), a large number of new effects appear: (i) spontaneous breaking of scale invariance associated with integration of degrees of freedom related to the measure; (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter; (iii) cosmic coincidence between dark energy and dark matter is natural; (iv) quintessence scenarios with automatic decoupling of the quintessence scalar from ordinary matter, but not from dark matter, are obtained. In case (2), for theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension, (ii) string models of non-Abelian confinement, and (iii) the possibility of new Weyl invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. In case (3), in brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to construct naturally models where only the extra dimensions get curved and the 4-D observable space-time remains flat.
Theory of positive disintegration as a model of adolescent development.
Laycraft, Krystyna
2011-01-01
This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, created almost half a century ago, still attracts the attention of psychologists and educators and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described as a process of transition from lower to higher levels of mental life, stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent the behaviour of the early, middle, and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.
Should Unemployment Insurance Vary with the Unemployment Rate? Theory and Evidence
Kroft, Kory; Notowidigdo, Matthew J.
2012-01-01
We study how optimal unemployment insurance (UI) benefits vary over the business cycle by estimating how the moral hazard cost and the consumption smoothing benefit of UI vary with the unemployment rate. We find that the moral hazard cost is procyclical, greater when the unemployment rate is relatively low. By contrast, our evidence suggests that the consumption smoothing benefit of UI is acyclical. Using these estimates to calibrate our job search model, we find that a one standard deviation...
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
Modelling machine ensembles with discrete event dynamical system theory
Hunter, Dan
1990-01-01
Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. A local model, from the perspective of DEDS theory, is described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models so that they can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
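The local-model description above maps directly onto a small data structure. The following sketch (a hypothetical robot-arm submachine, not from the original report) shows the five ingredients: states, an event alphabet, a partial transition function, an initial state, and event timings:

```python
class LocalModel:
    """A DEDS local model: states, an event alphabet, a partial transition
    function, an initial state, and a duration per event. Illustrative sketch."""

    def __init__(self, states, events, delta, initial, duration):
        self.states = states       # set of system states
        self.events = events       # event alphabet
        self.delta = delta         # partial map: (state, event) -> next state
        self.state = initial       # current system state
        self.duration = duration   # event -> time the event takes
        self.clock = 0.0           # accumulated time

    def fire(self, event):
        """Apply an event if it is enabled in the current state."""
        key = (self.state, event)
        if key not in self.delta:
            raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
        self.state = self.delta[key]
        self.clock += self.duration[event]
        return self.state

# A toy submachine: a robot arm that picks up and places a part.
arm = LocalModel(
    states={"idle", "holding"},
    events={"pick", "place"},
    delta={("idle", "pick"): "holding", ("holding", "place"): "idle"},
    initial="idle",
    duration={"pick": 2.0, "place": 1.5},
)
arm.fire("pick")
arm.fire("place")   # back to "idle"; clock has advanced by 3.5 time units
```

A global model would compose several such objects and restrict which events may fire concurrently, which is where the supervisory control described above enters.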
Theory, modeling, and integrated studies in the Arase (ERG) project
Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa
2018-02-01
Understanding the underlying mechanisms of the drastic variations of near-Earth space (geospace) is one of the current focuses of magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand geospace variations with a focus on relativistic electron acceleration and loss processes. In order to achieve this goal, the ERG project consists of three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of the theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project, together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, the GEMSIS-RB and RBW models, the CIMI model with the global MHD simulation REPPU, the GEMSIS-RC model, the plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced. [Figure not available: see full text.]
Spectral and scattering theory for translation invariant models in quantum field theory
DEFF Research Database (Denmark)
Rasmussen, Morten Grud
This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... by the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum where the fixed momentum fiber Hamiltonians are given by , where denotes total momentum and is the Segal field operator. The fiber Hamiltonians...
Cirafici, M.; Sinkovics, A.; Szabo, R.J.
2009-01-01
We study the relation between the Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques.
Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions
Directory of Open Access Journals (Sweden)
Camaren Peter
2014-03-01
Full Text Available In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex-systems properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach to modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address the particular properties of complex systems we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach to modeling sustainability transitions that caters for the broad range of complex-systems properties required to model such transitions.
Excellence in Physics Education Award: Modeling Theory for Physics Instruction
Hestenes, David
2014-03-01
All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.
A Model of Statistics Performance Based on Achievement Goal Theory.
Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.
2003-01-01
Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…
Anisotropic cosmological models and generalized scalar tensor theory
Indian Academy of Sciences (India)
In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...
Two-dimensional models in statistical mechanics and field theory
International Nuclear Information System (INIS)
Koberle, R.
1980-01-01
Several features of two-dimensional models in statistical mechanics and field theory, such as lattice quantum chromodynamics, Z(N), Gross-Neveu and CP(N-1), are discussed. The problems of confinement and dynamical mass generation are also analyzed. (L.C.)
The early years of string theory: The dual resonance model
International Nuclear Information System (INIS)
Ramond, P.
1987-10-01
This paper reviews the past quantum mechanical history of the dual resonance model which is an early string theory. The content of this paper is listed as follows: historical review, the Veneziano amplitude, the operator formalism, the ghost story, and the string story
Interacting bosons model and relation with BCS theory
International Nuclear Information System (INIS)
Diniz, R.
1990-01-01
The Nambu mechanism for BCS theory is extended with the inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of this model are discussed. (author)
Symmetry-guided large-scale shell-model theory
Czech Academy of Sciences Publication Activity Database
Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.
2016-01-01
Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords: Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016
The Five-Factor Model and Self-Determination Theory
DEFF Research Database (Denmark)
Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette
This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...
A Proposed Model of Jazz Theory Knowledge Acquisition
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
S matrix theory of the massive Thirring model
International Nuclear Information System (INIS)
Berg, B.
1980-01-01
The S matrix theory of the massive Thirring model, describing the exact quantum scattering of solitons and their bound states, is reviewed. Treated are: factorization equations and their solution, bound states, generalized Jost functions and Levinson's theorem, scattering of bound states, 'virtual' and anomalous thresholds. (orig.)
Using SAS PROC MCMC for Item Response Theory Models
Ames, Allison J.; Samonte, Kelli
2015-01-01
Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
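Outside of SAS, the Bayesian estimation workflow the article surveys can be sketched in a few lines. Below is a minimal random-walk Metropolis sampler for the item difficulties of a Rasch (1PL) model on simulated data; person abilities are fixed at their simulated values for brevity, and all settings (sample sizes, proposal scale, priors) are illustrative assumptions, not the article's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate Rasch data: P(correct) = logistic(theta_person - b_item)
n_persons, n_items = 200, 5
theta_true = rng.normal(0, 1, n_persons)
b_true = np.linspace(-1, 1, n_items)
logits = theta_true[:, None] - b_true[None, :]
y = (rng.random((n_persons, n_items)) < 1 / (1 + np.exp(-logits))).astype(float)

def log_post(b):
    """Log posterior of the item difficulties with a N(0,1) prior on each b;
    abilities are held at their simulated values to keep the sketch short."""
    lg = theta_true[:, None] - b[None, :]
    loglik = np.sum(y * lg - np.log1p(np.exp(lg)))  # Bernoulli-logit likelihood
    return loglik - 0.5 * np.sum(b**2)

# Random-walk Metropolis over the item-difficulty vector
b = np.zeros(n_items)
lp = log_post(b)
draws = []
for _ in range(3000):
    proposal = b + rng.normal(0, 0.08, n_items)
    lp_prop = log_post(proposal)
    if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
        b, lp = proposal, lp_prop
    draws.append(b.copy())

b_hat = np.mean(draws[1000:], axis=0)  # posterior means after burn-in
```

With enough persons, the posterior means of the difficulties track the simulated values; dedicated tools (WinBUGS, PROC MCMC, or modern probabilistic-programming libraries) add adaptive proposals and convergence diagnostics that this sketch omits.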
Multilevel Higher-Order Item Response Theory Models
Huang, Hung-Yu; Wang, Wen-Chung
2014-01-01
In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…
Item Response Theory Models for Performance Decline during Testing
Jin, Kuan-Yu; Wang, Wen-Chung
2014-01-01
Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…
Item Response Theory Modeling of the Philadelphia Naming Test
Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.
2015-01-01
Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…
An NCME Instructional Module on Polytomous Item Response Theory Models
Penfield, Randall David
2014-01-01
A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…
Profiles in Leadership: Enhancing Learning through Model and Theory Building.
Mello, Jeffrey A.
2003-01-01
A class assignment was designed to present factors affecting leadership dynamics, allow practice in model and theory building, and examine leadership from multicultural perspectives. Students developed a profile of a fictional or real leader and analyzed qualities, motivations, context, and effectiveness in written and oral presentations.…
Evaluating hydrological model performance using information theory-based metrics
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...
Stochastic models in risk theory and management accounting
Brekelmans, R.C.M.
2000-01-01
This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds on the probability of ruin for the classical risk process extended with a constant interest
Conformal field theories, Coulomb gas picture and integrable models
International Nuclear Information System (INIS)
Zuber, J.B.
1988-01-01
The aim of the study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, and modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear, and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: all the various restrictions have not been identified, nor the modular invariants completely classified
Route Choice Model Based on Game Theory for Commuters
Directory of Open Access Journals (Sweden)
Licai Yang
2016-06-01
Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers, but owing to a lack of timeliness and comprehensiveness the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory that provides reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by solving the route choice game for a Nash Equilibrium. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
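To make the game-theoretic framing concrete, here is a minimal sketch, not the paper's model: commuters are players, routes are strategies, travel time grows with congestion, and pure-strategy Nash equilibria are found by enumeration. The route names and cost functions are invented for illustration.

```python
from itertools import product

def route_cost(route, load):
    """Hypothetical linear travel-time functions: cost grows with the
    number of commuters (load) currently using the route."""
    base = {"highway": 10, "side": 15}
    slope = {"highway": 5, "side": 2}
    return base[route] + slope[route] * load

def pure_nash(routes, n_players):
    """Enumerate all strategy profiles and keep those where no player can
    lower her own cost by unilaterally switching to another route."""
    equilibria = []
    for profile in product(routes, repeat=n_players):
        loads = {r: profile.count(r) for r in routes}
        stable = True
        for i, r in enumerate(profile):
            cost_i = route_cost(r, loads[r])
            for alt in routes:
                # Switching adds this player to the alternative route's load
                if alt != r and route_cost(alt, loads[alt] + 1) < cost_i:
                    stable = False
        if stable:
            equilibria.append(profile)
    return equilibria
```

With two commuters and the costs above, the equilibria are the two profiles in which the commuters split across the routes; both crowding onto the faster highway is unstable because either player gains by switching.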
Incorporating Contagion in Portfolio Credit Risk Models Using Network Theory
Anagnostou, I.; Sourabh, S.; Kandhai, D.
2018-01-01
Portfolio credit risk models estimate the range of potential losses due to defaults or deteriorations in credit quality. Most of these models perceive default correlation as fully captured by the dependence on a set of common underlying risk factors. In light of empirical evidence, the ability of
Directory of Open Access Journals (Sweden)
Reeves Scott
2006-02-01
Full Text Available Abstract The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence shows the superiority of theory-based interventions, theory should not be used as a basis for assessing the value of implementation studies by research funders, ethics committees, editors or policy decision makers.
Modeling Composite Assessment Data Using Item Response Theory
Ueckert, Sebastian
2018-01-01
Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
Constitutive relationships and models in continuum theories of multiphase flows
International Nuclear Information System (INIS)
Decker, R.
1989-09-01
In April, 1989, a workshop on constitutive relationships and models in continuum theories of multiphase flows was held at NASA's Marshall Space Flight Center. Topics of constitutive relationships for the partial or per phase stresses, including the concept of solid phase pressure are discussed. Models used for the exchange of mass, momentum, and energy between the phases in a multiphase flow are also discussed. The program, abstracts, and texts of the presentations from the workshop are included
Perturbation theory around the Wess-Zumino-Witten model
International Nuclear Information System (INIS)
Hasseln, H. v.
1991-05-01
We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW model is shown to have the same β-function (at least to order g²) as the fermionic theory with a four-fermion interaction. (orig.)
Flipped classroom model for learning evidence-based medicine.
Rucker, Sydney Y; Ozdogan, Zulfukar; Al Achkar, Morhaf
2017-01-01
Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice.
Educational Program Evaluation Model, From the Perspective of the New Theories
Directory of Open Access Journals (Sweden)
Soleiman Ahmady
2014-05-01
Full Text Available Introduction: This study focuses on common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic and finally included 63 with full text. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. Common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.
Directory of Open Access Journals (Sweden)
Fuyuan Xiao
2017-10-01
Full Text Available The multi-sensor data fusion technique plays a significant role in fault diagnosis and a variety of related applications, and the Dempster–Shafer evidence theory is employed to improve system performance; however, it may generate a counter-intuitive result when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach based on the distance of evidence, belief entropy and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the conflict degree among the pieces of evidence; thus, the support degree can be obtained to represent the reliability of the evidence. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on the quantitative uncertainty measured above, fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted by taking advantage of the relative credibility preference of the evidence, which can be utilized to generate an appropriate weight for each piece of evidence. Finally, the modified weights of the evidence are adopted to adjust the bodies of evidence before applying Dempster's combination rule. A numerical example and a practical application in fault diagnosis are used as illustrations to demonstrate that the proposal is reasonable and efficient in conflict management and fault diagnosis.
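The two building blocks this abstract relies on, Dempster's combination rule and a belief entropy (Deng entropy is a common choice), are compact enough to sketch; the paper's full weighting scheme (evidence distance plus fuzzy preference relations) is omitted, and the frames and mass values below are illustrative.

```python
import math

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions given as dicts
    mapping frozenset focal elements to masses."""
    combined = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    # Normalize by 1 - K, redistributing the conflicting mass
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

def deng_entropy(m):
    """Deng's belief entropy: -sum m(A) * log2( m(A) / (2**|A| - 1) ).
    For singleton-only masses it reduces to Shannon entropy."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)
```

For example, combining m1 = {A: 0.6, B: 0.3, A∪B: 0.1} with m2 = {A: 0.5, B: 0.4, A∪B: 0.1} gives conflict K = 0.39 and a normalized combined mass m(A) = 0.41/0.61 ≈ 0.672.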
A general-model-space diagrammatic perturbation theory
International Nuclear Information System (INIS)
Hose, G.; Kaldor, U.
1980-01-01
A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He₂ excited Σ_g^+ states) is demonstrated. (Auth.)
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
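A well-known toy example from this literature is the "number game": given a few example numbers, infer which concept generated them. The sketch below uses an invented four-hypothesis space and the "size principle" likelihood (each consistent hypothesis scores (1/|h|)^n), which makes sharper hypotheses win once the data fit them.

```python
from fractions import Fraction

# Hypothesis space: each concept is the set of numbers in 1..100 it covers.
hypotheses = {
    "even":       {n for n in range(1, 101) if n % 2 == 0},
    "odd":        {n for n in range(1, 101) if n % 2 == 1},
    "square":     {n * n for n in range(1, 11)},
    "power_of_2": {2 ** k for k in range(1, 7)},  # 2, 4, 8, 16, 32, 64
}

def posterior(data, prior=None):
    """Bayesian concept posterior under the size principle:
    P(data | h) = (1/|h|)**len(data) if all data points lie in h, else 0.
    Uses exact rational arithmetic; prior defaults to uniform."""
    prior = prior or {h: Fraction(1, len(hypotheses)) for h in hypotheses}
    post = {}
    for name, h in hypotheses.items():
        if all(x in h for x in data):
            post[name] = prior[name] * Fraction(1, len(h)) ** len(data)
        else:
            post[name] = Fraction(0)
    z = sum(post.values())
    return {name: p / z for name, p in post.items()}
```

Given the data [16, 8, 2, 64], both "even" and "power_of_2" are consistent, but the much smaller power-of-two hypothesis dominates the posterior, mirroring the strong generalizations humans make from sparse positive examples.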
Fluid analog model for boundary effects in field theory
International Nuclear Information System (INIS)
Ford, L. H.; Svaiter, N. F.
2009-01-01
Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.
Finite-size scaling theory and quantum hamiltonian Field theory: the transverse Ising model
International Nuclear Information System (INIS)
Hamer, C.J.; Barber, M.N.
1979-01-01
Exact results for the mass gap, specific heat and susceptibility of the one-dimensional transverse Ising model on a finite lattice are generated by constructing a finite matrix representation of the Hamiltonian using strong-coupling eigenstates. The critical behaviour of the limiting infinite chain is analysed using finite-size scaling theory. In this way, excellent estimates (to within 1/2% accuracy) are found for the critical coupling and the exponents α, ν and γ
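The finite-matrix construction described above can be mimicked by brute-force exact diagonalization of short chains. A minimal NumPy sketch follows, using open boundary conditions and the full 2^N basis rather than the paper's strong-coupling eigenstate construction; couplings are illustrative.

```python
import numpy as np

def ising_hamiltonian(n, h, j=1.0):
    """Dense Hamiltonian of the open transverse-field Ising chain,
    H = -J sum_i sz_i sz_{i+1} - h sum_i sx_i,
    assembled from Kronecker products of Pauli matrices."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    eye = np.eye(2)

    def site_op(op, i):
        # Operator acting as `op` on site i and identity elsewhere
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, op if k == i else eye)
        return out

    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= j * site_op(sz, i) @ site_op(sz, i + 1)
    for i in range(n):
        H -= h * site_op(sx, i)
    return H

def mass_gap(n, h):
    """Energy difference between the two lowest eigenstates."""
    evals = np.linalg.eigvalsh(ising_hamiltonian(n, h))
    return evals[1] - evals[0]
```

Even at N = 6 the qualitative physics is visible: deep in the ordered phase (small h) the two lowest states are nearly degenerate and the gap is tiny, while in the paramagnetic phase (large h) the gap is of order the transverse field.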
A General Framework for Portfolio Theory. Part I: theory and various models
Maier-Paape, Stanislaus; Zhu, Qiji Jim
2017-01-01
Utility and risk are two often competing measurements on the investment success. We show that efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model, [W. F. Sharpe, Mutual fund performance , 1966] are spe...
Latent factor modeling of four schizotypy dimensions with theory of mind and empathy.
Directory of Open Access Journals (Sweden)
Jeffrey S Bedwell
Full Text Available Preliminary evidence suggests that theory of mind and empathy relate differentially to factors of schizotypy. The current study assessed 686 undergraduate students and used structural equation modeling to examine links between a four-factor model of schizotypy with performance on measures of theory of mind (Reading the Mind in the Eyes Test [MIE] and empathy (Interpersonal Reactivity Index [IRI]. Schizotypy was assessed using three self-report measures which were simultaneously entered into the model. Results revealed that the Negative factor of schizotypy showed a negative relationship with the Empathy factor, which was primarily driven by the Empathic Concern subscale of the IRI and the No Close Friends and Constricted Affect subscales of the Schizotypal Personality Questionnaire. These findings are consistent with a growing body of literature suggesting a relatively specific relationship between negative schizotypy and empathy, and are consistent with several previous studies that found no relationship between MIE performance and schizotypy.
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
sigma model approach to the heterotic string theory
International Nuclear Information System (INIS)
Sen, A.
1985-09-01
Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs
Integrable lambda models and Chern-Simons theories
International Nuclear Information System (INIS)
Schmidtt, David M.
2017-01-01
In this note we reveal a connection between the phase space of lambda models on S¹×ℝ and the phase space of double Chern-Simons theories on D×ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅×S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.
Models for probability and statistical inference theory and applications
Stapleton, James H
2007-01-01
This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...
Classical nucleation theory in the phase-field crystal model.
Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas
2018-04-01
A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
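For reference, the CNT predictions mentioned above follow from the competition between surface and bulk terms in the free energy of a spherical nucleus, ΔG(r) = 4πσr² − (4/3)πΔg·r³, where σ is the surface energy and Δg the bulk driving force per unit volume. A brief sketch with illustrative parameter values:

```python
import math

def cnt_barrier(sigma, dg):
    """Critical radius and barrier height for a spherical nucleus:
    dG(r) = 4*pi*sigma*r**2 - (4/3)*pi*dg*r**3 is maximized at
    r* = 2*sigma/dg, giving dG(r*) = 16*pi*sigma**3 / (3*dg**2)."""
    r_star = 2.0 * sigma / dg
    g_star = 16.0 * math.pi * sigma**3 / (3.0 * dg**2)
    return r_star, g_star

def nucleation_rate(j0, g_star, kt):
    """Arrhenius-form CNT rate J = J0 * exp(-G*/kT); the prefactor J0
    bundles attachment kinetics and the Zeldovich factor."""
    return j0 * math.exp(-g_star / kt)
```

Comparing measured PFC nucleation rates against this exponential dependence on the barrier is the kind of qualitative check the abstract describes; the disagreement it reports concerns CNT's structural assumptions, not this algebra.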
Matrix models and stochastic growth in Donaldson-Thomas theory
Energy Technology Data Exchange (ETDEWEB)
Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)
2012-10-15
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Forewarning model for water pollution risk based on Bayes theory.
Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis
2014-02-01
In order to reduce losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. Principal components analysis is used to screen the index systems, and a hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and local conditions are then considered when proposing management strategies intended to reduce heavy warnings to a lesser degree. The study takes Taihu Basin as an example. After application and verification of the forewarning model against actual and simulated water pollution risk data from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. The model is shown to be theoretically rigorous yet methodologically flexible, with reasonable results and a simple structure, and to have strong logical advantages and regional adaptability, providing a new way to warn of water pollution risk.
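The Bayesian core of such a forewarning scheme can be sketched in a few lines: a posterior over warning levels is computed from a prior and the likelihood of the observed index values, and the forewarning level is taken as the maximum-posterior level. All priors, likelihoods, and level names below are hypothetical illustrations, not values from the paper.

```python
def posterior(prior, likelihood):
    """Posterior over warning levels via Bayes' rule:
    p(level | data) ∝ p(level) * p(data | level)."""
    unnorm = {lvl: prior[lvl] * likelihood[lvl] for lvl in prior}
    z = sum(unnorm.values())
    return {lvl: p / z for lvl, p in unnorm.items()}

# Hypothetical prior over warning levels and likelihood of the
# simulated index values under each level.
prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}
lik = {"light": 0.1, "moderate": 0.3, "severe": 0.9}

post = posterior(prior, lik)
# Maximum-probability rule: the forewarning level is the argmax.
level = max(post, key=post.get)
```

With these illustrative numbers the severe level dominates the posterior even though its prior is lowest, which is the intended behavior of the maximum-probability rule.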
Soliton excitations in polyacetylene and relativistic field theory models
International Nuclear Information System (INIS)
Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM
1982-01-01
A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin 1/2 and charge ±e. In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological φ⁴ and continuum electron-phonon models, and the structure of the fully quantum versus mean-field theories. (orig.)
Classical nucleation theory in the phase-field crystal model
Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas
2018-04-01
A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
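For reference, the classical nucleation theory against which such results are compared predicts a steady-state nucleation rate of Arrhenius form (these are the standard textbook CNT expressions, not formulas taken from the paper):

```latex
J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
\qquad
\Delta G^{*} = \frac{16\pi\,\gamma^{3}}{3\,\Delta g_v^{2}},
```

where $J_0$ is a kinetic prefactor, $\gamma$ is the liquid-solid interfacial free energy, and $\Delta g_v$ is the bulk free-energy difference per unit volume between the two phases; $\Delta G^{*}$ is the free-energy barrier at the critical nucleus size.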
Physics of human cooperation: experimental evidence and theoretical models
Sánchez, Angel
2018-02-01
In recent years, many physicists have used evolutionary game theory combined with a complex systems perspective in an attempt to understand social phenomena and challenges. Prominent among such phenomena is the issue of the emergence and sustainability of cooperation in a networked world of selfish or self-focused individuals. The vast majority of research done by physicists on these questions is theoretical, and is almost always posed in terms of agent-based models. Unfortunately, more often than not such models ignore a number of facts that are well established experimentally, and are thus rendered irrelevant to actual social applications. I here summarize some of the facts that any realistic model should incorporate and take into account, discuss important aspects underlying the relation between theory and experiments, and discuss future directions for research based on the available experimental knowledge.
Directory of Open Access Journals (Sweden)
Carol A. Gordon
2009-09-01
Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action
Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks
International Nuclear Information System (INIS)
Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.
2005-01-01
Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)
Grassmann phase space theory and the Jaynes–Cummings model
International Nuclear Information System (INIS)
Dalton, B.J.; Garraway, B.M.; Jeffers, J.; Barnett, S.M.
2013-01-01
The Jaynes–Cummings model of a two-level atom in a single mode cavity is of fundamental importance both in quantum optics and in quantum physics generally, involving the interaction of two simple quantum systems—one fermionic system (the TLA), the other bosonic (the cavity mode). Depending on the initial conditions a variety of interesting effects occur, ranging from ongoing oscillations of the atomic population difference at the Rabi frequency when the atom is excited and the cavity is in an n-photon Fock state, to collapses and revivals of these oscillations starting with the atom unexcited and the cavity mode in a coherent state. The observation of revivals for Rydberg atoms in a high-Q microwave cavity is key experimental evidence for quantisation of the EM field. Theoretical treatments of the Jaynes–Cummings model based on expanding the state vector in terms of products of atomic and n-photon states and deriving coupled equations for the amplitudes are a well-known and simple method for determining the effects. In quantum optics however, the behaviour of the bosonic quantum EM field is often treated using phase space methods, where the bosonic mode annihilation and creation operators are represented by c-number phase space variables, with the density operator represented by a distribution function of these variables. Fokker–Planck equations for the distribution function are obtained, and either used directly to determine quantities of experimental interest or used to develop c-number Langevin equations for stochastic versions of the phase space variables from which experimental quantities are obtained as stochastic averages. Phase space methods have also been developed to include atomic systems, with the atomic spin operators being represented by c-number phase space variables, and distribution functions involving these variables and those for any bosonic modes being shown to satisfy Fokker–Planck equations from which c-number Langevin equations are
From 6D superconformal field theories to dynamic gauged linear sigma models
Apruzzi, Fabio; Hassler, Falk; Heckman, Jonathan J.; Melnikov, Ilarion V.
2017-09-01
Compactifications of six-dimensional (6D) superconformal field theories (SCFTs) on four-manifolds generate a large class of novel two-dimensional (2D) quantum field theories. We consider in detail the case of the rank-one simple non-Higgsable cluster 6D SCFTs. On the tensor branch of these theories, the gauge group is simple and there are no matter fields. For compactifications on suitably chosen Kähler surfaces, we present evidence that this provides a method to realize 2D SCFTs with N = (0,2) supersymmetry. In particular, we find that reduction on the tensor branch of the 6D SCFT yields a description of the same 2D fixed point that is described in the UV by a gauged linear sigma model (GLSM) in which the parameters are promoted to dynamical fields, that is, a "dynamic GLSM" (DGLSM). Consistency of the model requires the DGLSM to be coupled to additional non-Lagrangian sectors obtained from reduction of the antichiral two-form of the 6D theory. These extra sectors include both chiral and antichiral currents, as well as spacetime filling noncritical strings of the 6D theory. For each candidate 2D SCFT, we also extract the left- and right-moving central charges in terms of data of the 6D SCFT and the compactification manifold.
Mulherin, Katrina; Walter, Sheila; Cox, Craig D
2018-03-01
Priority #3 of the Canadian Experiential Education Project for Pharmacy provided evidence-based guidance for the design and implementation of a national approach to preceptor development. In this first article (of three), findings from the project and recommendations to achieve a high-quality preceptor development program (PDP) are presented. A multi-method approach including detailed semi-structured interviews, classic literature review, and advisory committee feedback was employed. The research team performed an integrated analysis of all data to achieve the objectives of Priority #3. Fifteen formal interviews, 167 articles and two stakeholder meetings informed findings. Experiential Education programs exhibited commonality in content and usually delivered programs online using modules or live lectures. Not all programs required preceptor education despite it being mandated by academic accreditors. Academics' perceptions varied regarding pharmacists' baseline knowledge, skills and attitudes prior to engaging in the preceptor role. A national approach to a PDP was desired if jurisdictional content was accommodated. Copious interprofessional literature of generally fair quality did not identify superior preceptor development approaches although there were numerous descriptions of interventions. Only 29 articles measured educational outcomes. Outcomes included satisfaction rates, self-efficacy and perceived knowledge, skill retention, skill implementation and participation rates. Twelve recommendations were identified to guide successful development of a national PDP. In the absence of good evidence, adult educational theory provided a basis for an effective PDP. Findings from Priority #3 may be relevant not only to pharmacy in Canada but other health professions and counterparts in other western nations with similar approaches to professional education. Copyright © 2017 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Jun Zhan
2017-12-01
Full Text Available Unlike cognitive-regulation strategies, which rely heavily on the top-down control function of the prefrontal cortex (PFC) and which were recently found to be critically impaired in stressful situations, traditional Chinese philosophy and medicine views different types of emotionality as having mutual promotion and counteraction (MPMC) relationships, implying a novel approach to emotional regulation that requires less cognition. Our previous studies indicated that anger responses could be successfully regulated via the induction of sadness, and that this efficiency was not influenced by stress, providing evidence for the hypothesis of “sadness counteracts anger” (SCA) proposed by the MPMC theory of emotionality (Zhan et al., 2015, 2017). In this study, we experimentally examined the MPMC hypothesis of “anger counteracts rumination” (ACR), which postulates that rumination may be alleviated by anger. In Study 1, all participants first underwent induction of state rumination and then of anger, joy, or a neutral mood; the results showed that rumination-related affect was alleviated more after anger induction than after joy or neutral-mood induction. In Study 2, female participants with high trait rumination were recruited and divided into two groups for exposure to an anger or a neutral-emotion intervention; the anger intervention group exhibited a greater decline in trait rumination than the neutral-emotion group. These findings provide preliminary evidence for the ACR hypothesis, suggesting a new strategy that employs fewer cognitive resources to regulate state and trait rumination by inducing anger.
Hypersurface Homogeneous Cosmological Model in Modified Theory of Gravitation
Katore, S. D.; Hatkar, S. P.; Baxi, R. J.
2016-12-01
We study a hypersurface homogeneous space-time in the framework of the f (R, T) theory of gravitation in the presence of a perfect fluid. Exact solutions of field equations are obtained for exponential and power law volumetric expansions. We also solve the field equations by assuming the proportionality relation between the shear scalar (σ ) and the expansion scalar (θ ). It is observed that in the exponential model, the universe approaches isotropy at large time (late universe). The investigated model is notably accelerating and expanding. The physical and geometrical properties of the investigated model are also discussed.
Categories of relations as models of quantum theory
Directory of Open Access Journals (Sweden)
Chris Heunen
2015-11-01
Full Text Available Categories of relations over a regular category form a family of models of quantum theory. Using regular logic, many properties of relations over sets lift to these models, including the correspondence between Frobenius structures and internal groupoids. Over compact Hausdorff spaces, this lifting gives continuous symmetric encryption. Over a regular Mal'cev category, this correspondence gives a characterization of categories of completely positive maps, enabling the formulation of quantum features. These models are closer to Hilbert spaces than relations over sets in several respects: Heisenberg uncertainty, impossibility of broadcasting, and behavedness of rank one morphisms.
Massive mu pair production in a vector field theory model
Halliday, I G
1976-01-01
Massive electrodynamics is treated as a model for the production of massive mu pairs in high-energy hadronic collisions. The dominant diagrams in perturbation theory are identified and analyzed. These graphs have an eikonal structure which leads to enormous cancellations in the two-particle inclusive cross section but not in the n-particle production cross sections. Under the assumption that these cancellations are complete, a Drell-Yan structure appears in the inclusive cross section but the particles accompanying the mu pairs have a very different structure compared to the parton model. The pionization region is no longer empty of particles as in single parton models. (10 refs).
Supersymmetric sigma models and composite Yang-Mills theory
International Nuclear Information System (INIS)
Lukierski, J.
1980-04-01
We describe two types of supersymmetric sigma models: with field values in supercoset space and with superfields. The notion of a Riemannian symmetric pair (H, G/H) is generalized to supergroups. Using the supercoset approach, the superconformal-invariant model of composite U(n) Yang-Mills fields is introduced. In the framework of the superfield approach we present in some detail two versions of the composite N=1 supersymmetric Yang-Mills theory in four dimensions with U(n) and U(m) x U(n) local invariance. We argue that especially the superfield sigma models can be used for the description of pre-QCD supersymmetric dynamics. (author)
Approximate models for broken clouds in stochastic radiative transfer theory
International Nuclear Information System (INIS)
Doicu, Adrian; Efremenko, Dmitry S.; Loyola, Diego; Trautmann, Thomas
2014-01-01
This paper presents approximate models in stochastic radiative transfer theory. The independent column approximation and its modified version with a solar source computed in a full three-dimensional atmosphere are formulated in a stochastic framework and for arbitrary cloud statistics. The nth-order stochastic models describing the independent column approximations are equivalent to the nth-order stochastic models for the original radiance fields in which the gradient vectors are neglected. Fast approximate models are further derived on the basis of zeroth-order stochastic models and the independent column approximation. The so-called “internal mixing” models assume a combination of the optical properties of the cloud and the clear sky, while the “external mixing” models assume a combination of the radiances corresponding to completely overcast and clear skies. A consistent treatment of internal and external mixing models is provided, and a new parameterization of the closure coefficient in the effective thickness approximation is given. An efficient computation of the closure coefficient for internal mixing models, using a previously derived vector stochastic model as a reference, is also presented. Equipped with appropriate look-up tables for the closure coefficient, these models can easily be integrated into operational trace gas retrieval systems that exploit absorption features in the near-IR solar spectrum. - Highlights: • Independent column approximation in a stochastic setting. • Fast internal and external mixing models for total and diffuse radiances. • Efficient optimization of internal mixing models to match reference models
Kim, Seonah; Robichaud, David J; Beckham, Gregg T; Paton, Robert S; Nimlos, Mark R
2015-04-16
Dehydration over acidic zeolites is an important reaction class for the upgrading of biomass pyrolysis vapors to hydrocarbon fuels or to precursors for myriad chemical products. Here, we examine the dehydration of ethanol at a Brønsted acid site, T12, found in HZSM-5 using density functional theory (DFT). The geometries of both cluster and mixed quantum mechanics/molecular mechanics (QM:MM) models are prepared from the ZSM-5 crystal structure. Comparisons between these models and different DFT methods are conducted to show similar results among the models and methods used. Inclusion of the full catalyst cavity through a QM:MM approach is found to be important, since activation barriers are computed on average as 7 kcal mol(-1) lower than those obtained with a smaller cluster model. Two different pathways, concerted and stepwise, have been considered when examining dehydration and deprotonation steps. The current study shows that a concerted dehydration process is possible with a lower (4-5 kcal mol(-1)) activation barrier while previous literature studies have focused on a stepwise mechanism. Overall, this work demonstrates that fairly high activation energies (∼50 kcal mol(-1)) are required for ethanol dehydration. A concerted mechanism is favored over a stepwise mechanism because charge separation in the transition state is minimized. QM:MM approaches appear to provide superior results to cluster calculations due to a more accurate representation of charges on framework oxygen atoms.
AIC, BIC, Bayesian evidence against the interacting dark energy model
Energy Technology Data Exchange (ETDEWEB)
Szydlowski, Marek [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Krawiec, Adam [Jagiellonian University, Institute of Economics, Finance and Management, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Kurek, Aleksandra [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Kamionka, Michal [University of Wroclaw, Astronomical Institute, Wroclaw (Poland)
2015-01-01
Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative - the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam's razor we are inclined to reject this model. (orig.)
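Both information criteria are computed directly from a model's maximized log-likelihood, its number of free parameters, and the sample size. The sketch below illustrates the comparison; the log-likelihood values, parameter counts, and sample size are hypothetical, not the paper's fit results.

```python
import math

def aic(log_l, k):
    # Akaike information criterion: AIC = 2k - 2 ln L
    return 2 * k - 2 * log_l

def bic(log_l, k, n):
    # Bayesian information criterion: BIC = k ln n - 2 ln L
    return k * math.log(n) - 2 * log_l

# Hypothetical fits: a baseline model with k=2 free parameters vs an
# extended (interacting) model with k=3, on n = 580 data points.
n = 580
aic_base, bic_base = aic(-272.5, 2), bic(-272.5, 2, n)
aic_ext, bic_ext = aic(-272.1, 3), bic(-272.1, 3, n)

# Positive differences favor the simpler model.
delta_aic = aic_ext - aic_base
delta_bic = bic_ext - bic_base
```

Because BIC's per-parameter penalty is ln n rather than 2, it punishes the extra parameter more severely whenever n > e², which is why the abstract reports stronger evidence against the interacting model under BIC than under AIC.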
Extensions to a nonlinear finite-element axisymmetric shell model based on Reissner's shell theory
International Nuclear Information System (INIS)
Cook, W.A.
1981-01-01
Extensions to shell analysis not usually associated with shell theory are described in this paper. These extensions involve thick shells, nonlinear materials, a linear normal stress approximation, and a changing shell thickness. A finite element shell-of-revolution model has been developed to analyze nuclear material shipping containers under severe impact conditions. To establish the limits for this shell model, the basic assumptions used in its development were studied; these are listed in this paper. Several extensions were evident from the study of these limits: a thick shell, a plastic hinge, and a linear normal stress
Sensor Data Fusion for Accurate Cloud Presence Prediction Using Dempster-Shafer Evidence Theory
Directory of Open Access Journals (Sweden)
Jesse S. Jin
2010-10-01
Full Text Available Sensor data fusion technology can be used to extract the most useful information from multiple sensor observations. It has been widely applied in applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach for a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. Potential application areas of the algorithm include renewable power for virtual power stations, where predicting cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data recorded as the benchmark. Our experiments indicate that, compared with approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
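Dempster's rule of combination, the core operation in fusion approaches like the one above, can be sketched as follows. The frame of discernment and the mass values are hypothetical examples, not the paper's sensor data.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions by Dempster's rule.
    Keys are frozenset focal elements; values of each input sum to 1.
    Mass on empty intersections is the conflict, renormalized away."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two radiation sensors reporting on the frame {cloud, clear}
# (illustrative masses, not the paper's data).
C, L = frozenset({"cloud"}), frozenset({"clear"})
theta = C | L                       # total ignorance
m_sensor1 = {C: 0.7, L: 0.1, theta: 0.2}
m_sensor2 = {C: 0.6, L: 0.2, theta: 0.2}
fused = dempster_combine(m_sensor1, m_sensor2)
```

With these masses the fused belief in cloud rises to 0.85, above either individual sensor's mass, which is the effect the fusion approach exploits.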
The Function of Gas Vesicles in Halophilic Archaea and Bacteria: Theories and Experimental Evidence
Oren, Aharon
2012-01-01
A few extremely halophilic Archaea (Halobacterium salinarum, Haloquadratum walsbyi, Haloferax mediterranei, Halorubrum vacuolatum, Halogeometricum borinquense, Haloplanus spp.) possess gas vesicles that bestow buoyancy on the cells. Gas vesicles are also produced by the anaerobic endospore-forming halophilic Bacteria Sporohalobacter lortetii and Orenia sivashensis. We have extensive information on the properties of gas vesicles in Hbt. salinarum and Hfx. mediterranei and the regulation of their formation. Different functions have been suggested for gas vesicle synthesis: buoying cells towards oxygen-rich surface layers in hypersaline water bodies to prevent oxygen limitation, reaching higher light intensities for the light-driven proton pump bacteriorhodopsin, positioning the cells optimally for light absorption, light shielding, reducing the cytoplasmic volume leading to a higher surface-area-to-volume ratio (for the Archaea), and dispersal of endospores (for the anaerobic spore-forming Bacteria). Except for Hqr. walsbyi, which abounds in saltern crystallizer brines, gas-vacuolate halophiles are not among the dominant life forms in hypersaline environments. There has been little research on gas vesicles in natural communities of halophilic microorganisms, and the few existing studies have failed to provide clear evidence for their possible function. This paper summarizes the current status of the different theories of why gas vesicles may provide a selective advantage to some halophilic microorganisms. PMID:25371329
Theory and evidence of economies of scale in the development of waste management systems
International Nuclear Information System (INIS)
Chang, Shoou-Yuh; Rivera, A.L.
1989-01-01
Waste is a cost of doing business. This cost can be considered in terms of the potential adverse health and environmental impacts, or the waste management costs associated with avoiding, minimizing, and controlling those impacts. There is an anticipated increase in the cost of waste management as a result of the increasing requirements for regulatory compliance. To meet the total waste management capacity needs of the organization and the compliance requirements, low-level radioactive, hazardous, and mixed waste management will need demonstrated technologies strategically managed as a technology portfolio. The role of the decision maker is to select the optimum mix of technologies and facilities to provide the waste management capacity needed for the next twenty years. The waste management system resulting from this mix includes multiple small-scale fixed facilities, large-scale centralized facilities, and waste management subcontracts. This study was conducted to examine the theory and evidence of economies of scale in the development of waste management systems, as exploratory research on the economic considerations in the process of technology selection and implementation. 25 refs., 24 figs., 11 tabs
Oliveira, Arnaldo
2007-01-01
This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.
Models, Mechanisms and Moderators Dissociating Empathy and Theory of Mind.
Kanske, Philipp; Böckler, Anne; Singer, Tania
Most instances of social interaction provide a wealth of information about the states of other people, be it sensations, feelings, thoughts, or convictions. How we represent these states has been a major question in social neuroscience, leading to the identification of two routes to understanding others: an affective route for the direct sharing of others' emotions (empathy) that involves, among others, anterior insula and middle anterior cingulate cortex and a cognitive route for representing and reasoning about others' states (Theory of Mind) that entails, among others, ventral temporoparietal junction and anterior and posterior midline regions. Additionally, research has revealed a number of situational and personal factors that shape the functioning of empathy and Theory of Mind. Concerning situational modulators, it has been shown, for instance, that ingroup membership enhances empathic responding and that Theory of Mind performance seems to be susceptible to stress. Personal modulators include psychopathological conditions, for which alterations in empathy and mentalizing have consistently been demonstrated; people on the autism spectrum, for instance, are impaired specifically in mentalizing, while spontaneous empathic responding seems selectively reduced in psychopathy. Given the multifaceted evidence for separability of the two routes, current research endeavors aiming at fostering interpersonal cooperation explore the differential malleability of affective and cognitive understanding of others.
Nonlinear structural mechanics theory, dynamical phenomena and modeling
Lacarbonara, Walter
2013-01-01
Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...
Profile-likelihood Confidence Intervals in Item Response Theory Models.
Chalmers, R Philip; Pek, Jolynn; Liu, Yang
2017-01-01
Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
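The inversion idea behind profile-likelihood CIs can be illustrated outside the IRT setting with a one-parameter Poisson model: the interval collects all rate values whose likelihood-ratio statistic stays below the chi-square cutoff. The counts and grid below are hypothetical; a real IRT application would profile out the remaining item and person parameters at each grid point.

```python
import math

def pl_ci_poisson(counts, chi2_cutoff=3.841):
    """Profile-likelihood CI for a Poisson rate lambda: invert the
    likelihood-ratio test 2*(logL(mle) - logL(lam)) <= chi2_{1,0.95}."""
    n, s = len(counts), sum(counts)
    mle = s / n
    def loglik(lam):
        return s * math.log(lam) - n * lam   # constant terms dropped
    def deviance(lam):
        return 2.0 * (loglik(mle) - loglik(lam))
    # scan a grid around the MLE for points where the deviance
    # stays below the chi-square cutoff
    grid = [mle * (0.2 + 0.001 * i) for i in range(1, 3000)]
    inside = [lam for lam in grid if deviance(lam) <= chi2_cutoff]
    return min(inside), max(inside)

counts = [3, 5, 4, 6, 2, 4, 5, 3]   # hypothetical count data, MLE = 4.0
lo, hi = pl_ci_poisson(counts)
```

Note the resulting interval is asymmetric around the MLE, which is exactly the behavior that distinguishes PL CIs from symmetric Wald-type CIs.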
The QCD model of hadron cores of the meson theory
International Nuclear Information System (INIS)
Pokrovskii, Y.E.
1985-01-01
It is shown that in the previously proposed QCD model of hadron cores, the exchange and self-energy contributions of the virtual quark-antiquark-gluon cloud outside a bag whose radius coincides with the hadron core radius of the meson theory (∼ 0.4 Fm) are taken into account at the phenomenological level. Simulating this cloud by the meson field results in realistic estimates of the nucleon's electroweak properties, the momentum fractions carried by gluons, quarks and antiquarks, and hadron-hadron interaction cross-sections within a wide range of energies. The authors note that the QCD hadron core model proposed earlier not only realistically reproduces the hadron masses, but also self-consistently reflects the main elements of the structure and interaction of hadrons at a quark-gluon bag radius (R ∼ 0.4 Fm) close to the meson theory core radius
Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun
2017-07-28
Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we propose IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps from the FFT (Fast Fourier Transformation) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via a distance matrix from the evidence and a modified Gini Index. Extensive evaluations of IDSCNN on the Case Western Reserve Dataset showed that our algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidence from different models and sensors and adapting to different load conditions.
Cahill, James A; Green, Richard E; Fulton, Tara L; Stiller, Mathias; Jay, Flora; Ovsyanikov, Nikita; Salamzade, Rauf; St John, John; Stirling, Ian; Slatkin, Montgomery; Shapiro, Beth
2013-01-01
Despite extensive genetic analysis, the evolutionary relationship between polar bears (Ursus maritimus) and brown bears (U. arctos) remains unclear. The two most recent comprehensive reports indicate a recent divergence with little subsequent admixture or a much more ancient divergence followed by extensive admixture. At the center of this controversy are the Alaskan ABC Islands brown bears that show evidence of shared ancestry with polar bears. We present an analysis of genome-wide sequence data for seven polar bears, one ABC Islands brown bear, one mainland Alaskan brown bear, and a black bear (U. americanus), plus recently published datasets from other bears. Surprisingly, we find clear evidence for gene flow from polar bears into ABC Islands brown bears but no evidence of gene flow from brown bears into polar bears. Importantly, while polar bears contributed <1% of the autosomal genome of the ABC Islands brown bear, they contributed 6.5% of the X chromosome. The magnitude of sex-biased polar bear ancestry and the clear direction of gene flow suggest a model wherein the enigmatic ABC Islands brown bears are the descendants of a polar bear population that was gradually converted into brown bears via male-dominated brown bear admixture. We present a model that reconciles heretofore conflicting genetic observations. We posit that the enigmatic ABC Islands brown bears derive from a population of polar bears likely stranded by the receding ice at the end of the last glacial period. Since then, male brown bear migration onto the island has gradually converted these bears into an admixed population whose phenotype and genotype are principally brown bear, except at mtDNA and X-linked loci. This process of genome erosion and conversion may be a common outcome when climate change or other forces cause a population to become isolated and then overrun by species with which it can hybridize.
SIMP model at NNLO in chiral perturbation theory
DEFF Research Database (Denmark)
Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.
2015-01-01
We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles...... with phenomenological constraints challenging the viability of the simplest realisation of the strongly interacting massive particle (SIMP) paradigm....
Regression modeling methods, theory, and computation with SAS
Panik, Michael
2009-01-01
Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,
A model theory for tachyons in two dimensions
International Nuclear Information System (INIS)
Recami, E.; Rodrigues, W.A.
1985-01-01
The paper is divided in two parts, the first one having nothing to do with tachyons. In fact, to prepare the ground, part one (sect. 2) shows that special relativity, even without tachyons, can be given a form that describes both particles and antiparticles. Part two is confined to a model theory in two dimensions, for the reasons stated in sect. 3
A realistic model for quantum theory with a locality property
International Nuclear Information System (INIS)
Eberhard, P.H.
1987-04-01
A model reproducing the predictions of relativistic quantum theory to any desired degree of accuracy is described in this paper. It involves quantities that are independent of the observer's knowledge, and therefore can be called real, and which are defined at each point in space, and therefore can be called local in a rudimentary sense. It involves faster-than-light, but not instantaneous, action at a distance
Theory, Modeling and Simulation Annual Report 2000; FINAL
International Nuclear Information System (INIS)
Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A
2001-01-01
This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems
Properties of lattice gauge theory models at low temperatures
International Nuclear Information System (INIS)
Mack, G.
1980-01-01
The Z(N) theory of quark confinement is discussed, along with how fluctuations of Z(N) gauge fields may continue to be important in the continuum limit. A model in four dimensions is pointed out in which confinement of (scalar) quarks can be shown to persist in the continuum limit. This article is based on the author's Cargese lectures 1979. Some of its results are published here for the first time. (orig.)
Field theory of large amplitude collective motion. A schematic model
International Nuclear Information System (INIS)
Reinhardt, H.
1978-01-01
By using path integral methods the equation for large amplitude collective motion for a schematic two-level model is derived. The original fermion theory is reformulated in terms of a collective (Bose) field. The classical equation of motion for the collective field coincides with the time-dependent Hartree-Fock equation. Its classical solution is quantized by means of the field-theoretical generalization of the WKB method. (author)
Stability Analysis for Car Following Model Based on Control Theory
International Nuclear Information System (INIS)
Meng Xiang-Pei; Li Zhi-Peng; Ge Hong-Xia
2014-01-01
Stability analysis is one of the key issues in car-following theory. The stability analysis with Lyapunov function for the two velocity difference car-following model (for short, TVDM) is conducted and the control method to suppress traffic congestion is introduced. Numerical simulations are given and results are consistent with the theoretical analysis.
Analytical theory of Doppler reflectometry in slab plasma model
Energy Technology Data Exchange (ETDEWEB)
Gusakov, E.Z.; Surkov, A.V. [Ioffe Institute, Politekhnicheskaya 26, St. Petersburg (Russian Federation)
2004-07-01
Doppler reflectometry is considered in slab plasma model in the frameworks of analytical theory. The diagnostics locality is analyzed for both regimes: linear and nonlinear in turbulence amplitude. The toroidal antenna focusing of probing beam to the cut-off is proposed and discussed as a method to increase diagnostics spatial resolution. It is shown that even in the case of nonlinear regime of multiple scattering, the diagnostics can be used for an estimation (with certain accuracy) of plasma poloidal rotation profile. (authors)
Spherically symmetric star model in the gravitational gauge theory
Energy Technology Data Exchange (ETDEWEB)
Tsou, C. [Peking Observatory, China]; Ch'en, S.; Ho, T.; Kuo, H.
1976-12-01
It is shown that a star model, which is black hole-free and singularity-free, can be obtained naturally in the gravitational gauge theory, provided the space-time is torsion-free and the matter is spinless. The conclusion in a sense shows that the discussions about the black hole and the singularity based on general relativity may not describe nature correctly.
Flipped classroom model for learning evidence-based medicine
Directory of Open Access Journals (Sweden)
Rucker SY
2017-08-01
Full Text Available Sydney Y Rucker,1 Zulfukar Ozdogan,1 Morhaf Al Achkar2 1School of Education, Indiana University, Bloomington, IN, 2Department of Family Medicine, School of Medicine, University of Washington, Seattle, WA, USA Abstract: Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute an improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice. Keywords: evidence-based medicine, flipped classroom, residency education
Directory of Open Access Journals (Sweden)
Paschoal Tadeu Russo
2012-04-01
Full Text Available The Balanced Scorecard (BSC) is a methodology that allows managers to define and implement a set of financial or nonfinancial indicators in a balanced way to assess an organization's performance from four viewpoints. Many companies are unsuccessful in their implementation of the BSC. This lack of success may be attributed to different factors, such as strategic problems, planning failures, and poorly defined targets and goals. However, the failed implementation may be attributed in part to the failure to institutionalize habits and routines. In this regard, the objective of this paper is to use institutional theory to determine whether the book Strategy in Action: Balanced Scorecard contains evidence that the BSC model proposed by the authors (Kaplan & Norton) includes elements that favor the model's institutionalization. For this purpose, a qualitative bibliographic survey was prepared. The survey revealed 404 clues that were rated according to Tolbert and Zucker's description of the processes inherent to institutionalization and to Scott's proposed framework of legitimation/legitimizing. These findings suggest that the book primarily legitimizes the BSC by examining organizations and describes it as an acknowledged management instrument. The aspects supporting the semi-institutional stage (26% of the findings) and the total institutionalization stage (10% of the findings) suggest that the authors intended to propose a tool without focusing on the institutionalization process, which may partly explain the great difficulty faced by companies attempting to implement this methodology.
Building Better Ecological Machines: Complexity Theory and Alternative Economic Models
Directory of Open Access Journals (Sweden)
Jess Bier
2016-12-01
Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimagined approaches: Agent-Based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.
Dentoni, D.; Ross, R.
2013-01-01
Part Two of our Special Issue on wicked problems in agribusiness, “Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector,” will contribute to four open questions in the broader fields of management and policy: why, when, which and
Neal, Andrew; Kwantes, Peter J
2009-04-01
The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
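The accumulation-to-a-collapsing-bound process described here can be sketched in a short simulation: evidence drifts stochastically toward one response, and the decision threshold shrinks over time to model increasing urgency. The drift, threshold, and collapse-rate values below are illustrative assumptions, not the fitted parameters from the experiments.

```python
import random

def accumulate(drift, start_threshold, collapse_rate, noise_sd=1.0,
               dt=0.01, max_time=10.0, rng=None):
    """Simulate one trial of an evidence-accumulation decision.
    Evidence follows a noisy random walk with the given drift; the
    symmetric threshold collapses linearly over time (urgency)."""
    rng = rng or random.Random()
    evidence, t = 0.0, 0.0
    while t < max_time:
        t += dt
        evidence += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        threshold = max(start_threshold - collapse_rate * t, 0.05)
        if abs(evidence) >= threshold:
            return ("conflict" if evidence > 0 else "no conflict", t)
    return ("no response", max_time)

rng = random.Random(42)  # seeded for reproducibility
trials = [accumulate(drift=1.0, start_threshold=2.0, collapse_rate=0.1,
                     rng=rng) for _ in range(500)]
rts = [t for resp, t in trials if resp == "conflict"]
hit_rate = len(rts) / len(trials)
```

Varying the drift (evidence quality, e.g. distance at closest approach) shifts both the response proportions and the response-time distribution, which is the joint prediction that distinguishes accumulation models from static signal detection theory.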
Noncommutative gauge theory and symmetry breaking in matrix models
International Nuclear Information System (INIS)
Grosse, Harald; Steinacker, Harold; Lizzi, Fedele
2010-01-01
We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c x SU(2)_L x U(1)_Q [resp. SU(3)_c x U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.
Inadequate Evidence for Multiple Intelligences, Mozart Effect, and Emotional Intelligence Theories
Waterhouse, Lynn
2006-01-01
I (Waterhouse, 2006) argued that, because multiple intelligences, the Mozart effect, and emotional intelligence theories have inadequate empirical support and are not consistent with cognitive neuroscience findings, these theories should not be applied in education. Proponents countered that their theories had sufficient empirical support, were…
Measuring and modeling salience with the theory of visual attention.
Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid
2017-08-01
For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
An integrated model of clinical reasoning: dual-process theory of cognition and metacognition.
Marcum, James A
2012-10-01
Clinical reasoning is an important component for providing quality medical care. The aim of the present paper is to develop a model of clinical reasoning that integrates both the non-analytic and analytic processes of cognition, along with metacognition. The dual-process theory of cognition (system 1 non-analytic and system 2 analytic processes) and the metacognition theory are used to develop an integrated model of clinical reasoning. In the proposed model, clinical reasoning begins with system 1 processes in which the clinician assesses a patient's presenting symptoms, as well as other clinical evidence, to arrive at a differential diagnosis. Additional clinical evidence, if necessary, is acquired and analysed utilizing system 2 processes to assess the differential diagnosis, until a clinical decision is made diagnosing the patient's illness and then how best to proceed therapeutically. Importantly, the outcome of these processes feeds back, in terms of metacognition's monitoring function, either to reinforce or to alter cognitive processes, which, in turn, enhances synergistically the clinician's ability to reason quickly and accurately in future consultations. The proposed integrated model has distinct advantages over other models proposed in the literature for explicating clinical reasoning. Moreover, it has important implications for addressing the paradoxical relationship between experience and expertise, as well as for designing a curriculum to teach clinical reasoning skills. © 2012 Blackwell Publishing Ltd.
Application of the evolution theory in modelling of innovation diffusion
Directory of Open Access Journals (Sweden)
Krstić Milan
2016-01-01
Full Text Available The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In that sense, the so-called 'memetic evolution' is today widely accepted. Memes represent a complex adaptive system, where one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information which can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area where the theory of evolution can also be successfully applied. In this work the authors have started from the assumption that it is also possible to apply the theory of evolution in the modelling of the process of innovation diffusion. Based on the conducted theoretical research, the authors conclude that the process of innovation diffusion, in the interpretation of a 'meme', is actually the process of imitation of the 'meme' of innovation. Since during the process of their replication certain 'memes' show greater success than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their manifestations are of key importance in the sense of their longevity, fruitfulness and faithful replication. The results of the conducted research have confirmed the assumption that the theory of evolution can be applied to innovation diffusion with the help of innovation 'memes', which opens up perspectives for new research on the subject.
H₃⁺ WZNW model from Liouville field theory
International Nuclear Information System (INIS)
Hikida, Yasuaki; Schomerus, Volker
2007-01-01
There exists an intriguing relation between genus zero correlation functions in the H₃⁺ WZNW model and in Liouville field theory. We provide a path integral derivation of the correspondence and then use our new approach to generalize the relation to surfaces of arbitrary genus g. In particular, we determine the correlation functions of N primary fields in the WZNW model explicitly through Liouville correlators with N+2g-2 additional insertions of certain degenerate fields. The paper concludes with a list of interesting further extensions and a few comments on the relation to the geometric Langlands program.
A model for hot electron phenomena: Theory and general results
International Nuclear Information System (INIS)
Carrillo, J.L.; Rodriquez, M.A.
1988-10-01
We propose a model for the description of hot electron phenomena in semiconductors. Based on this model we are able to reproduce accurately the main characteristics observed in experiments on electric field transport, optical absorption, steady-state photoluminescence and relaxation processes. Our theory contains no free or adjustable parameters, is computationally very fast, and incorporates the main collision mechanisms, including screening and phonon heating effects. Our description rests on a set of nonlinear rate equations in which the interactions are represented by coupling coefficients or effective frequencies. We calculate three coefficients from the characteristic constants and the band structure of the material. (author). 22 refs, 5 figs, 1 tab
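The rate-equation structure the abstract describes can be illustrated generically. The sketch below is a placeholder with invented variables and coefficients (g, nu_ep, nu_loss), not the paper's actual equations: two coupled nonlinear rate equations, with interactions carried by effective coupling frequencies, integrated to steady state with forward Euler.

```python
# Hedged, generic sketch (NOT the paper's equations): two coupled
# nonlinear rate equations with effective coupling frequencies,
# integrated with forward Euler until the populations reach steady state.
# g, nu_ep and nu_loss are illustrative placeholders.

def step(n_e, n_ph, dt, g=1.0, nu_ep=0.5, nu_loss=0.2):
    """One Euler step: carriers (n_e) feed phonons (n_ph), which decay."""
    emission = nu_ep * n_e * (1.0 + n_ph)        # nonlinear coupling term
    dn_e = g - emission                          # generation minus emission
    dn_ph = emission - nu_loss * n_ph            # heating minus decay
    return n_e + dt * dn_e, n_ph + dt * dn_ph

n_e, n_ph = 0.0, 0.0
for _ in range(50000):                           # 50 time units at dt = 1e-3
    n_e, n_ph = step(n_e, n_ph, dt=1e-3)

# At steady state, generation balances phonon losses: g = nu_loss * n_ph.
print(n_e, n_ph)
```

With these placeholder values the analytic steady state is n_ph = g/nu_loss = 5 and n_e = g/(nu_ep (1 + n_ph)) = 1/3, which the integration approaches.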
A possibilistic uncertainty model in classical reliability theory
International Nuclear Information System (INIS)
De Cooman, G.; Capelle, B.
1994-01-01
The authors argue that a possibilistic uncertainty model can be used to represent linguistic uncertainty about the states of a system and of its components. Furthermore, the basic properties of the application of this model to classical reliability theory are studied. The notion of the possibilistic reliability of a system or a component is defined. Based on the concept of a binary structure function, the important notion of a possibilistic function is introduced. It allows one to calculate the possibilistic reliability of a system in terms of the possibilistic reliabilities of its components.
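As a rough illustration of the idea, assuming the standard min/max extension of a series-parallel structure function (the details of the paper's construction may differ):

```python
# Hedged sketch: possibilistic reliability of a series-parallel system.
# Components carry possibility degrees of working; the binary structure
# function is extended with min (series/AND) and max (parallel/OR),
# which is the standard possibilistic counterpart of the probabilistic
# product/complement rules. Values below are illustrative.

def series(*poss):
    """Possibilistic reliability of components in series: min of the parts."""
    return min(poss)

def parallel(*poss):
    """Possibilistic reliability of redundant components: max of the parts."""
    return max(poss)

# System: component A in series with a redundant pair (B parallel to C).
pi_a, pi_b, pi_c = 0.9, 0.6, 0.8
system = series(pi_a, parallel(pi_b, pi_c))
print(system)  # 0.8
```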
Theory and Circuit Model for Lossy Coaxial Transmission Line
Energy Technology Data Exchange (ETDEWEB)
Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert
2017-04-01
The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
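For orientation, the quantities involved admit classical low-loss textbook estimates. The sketch below uses the standard skin-effect formulas, not the paper's refined ones; the geometry and resistivity values are illustrative.

```python
import math

# Hedged sketch (textbook low-loss approximations, NOT the paper's refined
# formulas): characteristic impedance and skin-effect conductor attenuation
# of a coaxial line with inner radius a and outer radius b, in metres.
MU0 = 4e-7 * math.pi      # vacuum permeability, H/m
ETA0 = 376.730            # impedance of free space, ohms

def coax_impedance(a, b, eps_r=1.0):
    """Characteristic impedance of a lossless coax (ohms)."""
    return ETA0 / (2 * math.pi * math.sqrt(eps_r)) * math.log(b / a)

def conductor_attenuation(a, b, f, rho=1.7e-8, eps_r=1.0):
    """Skin-effect attenuation in Np/m; rho = conductor resistivity (ohm*m)."""
    rs = math.sqrt(math.pi * f * MU0 * rho)          # surface resistance
    r_per_m = rs / (2 * math.pi) * (1 / a + 1 / b)   # series resistance per metre
    return r_per_m / (2 * coax_impedance(a, b, eps_r))

# Illustrative geometry (roughly RG-58-like conductors, copper, 1 GHz):
print(coax_impedance(0.45e-3, 1.475e-3))         # about 71 ohms in air
print(conductor_attenuation(0.45e-3, 1.475e-3, 1e9))
```

These closed forms are what the paper's DC-to-100-GHz comparison against exact field solutions is meant to improve upon at high frequency.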
Models and applications of chaos theory in modern sciences
Zeraoulia, Elhadj
2011-01-01
This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society-highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli
Dube, Chad; Starns, Jeffrey J.; Rotello, Caren M.; Ratcliff, Roger
2012-01-01
A classic question in the recognition memory literature is whether retrieval is best described as a continuous-evidence process consistent with signal detection theory (SDT), or a threshold process consistent with many multinomial processing tree (MPT) models. Because receiver operating characteristics (ROCs) based on confidence ratings are…
Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.
2005-01-01
When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes, which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and have to ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored
Remarks on “A new non-specificity measure in evidence theory based on belief intervals”
Directory of Open Access Journals (Sweden)
Joaquín ABELLÁN
2018-03-01
Full Text Available Two types of uncertainty co-exist in the theory of evidence: discord and non-specificity. Since the 1990s, many mathematical expressions have arisen to quantify these two parts in a body of evidence. An important aspect of each measure presented is the verification of a coherent set of properties. Regarding non-specificity, so far only one measure verifies an important set of those properties. Very recently, a new measure of non-specificity based on belief intervals has been presented as an alternative measure that quantifies a similar set of properties (Yang et al., 2016). It is shown that the new measure really does not verify two of those important properties. Some errors have been found in the corresponding proofs in the original publication. Keywords: Additivity, Imprecise probabilities, Non-specificity, Subadditivity, Theory of evidence, Uncertainty measures
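For readers unfamiliar with the objects involved, the sketch below computes belief/plausibility intervals from a mass function and the classical non-specificity measure N(m) = Σ_A m(A) log₂|A| — the one measure alluded to above as verifying the required properties. It is not the belief-interval-based measure the note critiques; the example mass values are invented.

```python
from math import log2

# Hedged sketch: belief and plausibility from a Dempster-Shafer mass
# function on a finite frame, plus the classical non-specificity measure
# N(m) = sum over focal sets A of m(A) * log2|A|.

def belief(mass, event):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(m for focal, m in mass.items() if focal <= event)

def plausibility(mass, event):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(m for focal, m in mass.items() if focal & event)

def nonspecificity(mass):
    """Classical non-specificity: mass-weighted log-size of focal sets."""
    return sum(m * log2(len(focal)) for focal, m in mass.items())

# Frame {a, b, c}; mass on a singleton, a pair, and the whole frame.
m = {frozenset('a'): 0.5, frozenset('bc'): 0.3, frozenset('abc'): 0.2}
A = frozenset('ab')
print(belief(m, A), plausibility(m, A))  # 0.5 1.0 -- the belief interval
print(nonspecificity(m))
```

The interval [Bel(A), Pl(A)] is exactly the kind of belief interval on which the disputed alternative measure is built.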
Refined pipe theory for mechanistic modeling of wood development.
Deckmyn, Gaby; Evans, Sam P; Randle, Tim J
2006-06-01
We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
Toward a General Research Process for Using Dubin's Theory Building Model
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
Two problems from the theory of semiotic control models. I. Representations of semiotic models
Energy Technology Data Exchange (ETDEWEB)
Osipov, G S
1981-11-01
Two problems from the theory of semiotic control models are stated, in particular the representation of models and their semantic analysis. Algebraic representation of semiotic models, covering of representations, and their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.
O'Campo, Patricia; Urquia, Marcelo
2012-12-01
There is increasing interest in the study of the social determinants of maternal and child health. While there has been growth in the theory and empirical evidence about social determinants, less attention has been paid to the kind of modeling that should be used to understand the impact of social exposures on well-being. We analyzed data from the nationwide 2006 Canadian Maternity Experiences Survey to compare the pervasive disease-specific model to a model that captures the generalized health impact (GHI) of social exposures, namely low socioeconomic position. The GHI model uses a composite of adverse conditions that stem from low socioeconomic position: adverse birth outcomes, postpartum depression, severe abuse, stressful life events, and hospitalization during pregnancy. Adjusted prevalence ratios and 95% confidence intervals from disease-specific models for low income (social determinants of health.
General topology meets model theory, on p and t.
Malliaris, Maryanthe; Shelah, Saharon
2013-08-13
Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258-262] that the continuum is uncountable, and Hilbert's first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220-224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143-1148], Hilbert's first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen's introduction of forcing. The oldest and perhaps most famous of these is whether "p = t," which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29-46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241-255]. In this paper we explain how our work on the structure of Keisler's order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory.
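For orientation, the two cardinal invariants at issue can be stated as follows; these are the standard definitions, not text from the article. Every tower has the strong finite intersection property, so 𝔭 ≤ 𝔱 is immediate, and the article's result is the reverse inequality in ZFC.

```latex
% Standard definitions (not quoted from the paper). Both invariants
% concern families of infinite subsets of the natural numbers, with
% A \subseteq^* B meaning A \setminus B is finite ("almost inclusion");
% an infinite A with A \subseteq^* B for all B in the family is called
% a pseudo-intersection.
\[
\mathfrak{p} \;=\; \min\bigl\{\, |\mathcal{F}| \;:\; \mathcal{F}
  \text{ has the strong finite intersection property}
  \text{ but no pseudo-intersection} \,\bigr\}
\]
\[
\mathfrak{t} \;=\; \min\bigl\{\, |\mathcal{T}| \;:\; \mathcal{T}
  \text{ is a } \subseteq^*\text{-decreasing tower}
  \text{ with no pseudo-intersection} \,\bigr\}
\]
```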
Electron-plasmon model in the electron liquid theory
Directory of Open Access Journals (Sweden)
M.V.Vavrukh
2005-01-01
Full Text Available Here we propose an accurate approach to the description of the electron liquid model in terms of electrons and plasmons. Our ideas in the present paper are close to the conception of the collective variables developed in the papers of Bohm and Pines. However, we use a different mathematical apparatus in the transition to the expanded space of particle and plasmon variables, realized by the transition operator. It is evident that in the Random Phase Approximation (RPA), the model which consists of two interacting subsystems of electrons and plasmons is equivalent to the electron liquid model with Coulomb interaction.
DEFF Research Database (Denmark)
Plant, Peter
2012-01-01
Quality assurance and evidence in career guidance in Europe are often seen as self-evident approaches, but particular interests lie behind...
Theory and evidence for using the economy-of-scale law in power plant economics
International Nuclear Information System (INIS)
Phung, D.L.
1987-05-01
This report compiles theory and evidence for the use of the economy-of-scale law in energy economics, particularly in the estimation of capital costs for coal-fired and nuclear power plants. The economy-of-scale law is widely used in its simplest form: cost is directly proportional to capacity raised to an exponent. An additive constant is an important component that is not generally taken into account. Also, the economy of scale is perforce valid only over a limited size range. The majority of engineering studies have estimated an economy of scale exponent of 0.7 to 0.9 for coal-fired plants and an exponent of 0.4 to 0.6 for nuclear plants in the capacity ranges of 400 to 1000 MWe. However, the majority of econometric analyses found little or no economy of scale for coal-fired plants and only a slight economy of scale for nuclear plants. This disparity is explained by the fact that economists have included regulatory and time-related costs in addition to the direct and indirect costs used by the engineers. Regulatory and time-related costs have become an increasingly larger portion of total costs during the last decade. In addition, these costs appeared to have either a very small economy of scale or to be increasing as the size of the power plant increased. We conclude that gains in economy of scale can only be made by reducing regulatory and time-related costs through design standardization and regulatory stability, in combination with more favorable economic conditions. 59 refs
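The two functional forms the report contrasts — the simple power law and the version with the additive constant it says is often ignored — can be sketched as follows; all numeric values are illustrative, not the report's estimates.

```python
# Hedged sketch of the economy-of-scale law in its two forms.
# All parameter values are illustrative, not the report's estimates.

def cost_simple(capacity, b=1.0, n=0.8):
    """Simplest form: cost proportional to capacity**n (n < 1 means economy of scale)."""
    return b * capacity ** n

def cost_with_constant(capacity, a=200.0, b=1.0, n=0.8):
    """Form with the additive constant that is often left out."""
    return a + b * capacity ** n

# Unit cost falls with plant size under n < 1; the additive constant makes
# the apparent economy of scale stronger at small capacities.
for k in (400, 700, 1000):   # MWe, the range discussed in the report
    print(k, cost_simple(k) / k, cost_with_constant(k) / k)
```

Engineering exponents of 0.4-0.9, as quoted above, correspond to n in these formulas; the econometric finding of "little or no economy of scale" amounts to estimating n near 1 once regulatory and time-related costs are folded in.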
Item level diagnostics and model - data fit in item response theory ...
African Journals Online (AJOL)
Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed for further use of items and models that best fit ...
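As a minimal illustration of the item-level modeling mentioned above, the sketch below implements the standard two-parameter logistic (2PL) item response function; it is a generic textbook form, not a model taken from the article.

```python
import math

# Hedged sketch: the two-parameter logistic (2PL) IRT model.
# a = discrimination, b = difficulty (standard IRT conventions,
# not parameters from the article).

def p_correct(theta, a, b):
    """2PL probability that an examinee with ability theta answers the item correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Predicted response curve for an illustrative item (a=1.2, b=0.5)
# across a range of ability levels; item fit is assessed by comparing
# such predictions with observed proportions correct.
for theta in (-2, -1, 0, 1, 2):
    print(theta, round(p_correct(theta, 1.2, 0.5), 3))
```

At theta = b the predicted probability is exactly 0.5, which is why b is read as the item's difficulty.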