Modeling Multivariate Volatility Processes: Theory and Evidence
Directory of Open Access Journals (Sweden)
Jelena Z. Minovic
2009-05-01
This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes, surveying both model specifications and estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC; Bollerslev, 1990), and the Dynamic Conditional Correlation model (DCC; Tse and Tsui, 2002; Engle, 2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them across restricted versions of the BEKK, DVEC and CCC representations. Parameters are estimated by maximum likelihood (in the BEKK and DVEC models) and by a two-step approach (in the CCC model).
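The two-step CCC approach mentioned above can be sketched in a few lines: fit a univariate GARCH(1,1) to each return series, then compute the constant correlation matrix of the standardized residuals. This is a minimal illustration assuming Gaussian innovations, not the author's exact estimator; the starting values and function names are my own.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, r):
    # Negative Gaussian log-likelihood of GARCH(1,1): h_t = w + a*r_{t-1}^2 + b*h_{t-1}
    w, a, b = params
    h = np.empty_like(r)
    h[0] = np.var(r)
    for t in range(1, len(r)):
        h[t] = w + a * r[t - 1] ** 2 + b * h[t - 1]
    return 0.5 * np.sum(np.log(h) + r ** 2 / h)

def fit_ccc(returns):
    """Two-step CCC: univariate GARCH(1,1) fits per series, then the
    (constant) sample correlation of the standardized residuals."""
    std_resid = []
    for r in returns.T:
        res = minimize(garch11_neg_loglik,
                       x0=[0.1 * np.var(r), 0.05, 0.90], args=(r,),
                       bounds=[(1e-8, None), (0.0, 1.0), (0.0, 1.0)])
        w, a, b = res.x
        h = np.empty_like(r)
        h[0] = np.var(r)
        for t in range(1, len(r)):
            h[t] = w + a * r[t - 1] ** 2 + b * h[t - 1]
        std_resid.append(r / np.sqrt(h))  # standardize by conditional st. dev.
    return np.corrcoef(np.array(std_resid))  # constant conditional correlation

rng = np.random.default_rng(0)
rets = rng.standard_normal((500, 2)) * 0.01  # two synthetic return series
R = fit_ccc(rets)
```

The two-step structure is what makes CCC cheap relative to BEKK or DVEC: the multivariate problem decomposes into univariate likelihoods plus one correlation estimate.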
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-18
Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence is highly conflicting, it may produce counterintuitive results. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is presented to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods.
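The fusion step underlying such methods is Dempster's rule of combination, which multiplies masses on intersecting focal elements and renormalizes away the conflict. A minimal sketch follows; the fault labels and mass values are invented for illustration, and the paper's weighted-averaging modification is not shown.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule; the conflict mass K is renormalized away."""
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            C = A & B
            if C:  # intersecting focal elements reinforce each other
                combined[C] = combined.get(C, 0.0) + mA * mB
            else:  # empty intersection contributes to conflict K
                conflict += mA * mB
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Two (invented) sensor reports over the fault frame {F1, F2, F3}
m1 = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.3,
      frozenset({"F1", "F2", "F3"}): 0.1}
m2 = {frozenset({"F1"}): 0.5, frozenset({"F2"}): 0.3,
      frozenset({"F1", "F2", "F3"}): 0.2}
fused = dempster_combine(m1, m2)
```

The counterintuitive behavior the paper addresses arises when the conflict term approaches 1, so that renormalization amplifies tiny residual agreements.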
Selection Bias in Educational Transition Models: Theory and Empirical Evidence
DEFF Research Database (Denmark)
Holm, Anders; Jæger, Mads
Most studies using Mare’s (1980, 1981) seminal model of educational transitions find that the effect of family background decreases across transitions. Recently, Cameron and Heckman (1998, 2001) have argued that the “waning coefficients” in the Mare model are driven by selection on unobserved variables. This paper, first, explains theoretically how selection on unobserved variables leads to waning coefficients and, second, illustrates empirically how selection leads to biased estimates of the effect of family background on educational transitions. Our empirical analysis, using data from the United States, United Kingdom, Denmark, and the Netherlands, shows that when we take selection into account the effect of family background variables on educational transitions is largely constant across transitions. We also discuss several difficulties in estimating educational transition models…
Energy Technology Data Exchange (ETDEWEB)
Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
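The core of a sampling-based propagation can be sketched as follows: for each focal element of the uncertain input, push samples through the model and use them to decide whether the mapped element lies entirely inside the output set of interest (contributing to belief) or merely intersects it (contributing to plausibility). This is a simplified one-input illustration with an invented model, not the strategy presented in the report; note that finite sampling only approximates the true infimum/supremum over each focal element.

```python
import numpy as np

def bel_pl(focal_elements, f, threshold, n=200, seed=1):
    """Sampling-based belief/plausibility that f(x) <= threshold.
    focal_elements: list of ((lo, hi), bpa) interval/mass pairs."""
    rng = np.random.default_rng(seed)
    bel = pl = 0.0
    for (lo, hi), bpa in focal_elements:
        y = f(rng.uniform(lo, hi, n))   # samples within the focal element
        if np.all(y <= threshold):      # element contained in the set -> Bel
            bel += bpa
        if np.any(y <= threshold):      # element intersects the set  -> Pl
            pl += bpa
    return bel, pl

# Invented evidence structure on an input x, model f(x) = x^2
focals = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
bel, pl = bel_pl(focals, lambda x: x ** 2, threshold=1.0)
```

The computational burden the report targets is visible even here: every focal element requires its own set of model evaluations, which is what makes naive Monte Carlo impracticable for expensive models.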
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model-theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Skolem functions…
A Modified Model of Failure Mode and Effects Analysis Based on Generalized Evidence Theory
Directory of Open Access Journals (Sweden)
Deyun Zhou
2016-01-01
Due to incomplete knowledge, how to handle uncertain risk factors in failure mode and effects analysis (FMEA) is still an open issue. This paper proposes a new generalized evidential FMEA (GEFMEA) model to handle uncertain risk factors that may not be included in the conventional FMEA model. In GEFMEA, not only the conventional risk factors (the occurrence, severity, and detectability of the failure mode) but also other, incomplete risk factors are taken into consideration. In addition, the relative importance among all these risk factors is well addressed in the proposed method. GEFMEA is based on generalized evidence theory, which is efficient in handling incomplete information in the open world. The efficiency and merits of the proposed method are verified by a numerical example and a real case study on aircraft turbine rotor blades.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Creemers, B.P.M.; Kyriakides, L.
2010-01-01
This paper refers to a dynamic perspective of educational effectiveness and improvement stressing the importance of using an evidence-based and theory-driven approach. Specifically, an approach to school improvement based on the dynamic model of educational effectiveness is offered. The recommended…
EARLY WARNING MODEL OF NETWORK INTRUSION BASED ON D-S EVIDENCE THEORY
Institute of Scientific and Technical Information of China (English)
Tian Junfeng; Zhai Jianqiang; Du Ruizhong; Huang Jiancai
2005-01-01
Application of data fusion techniques in intrusion detection is the trend of the next-generation Intrusion Detection System (IDS). In network security, adopting security early warning techniques makes it feasible to defend effectively against attacks and attackers. To do this, correlative information provided by the IDS must be gathered, and the current intrusion characteristics and situation must be analyzed and estimated. This paper applies D-S evidence theory to a distributed intrusion detection system for fusing information from detection centers, clarifying the intrusion situation, and thereby improving the early warning capability and detection efficiency of the IDS.
Watson, Jean; Foster, Roxie
2003-05-01
This paper presents a proposed model: The Attending Nursing Caring Model (ANCM) as an exemplar for advancing and transforming nursing practice within a reflective, theoretical and evidence-based context. Watson's theory of human caring is used as a guide for integrating theory, evidence and advanced therapeutics in the area of children's pain. The ANCM is offered as a programme for renewing the profession and its professional practices of caring-healing arts and science, during an era of decline, shortages, and crises in care, safety, and hospital and health reform. The ANCM elevates contemporary nursing's caring values, relationships, therapeutics and responsibilities to a higher/deeper order of caring science and professionalism, intersecting with other professions, while sustaining the finest of its heritage and traditions of healing.
An Imprecise Probability Model for Structural Reliability Based on Evidence and Gray Theory
Directory of Open Access Journals (Sweden)
Bin Suo
2013-01-01
To avoid the shortcomings and limitations of probabilistic and non-probabilistic reliability models for structural reliability analysis in the case of limited samples for basic variables, a new imprecise probability model is proposed. A confidence interval at a given confidence level is calculated on the basis of small samples by gray theory, which does not depend on the distribution pattern of the variable. Basic probability assignments and focal elements are then constructed, and approximation methods for structural reliability based on belief and plausibility functions are proposed for the cases where the structural limit state function is monotonic and non-monotonic, respectively. The numerical examples show that the new reliability model utilizes all the information included in small samples and considers both the aleatory and epistemic uncertainties in them; thus it can rationally measure the safety of the structure, and the measurement becomes more accurate as the sample size increases.
Segmentation of respiratory signals by evidence theory.
Belghith, Akram; Collet, Christophe
2009-01-01
This paper presents an evidential segmentation scheme for respiratory signals aimed at the detection of wheezing sounds. The segmentation is based on modeling the data with evidence theory, which is well suited to representing such uncertain and imprecise data. We focus in particular on modeling the data imprecision using fuzzy theory. The result of this modeling is then used to define the mass functions. The effectiveness of the method is demonstrated on synthetic and real signals.
GARCH Option Valuation: Theory and Evidence
DEFF Research Database (Denmark)
Christoffersen, Peter; Jacobs, Kris; Ornthanalai, Chayawat
We survey the theory and empirical evidence on GARCH option valuation models. Our treatment includes the range of functional forms available for the volatility dynamic, multifactor models, and nonnormal shock distributions, as well as the styles of pricing kernels typically used. Various strategies…
Directory of Open Access Journals (Sweden)
Dongxiao Niu
2012-01-01
Because clean energy and traditional energy have different advantages and disadvantages, it is of great significance to evaluate the comprehensive benefits of hybrid power systems. Based on a thorough analysis of the important characteristics of hybrid power systems, an index system including security, economic benefit, environmental benefit, and social benefit is established in this paper. Owing to its advantages in processing abundant uncertain and fuzzy information, the vague set is used to determine the decision matrix. The vague decision matrix is converted to a real one by the vague combination rule, the uncertainty degrees of the different indexes are determined by grey incidence analysis, and the mass functions of the different comment sets for each index are then obtained. Information can be fused in accordance with the Dempster-Shafer (D-S) combination rule, and the evaluation result is obtained by the vague set and D-S evidence theory. A simulation of a hybrid power system including thermal power, wind power, and photovoltaic power in China is provided to demonstrate the effectiveness and potential of the proposed design scheme. It can be clearly seen that the uncertainties in decision making can be dramatically decreased compared with existing methods in the literature. The implementation results illustrate that the proposed index system and evaluation model based on the vague set and D-S evidence theory are effective and practical for evaluating the comprehensive benefit of hybrid power systems.
Mangani, P
2011-01-01
This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.
Combined trust model based on evidence theory in iterated prisoner's dilemma game
Chen, Bo; Zhang, Bin; Zhu, Weidong
2011-01-01
In the iterated prisoner's dilemma game, agents often play defection based on mutual distrust for the sake of their own benefits. However, most game strategies and mechanisms are limited for strengthening cooperative behaviour in the current literature, especially in noisy environments. In this article, we construct a combined trust model by combining the locally owned information and the recommending information from other agents and develop a combined trust strategy in the iterated prisoner's dilemma game. The proposed game strategy can provide not only a higher payoff for agents, but also a trust mechanism for the system. Furthermore, agents can form their own reputation evaluations upon their opponents and make more rational and precise decisions under our framework. Simulations of application are performed to show the performance of the proposed strategy in noise-free and noisy environments.
Theory- and evidence-based Intervention
DEFF Research Database (Denmark)
Nissen, Poul
2011-01-01
In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence-based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning, which includes recognition of the influence of community, school, peers, family and the functional and structural domains of personality at the behavioural, phenomenological, intra-psychic and biophysical levels in a dialectical process. One important aspect of the theoretical basis for the presentation of this model is that the child…
Bass, Randy; Linkon, Sherry Lee
2008-01-01
While some literary scholars claim that their discipline's research practices do not fit the scholarship of teaching and learning, close reading--the signature critical practice of literary studies--provides a useful model. Close reading involves not only attention to the text but also the integration of text and theory. This article analyzes how…
Belegradek, OV
1999-01-01
This volume is a collection of papers on model theory and its applications. The longest paper, "Model Theory of Unitriangular Groups" by O. V. Belegradek, forms a subtle general theory behind Mal'tsev's famous correspondence between rings and groups. This is the first published paper on the topic. Given the present model-theoretic interest in algebraic groups, Belegradek's work is of particular interest to logicians and algebraists. The rest of the collection consists of papers on various questions of model theory, mainly on stability theory. Contributors are leading Russian researchers in the…
Computational mate choice: theory and empirical evidence.
Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo
2012-06-01
The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process, and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers, and we derive a formal model of the DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules) and review the normative models of mate sampling tactics associated with them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. Since they have a robust biological basis, we think they may represent a useful theoretical tool for…
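The sequential-sampling idea in the third step can be sketched as a simple random walk to threshold: a decision variable accumulates noisy evidence until it crosses a bound. This is a generic textbook illustration, not the authors' specific model; the drift, threshold, and noise values are invented.

```python
import numpy as np

def sequential_choice(drift, threshold, noise=1.0, dt=0.01, max_t=10.0, seed=0):
    """Accumulate noisy evidence (Euler steps of a drift-diffusion walk)
    until |DV| crosses the threshold; returns (choice, decision_time)."""
    rng = np.random.default_rng(seed)
    dv, t = 0.0, 0.0
    while abs(dv) < threshold and t < max_t:
        dv += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if dv >= threshold else 0), t

# With positive drift (evidence favouring option 1), choice 1 dominates
choices = [sequential_choice(drift=1.5, threshold=1.0, seed=s)[0]
           for s in range(200)]
```

Even this toy version exhibits the features the review highlights: stronger evidence (larger drift) yields faster and more consistent choices, blurring the line between absolute and comparative decision rules.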
Critical evidence for the prediction error theory in associative learning.
Terao, Kanta; Matsumoto, Yukihisa; Mizunami, Makoto
2015-03-10
In associative learning in mammals, it is widely accepted that the discrepancy, or error, between actual and predicted reward determines whether learning occurs. Complete evidence for the prediction error theory, however, has not been obtained in any learning systems: Prediction error theory stems from the finding of a blocking phenomenon, but blocking can also be accounted for by other theories, such as the attentional theory. We demonstrated blocking in classical conditioning in crickets and obtained evidence to reject the attentional theory. To obtain further evidence supporting the prediction error theory and rejecting alternative theories, we constructed a neural model to match the prediction error theory, by modifying our previous model of learning in crickets, and we tested a prediction from the model: the model predicts that pharmacological intervention of octopaminergic transmission during appetitive conditioning impairs learning but not formation of reward prediction itself, and it thus predicts no learning in subsequent training. We observed such an "auto-blocking", which could be accounted for by the prediction error theory but not by other competitive theories to account for blocking. This study unambiguously demonstrates validity of the prediction error theory in associative learning.
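The prediction-error account of blocking can be illustrated with the classic Rescorla-Wagner update rule, in which each cue's associative strength changes in proportion to the discrepancy between actual and predicted reward. This is a textbook model given for illustration, not the authors' cricket circuit model; the learning rate and trial counts are invented.

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Rescorla-Wagner learning: every cue present on a trial is updated by
    alpha * (outcome - summed prediction). trials = [(cues, rewarded)]."""
    V = {}
    for cues, rewarded in trials:
        pred = sum(V.get(c, 0.0) for c in cues)
        error = (lam if rewarded else 0.0) - pred  # the prediction error
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Blocking: pretrain A -> reward, then train the compound AB -> reward.
pretraining = [(("A",), True)] * 30
compound    = [(("A", "B"), True)] * 30
V = rescorla_wagner(pretraining + compound)
# B acquires little strength because A already predicts the reward
```

The "auto-blocking" result described above follows the same logic: if reward prediction is intact but the error signal is pharmacologically suppressed, subsequent training produces no learning.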
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module theory…
Bayesian Evidence and Model Selection
Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben
2014-01-01
In this paper we review the concept of Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate, and numerical techniques. Applications to several practical examples within the context of signal processing are discussed.
Theory Modeling and Simulation
Energy Technology Data Exchange (ETDEWEB)
Shlachter, Jack [Los Alamos National Laboratory
2012-08-23
Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.
Wage Dynamics: Reconciling Theory and Evidence
Olivier Jean Blanchard; Lawrence Katz
1999-01-01
U.S. macroeconomic evidence shows a negative relation between the rate of change of wages and unemployment. In contrast, most theories of wage determination imply a negative relation between the level of wages and unemployment. In this paper, we ask whether one can reconcile the empirical evidence with theoretical wage relations. We reach three main conclusions. First, we derive the condition under which the two can indeed be reconciled. We show the constraints that such a condition imposes on…
Some empirical evidence for ecological dissonance theory.
Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E
2000-04-01
Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions which were tested in 10 studies. This summary of the studies suggests operationally defined measures of ecological dissonance may correlate with workers' satisfaction with their jobs, involvement with their jobs, alienation from their work, and to a lesser extent, workers' conflict resolution behavior and communication style.
Anomalous Evidence, Confidence Change, and Theory Change.
Hemmerich, Joshua A; Van Voorhis, Kellie; Wiley, Jennifer
2016-08-01
A novel experimental paradigm that measured theory change and confidence in participants' theories was used in three experiments to test the effects of anomalous evidence. Experiment 1 varied the amount of anomalous evidence to see if "dose size" made incremental changes in confidence toward theory change. Experiment 2 varied whether anomalous evidence was convergent (of multiple types) or replicating (similar finding repeated). Experiment 3 varied whether participants were provided with an alternative theory that explained the anomalous evidence. All experiments showed that participants' confidence changes were commensurate with the amount of anomalous evidence presented, and that larger decreases in confidence predicted theory changes. Convergent evidence and the presentation of an alternative theory led to larger confidence change. Convergent evidence also caused more theory changes. Even when people do not change theories, factors pertinent to the evidence and alternative theories decrease their confidence in their current theory and move them incrementally closer to theory change.
Demographic evidence for adaptive theories of aging.
Mitteldorf, J J
2012-07-01
Pleiotropic theories for the evolutionary origins of senescence have been ascendant for forty years (see, for example, G. Williams (1957) Evolution, 11, 398-411; T. Kirkwood (1977) Nature, 270, 301-304), and it is not surprising that interpreters of demographic data seek to frame their results in this context. But some of that evidence finds a much more natural explanation in terms of adaptive aging. Here we re-interpret the 1997 results of the Centenarian Study in Boston, which found in their sample of centenarian women an excess of late childbearing. The finding was originally interpreted as a selection effect: a metabolic link between late menopause and longevity. But we demonstrate that this interpretation is statistically strained, and that the data in fact indicate a causal link: bearing a child late in life induces a metabolic response that promotes longevity. This conclusion directly contradicts some pleiotropic theories of aging that postulate a "cost of reproduction", and it supports theories of aging as an adaptive genetic program.
Noncommutative Gauge Theories: Model for Hodge theory
Upadhyay, Sudhaker
2013-01-01
The nilpotent BRST, anti-BRST, dual-BRST and anti-dual-BRST symmetry transformations are constructed in the context of noncommutative (NC) 1-form as well as 2-form gauge theories. The corresponding Noether's charges for these symmetries on the Moyal plane are shown to satisfy the same algebra as by the de Rham cohomological operators of differential geometry. The Hodge decomposition theorem on compact manifold is also studied. We show that noncommutative gauge theories are field theoretic models for Hodge theory.
Assistive Technology in Australia: Integrating theory and evidence into action.
Steel, Emily J; Layton, Natasha A
2016-12-01
Occupational therapists use a range of strategies to influence the relationship between person, environment and occupation and facilitate people's participation and inclusion in society. Technology is a fundamental environmental factor capable of enabling inclusion, and occupational therapy models articulate a role for assistive technology (AT) devices and services, but there is a gap between theory, research and practice. The context of AT provision in Australia presents systemic barriers that prevent optimal application of AT devices and services for societal health promotion and in individualised solutions. The Integrating Theory, Evidence and Action method (ITEA) was used to answer the question 'How can occupational therapy support AT provision to enable older people and people with disability?' A wide range of sources were systematically analysed to explore the complexities of AT provision in Australia. The International Classification of Functioning, Disability and Health (ICF) and IMPACT(2) model are used as frameworks to reconstruct evidence into statements that summarise the theory, process and outcomes of AT provision. Analysis of the influence of the global disability rights and local policies and AT provision systems is used to highlight important aspects for occupational therapists to consider in research and practice. Pragmatic recommendations are provided to enable practitioners to translate theory and evidence into action. AT provision can be improved by focusing on evidence for and congruence between theory, process and outcomes, rather than isolated interventions. Occupational therapists should consider the influence of contextual factors on practice, and work with consumers to improve access and equity in AT provision systems. © 2016 Occupational Therapy Australia.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Jonas Olson's Evidence for Moral Error Theory
Evers, Daan
2016-01-01
Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although
Alternative banking: theory and evidence from Europe
Directory of Open Access Journals (Sweden)
Kurt Von Mettenheim
2012-12-01
Since financial liberalization in the 1980s, non-profit-maximizing, stakeholder-oriented banks have outperformed private banks in Europe. This article draws on empirical research, banking theory, and theories of the firm to explain this apparent anomaly for neo-liberal policy and contemporary market-based banking theory. The realization of competitive advantages by alternative banks (savings banks, cooperative banks, and development banks) has significant implications for conceptions of bank change, regulation, and political economy.
Modification of evidence theory based on feature extraction
Institute of Scientific and Technical Information of China (English)
DU Feng; SHI Wen-kang; DENG Yong
2005-01-01
Although evidence theory has been widely used in information fusion because of its effectiveness in uncertainty reasoning, classical D-S evidence theory exhibits counterintuitive behavior when highly conflicting information exists. Many modification methods have been developed, and they fall into two kinds of ideas: modifying the combination rule or modifying the evidence sources. In order to make the modification more reasonable and more effective, this paper first gives a thorough analysis of some typical existing modification methods and then extracts the intrinsic features of the evidence sources by using evidence distance theory. Based on the extracted features, two modification schemes of evidence theory, corresponding to the two modification ideas, are proposed. The results of numerical examples demonstrate the good performance of both schemes when combining evidence sources with highly conflicting information.
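Distance-based feature extraction of this kind typically rests on a metric between mass functions; one widely used choice is the distance of Jousselme et al., which weights the difference vector by the Jaccard similarity of focal elements. A minimal sketch follows; whether the paper uses exactly this metric is my assumption, and the example masses are invented.

```python
import numpy as np

def jousselme_distance(m1, m2):
    """Jousselme distance between two mass functions (dicts mapping
    frozenset -> mass): sqrt(0.5 * (v1-v2)^T D (v1-v2)), where
    D[A][B] = |A & B| / |A | B| (Jaccard similarity of focal elements)."""
    focals = sorted(set(m1) | set(m2), key=lambda s: (len(s), sorted(s)))
    v1 = np.array([m1.get(A, 0.0) for A in focals])
    v2 = np.array([m2.get(A, 0.0) for A in focals])
    D = np.array([[len(A & B) / len(A | B) for B in focals] for A in focals])
    diff = v1 - v2
    return float(np.sqrt(0.5 * diff @ D @ diff))

mA = {frozenset({"a"}): 1.0}
mB = {frozenset({"b"}): 1.0}
d = jousselme_distance(mA, mB)  # disjoint singletons are maximally distant
```

A small distance between two sources indicates mutual support, so in distance-based modifications it is a natural ingredient of a source's credibility weight.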
Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory
Directory of Open Access Journals (Sweden)
Yafei Song
2015-01-01
Full Text Available Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal well with uncertainty and vagueness, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the aim of supporting rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
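In the classical (non-fuzzy) setting, one standard bridge from belief functions to probabilities is the pignistic transformation, which divides each focal set's mass evenly among its elements. The sketch below is a small illustration with hypothetical masses, not the estimation method of the paper:

```python
def pignistic(m):
    """Pignistic transform BetP: split each focal set's mass evenly
    among its elements (assumes no mass on the empty set)."""
    betp = {}
    for A, mass in m.items():
        share = mass / len(A)
        for x in A:
            betp[x] = betp.get(x, 0.0) + share
    return betp

m = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
p = pignistic(m)
print({x: round(v, 4) for x, v in sorted(p.items())})
# element a receives 0.5 + 0.3/2 + 0.2/3, roughly 0.7167
```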
Evaluation Theory, Models, and Applications
Stufflebeam, Daniel L.; Shinkfield, Anthony J.
2007-01-01
"Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…
Aligning Grammatical Theories and Language Processing Models
Lewis, Shevaun; Phillips, Colin
2015-01-01
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…
Dekorvin, Andre
1992-01-01
The Dempster-Shafer theory of evidence is applied to a multiattribute decision making problem whereby the decision maker (DM) must compromise with available alternatives, none of which exactly satisfies his ideal. The decision mechanism is constrained by the uncertainty inherent in the determination of the relative importance of each attribute element and the classification of existing alternatives. The classification of alternatives is addressed through expert evaluation of the degree to which each element is contained in each available alternative. The relative importance of each attribute element is determined through pairwise comparisons of the elements by the decision maker and implementation of a ratio scale quantification method. Then the 'belief' and 'plausibility' that an alternative will satisfy the decision maker's ideal are calculated and combined to rank order the available alternatives. Application to the problem of selecting computer software is given.
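The belief and plausibility values mentioned above are simple sums over a mass function. The sketch below uses hypothetical masses; taking the midpoint of the [Bel, Pl] interval as a ranking score is likewise an illustrative choice, not necessarily the combination used in the paper:

```python
def belief(m, A):
    """Bel(A): total mass committed to subsets of A (certain support)."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(m, A):
    """Pl(A): total mass of focal sets intersecting A (possible support)."""
    return sum(v for B, v in m.items() if B & A)

# Illustrative mass function over alternatives {a, b, c}:
m = {frozenset('a'): 0.4, frozenset('ab'): 0.3, frozenset('bc'): 0.3}
for x in 'abc':
    A = frozenset(x)
    bel, pl = belief(m, A), plausibility(m, A)
    print(x, bel, pl, (bel + pl) / 2)  # midpoint: one possible ranking score
```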
New weighting factors assignment of evidence theory based on evidence distance
Institute of Scientific and Technical Information of China (English)
Chen Liangzhou; Shi Wenkang; Du Feng
2005-01-01
Evidence theory has been widely used in information fusion for its effectiveness in uncertainty reasoning. However, the classical D-S evidence theory exhibits counter-intuitive behavior when highly conflicting information exists. Based on an analysis of existing modification methods, this paper proposes assigning weighting factors according to the intrinsic characteristics of the evidence sources, determined from evidence distance theory. Numerical examples show that the proposed method provides reasonable results with good convergence efficiency. In addition, the new rule reduces to Yager's formula when all the evidence sources completely contradict each other.
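A common choice for the evidence distance underlying such weighting schemes is Jousselme's distance, which compares two basic probability assignments through the Jaccard similarity of their focal sets. The sketch below uses hypothetical mass functions for illustration; it is not the paper's data:

```python
from itertools import combinations
from math import sqrt

def powerset(frame):
    """All non-empty subsets of the frame, as frozensets."""
    s = list(frame)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def jousselme_distance(m1, m2, frame):
    """Jousselme distance between two mass functions on `frame`:
    d(m1, m2) = sqrt(0.5 * (m1 - m2)^T D (m1 - m2)),
    where D[A][B] = |A & B| / |A | B| over non-empty subsets A, B.
    """
    subsets = powerset(frame)
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in subsets]
    total = 0.0
    for i, A in enumerate(subsets):
        for j, B in enumerate(subsets):
            total += diff[i] * (len(A & B) / len(A | B)) * diff[j]
    return sqrt(0.5 * total)

frame = {'a', 'b', 'c'}
m1 = {frozenset('a'): 0.8, frozenset('ab'): 0.2}
m2 = {frozenset('b'): 0.7, frozenset(frame): 0.3}
print(round(jousselme_distance(m1, m2, frame), 4))
```

Weights can then be taken inversely related to each source's average distance from the others, so that outlying (conflicting) sources are discounted.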
Biological evidence against the panspermia theory.
Di Giulio, Massimo
2010-10-21
The following idea is analysed. Given that evolution on Earth seems to have passed through protocellular evolutionary stages of progenotes, this would appear to be incompatible with the panspermia theory because this observation would imply that the infection bringing life to the Earth started in these protocells, for which a low or null infective power is generally expected.
An approximation approach for uncertainty quantification using evidence theory
Energy Technology Data Exchange (ETDEWEB)
Bae, Ha-Rok; Grandhi, Ramana V.; Canfield, Robert A
2004-12-01
Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently, evidence theory, also called Dempster-Shafer theory, has been proposed as an alternative to classical probability theory for handling situations with limited and imprecise data. Adapting this theory to large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.
Children Balance Theories and Evidence in Exploration, Explanation, and Learning
Bonawitz, Elizabeth Baraff; van Schijndel, Tessa J. P.; Friel, Daniel; Schulz, Laura
2012-01-01
We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who…
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Universal moral grammar: theory, evidence and the future.
Mikhail, John
2007-04-01
Scientists from various disciplines have begun to focus attention on the psychology and biology of human morality. One research program that has recently gained attention is universal moral grammar (UMG). UMG seeks to describe the nature and origin of moral knowledge by using concepts and models similar to those used in Chomsky's program in linguistics. This approach is thought to provide a fruitful perspective from which to investigate moral competence from computational, ontogenetic, behavioral, physiological and phylogenetic perspectives. In this article, I outline a framework for UMG and describe some of the evidence that supports it. I also propose a novel computational analysis of moral intuitions and argue that future research on this topic should draw more directly on legal theory.
Inequality, redistribution and growth : Theory and evidence
Haile, D.
2005-01-01
From a macro-perspective, the thesis provides a political economic model that analyses the joint determination of inequality, corruption, taxation, education and economic growth in a dynamic environment. It demonstrates how redistributive taxation is affected by the distribution of wealth and limite
The Monitor Model: More Evidence.
Ciske, Mary Desjarlais
1983-01-01
The use of prepositions, past tense verb forms, and subject-verb agreement in the English of a Korean college student of English for foreign students is analyzed in the context of Krashen's Monitor Model theory of language acquisition. The student was interviewed, asked to give a free writing sample, asked to read his own writing, and asked to…
Evidence Combination From an Evolutionary Game Theory Perspective.
Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu
2016-09-01
Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. The theory provides Dempster's rule of combination to synthesize multiple pieces of evidence from various information sources. In some cases, however, that combination rule yields counter-intuitive results. Numerous new or improved methods have been proposed to suppress these counter-intuitive results from perspectives such as minimizing information loss or deviation. Inspired by evolutionary game theory, this paper takes a biological and evolutionary perspective on the combination of evidence. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidence, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that, compared with many existing methods, the proposed ECR effectively suppresses the counter-intuitive behaviors that appear in typical paradoxes of evidence theory. Properties of the ECR, such as the stability and convergence of its solutions, are mathematically proved as well.
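The replicator dynamics at the heart of such an approach can be illustrated generically. The following is a plain discrete replicator iteration on a hypothetical similarity payoff matrix, not the authors' exact Jaccard matrix game:

```python
def replicator(payoff, x, steps=200):
    """Discrete replicator dynamics: each strategy's population share
    grows in proportion to its fitness relative to the mean fitness.

    payoff[i][j] is the payoff to strategy i against j; x is the
    initial share vector (must sum to 1).
    """
    n = len(x)
    for _ in range(steps):
        fitness = [sum(payoff[i][j] * x[j] for j in range(n)) for i in range(n)]
        mean = sum(x[i] * fitness[i] for i in range(n))
        x = [x[i] * fitness[i] / mean for i in range(n)]
    return x

# Hypothetical similarity payoffs among three propositions: the first
# two mutually support each other, the third conflicts with both.
payoff = [[1.0, 0.8, 0.1],
          [0.8, 1.0, 0.1],
          [0.1, 0.1, 1.0]]
shares = replicator(payoff, [1 / 3, 1 / 3, 1 / 3])
# the mutually supporting propositions split the population;
# the conflicting one dies out.
print([round(s, 3) for s in shares])
```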
Blue Ocean versus Competitive Strategy: Theory and Evidence
Burke, Andrew; Stel, André; Thurik, Roy
2009-01-01
Blue ocean strategy seeks to turn strategic management on its head by replacing ‘competitive advantage’ with ‘value innovation’ as the primary goal where firms must create consumer demand and exploit untapped markets. Empirical analysis has been focused on case study evidence and so lacks generality to resolve the debate. We provide a methodological synthesis of the theories enabling us to bring statistical evidence to the debate. Our analysis finds that blue ocean and competitive...
Constructing a New Theory from Old Ideas and New Evidence
Rhodes, Marjorie; Wellman, Henry
2013-01-01
A central tenet of constructivist models of conceptual development is that children's initial conceptual level constrains how they make sense of new evidence and thus whether exposure to evidence will prompt conceptual change. Yet little experimental evidence directly examines this claim for the case of sustained, fundamental conceptual…
Stochastic Climate Theory and Modelling
Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio
2014-01-01
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...
Model companions of theories with an automorphism
Kikyo, Hirotaka
1998-01-01
For a theory $T$ in $L$, $T_\sigma$ is the theory of the models of $T$ with an automorphism $\sigma$. If $T$ is an unstable model complete theory without the independence property, then $T_\sigma$ has no model companion. If $T$ is an unstable model complete theory and $T_\sigma$ has the amalgamation property, then $T_\sigma$ has no model companion. If $T$ is model complete and has the fcp, then $T_\sigma$ has no model completion.
TIM Series: Theory, Evidence and the Pragmatic Manager
Directory of Open Access Journals (Sweden)
Steven Muegge
2008-08-01
Full Text Available On July 2, 2008, Steven Muegge from Carleton University delivered a presentation entitled "Theory, Evidence and the Pragmatic Manager". This section provides the key messages from the lecture. The scope of this lecture spanned several topics, including management decision making, forecasting and its limitations, the psychology of expertise, and the management of innovation.
Short-term wind speed forecasting model based on D-S evidence theory
Institute of Scientific and Technical Information of China (English)
刘亚南; 卫志农; 朱艳; 孙国强; 孙永辉; 杨友情; 钱瑛; 周军
2013-01-01
A combined short-term wind speed forecasting model based on D-S evidence theory is proposed. Time-series, BP neural network and support vector machine models are used to forecast the wind speed separately. Based on an analysis of their forecast errors, D-S evidence theory is applied to fuse the three models: wind speed data from the days preceding the target day are taken as fusion samples to calculate the corresponding basic trust assignment functions, which are then fused, and the fusion results serve as the weights of the combined model to produce the wind speed forecast for the target day. Simulation results show that the proposed combined model has smaller forecasting error and better performance.
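The general idea of weighting component forecasts by their recent accuracy can be sketched in a few lines. The weights below are derived from hypothetical inverse mean absolute errors, a simplification standing in for the paper's fusion of basic trust assignment functions:

```python
def inverse_error_weights(errors):
    """Weights inversely proportional to each model's historical
    mean absolute error, normalized to sum to 1."""
    inv = [1.0 / e for e in errors]
    s = sum(inv)
    return [w / s for w in inv]

def fuse(forecasts, weights):
    """Weighted combination of the component point forecasts."""
    return sum(f * w for f, w in zip(forecasts, weights))

# Hypothetical mean absolute errors (m/s) of the three component models
# (time series, BP neural network, SVM) over the preceding days:
mae = [0.8, 1.2, 1.0]
weights = inverse_error_weights(mae)
print([round(w, 3) for w in weights])
print(round(fuse([6.1, 5.4, 5.8], weights), 3))  # combined forecast (m/s)
```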
What (or who) causes health inequalities: theories, evidence and implications?
McCartney, Gerry; Collins, Chik; Mackenzie, Mhairi
2013-12-01
Health inequalities are the unjust differences in health between groups of people occupying different positions in society. Since the Black Report of 1980 there has been considerable effort to understand what causes them, so as to be able to identify actions to reduce them. This paper revisits and updates the proposed theories, evaluates the evidence in light of subsequent epidemiological research, and underlines the political and policy ramifications. The Black Report suggested four theories (artefact, selection, behavioural/cultural and structural) as to the root causes of health inequalities and suggested that structural theory provided the best explanation. These theories have since been elaborated to include intelligence and meritocracy as part of selection theory. However, the epidemiological evidence relating to the proposed causal pathways does not support these newer elaborations. They may provide partial explanations or insights into the mechanisms between cause and effect, but structural theory remains the best explanation as to the fundamental causes of health inequalities. The paper draws out the vitally important political and policy implications of this assessment. Health inequalities cannot be expected to reduce substantially as a result of policy aimed at changing health behaviours, particularly in the face of wider public policy that militates against reducing underlying social inequalities. Furthermore, political rhetoric about the need for 'cultural change', without the required changes in the distribution of power, income, wealth, or in the regulatory frameworks in society, is likely to divert from necessary action.
Dark energy observational evidence and theoretical models
Novosyadlyj, B; Shtanov, Yu; Zhuk, A
2013-01-01
The book elucidates the current state of the dark energy problem and presents the results of the authors, who work in this area. It describes the observational evidence for the existence of dark energy, the methods and results of constraining its parameters, the modeling of dark energy by scalar fields, space-times with extra spatial dimensions (especially Kaluza-Klein models), and braneworld models with a single extra dimension, as well as the problems of the positive definiteness of gravitational energy in General Relativity, energy conditions and the consequences of their violation in the presence of dark energy. This monograph is intended for science professionals, educators and graduate students specializing in general relativity, cosmology, field theory and particle physics.
Models in theory building: the case of early string theory
Energy Technology Data Exchange (ETDEWEB)
Castellani, Elena [Department of Philosophy, Florence (Italy)
2013-07-01
The history of the origins and first steps of string theory, from Veneziano's formulation of his famous scattering amplitude in 1968 to the 'first string revolution' in 1984, provides rich material for discussing traditional issues in the philosophy of science. This paper focuses on the initial phase of this history, that is, the making of early string theory out of the 'dual theory of strong interactions', motivated by the aim of finding a viable theory of hadrons in the framework of the so-called S-matrix theory of the Sixties: from the first two models proposed (the Dual Resonance Model and the Shapiro-Virasoro Model) to all the subsequent endeavours to extend and complete the theory, including its string interpretation. As this paper aims to show, by representing an exemplary illustration of the building of a scientific theory out of tentative and partial models, this is a particularly fruitful case study for the current philosophical discussion on how to characterize a scientific model, a scientific theory, and the relation between models and theories.
Voter Turnout in Direct Democracy: Theory and Evidence
Søberg, Morten; Tangerås, Thomas P.
2003-01-01
We analyse voter turnout as a function of referendum type. An advisory referendum produces advice that a legislature may or may not take into account when choosing between two alternatives, whereas a binding referendum yields a final decision. In theory, voter turnout should be higher under binding than advisory referendums, higher in small than in large electorates, and higher in close than in less close referendums. These predictions are corroborated by evidence from 230 local referendums i...
Models in cooperative game theory
Branzei, Rodica; Tijs, Stef
2008-01-01
This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate at infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set-valued and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is substantially enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
Short-run Exchange-Rate Dynamics: Theory and Evidence
DEFF Research Database (Denmark)
Carlson, John A.; Dahl, Christian Møller; Osler, Carol L.
Recent research has revealed a wealth of information about the microeconomics of currency markets and thus the determination of exchange rates at short horizons. This information is valuable to us as scientists since, like evidence of macroeconomic regularities, it can provide critical guidance for designing exchange-rate models. This paper presents an optimizing model of short-run exchange-rate dynamics consistent with both the micro evidence and the macro evidence, the first such model of which we are aware. With respect to microeconomics, the model is consistent with the institutional structure of currency markets, it accurately reflects the constraints and objectives faced by the major participants, and it fits key stylized facts concerning returns and order flow. With respect to macroeconomics, the model is consistent with most of the major puzzles that have emerged under floating rates.
Theory of Self- vs. Externally-Regulated Learning™: Fundamentals, Evidence, and Applicability
Directory of Open Access Journals (Sweden)
Jesús de la Fuente-Arias
2017-09-01
Full Text Available The Theory of Self- vs. Externally-Regulated Learning™ integrates the variables of SRL theory, the DEDEPRO model, and the 3P model. This new theory proposes: (a) in general, the importance of the cyclical model of individual self-regulation (SR) and of external regulation stemming from the context (ER), as two different and complementary variables, both in combination and in interaction; (b) specifically, in the teaching-learning context, the relevance of different combinations of levels of self-regulation (SR) and external regulation (ER) in predicting self-regulated learning (SRL) and cognitive-emotional achievement. This review analyzes the assumptions, conceptual elements, empirical evidence, benefits and limitations of SRL vs. ERL Theory. Finally, professional fields of application and future lines of research are suggested.
Stochastic models: theory and simulation.
Energy Technology Data Exchange (ETDEWEB)
Field, Richard V., Jr.
2008-03-01
Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
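As a minimal instance of generating samples of a stochastic process using only the standard library (parameter values are arbitrary, chosen for illustration), an Euler-Maruyama discretization of an Ornstein-Uhlenbeck process:

```python
import random

def ou_path(theta, mu, sigma, x0, dt, n, seed=0):
    """Sample path of an Ornstein-Uhlenbeck process via Euler-Maruyama:
    dX = theta * (mu - X) dt + sigma dW, started at x0, n steps of size dt.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n):
        x += theta * (mu - x) * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
        path.append(x)
    return path

# Mean-reverting path: starts at 2.0 and relaxes toward mu = 0.0,
# fluctuating with stationary std sigma / sqrt(2 * theta).
path = ou_path(theta=1.5, mu=0.0, sigma=0.3, x0=2.0, dt=0.01, n=1000)
print(round(path[-1], 3))
```

Samples produced this way can serve as random inputs or boundary conditions to a downstream deterministic simulation code, as the report describes.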
Qigong in Cancer Care: Theory, Evidence-Base, and Practice
Directory of Open Access Journals (Sweden)
Penelope Klein
2017-01-01
Full Text Available Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is Qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to widespread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle mind-body exercise integral to Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL), fatigue, immune function, cortisol levels, and cognition for individuals with cancer. There is indirect scientific evidence suggesting that Qigong practice may positively influence cancer prevention and survival. No single Qigong exercise regimen has been established as superior, but effective protocols share common elements: slow mindful exercise, ease of learning, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Widespread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and workforce capacity.
Institute of Scientific and Technical Information of China (English)
石彪; 刘利枚; 周鲜成
2009-01-01
Based on an analysis of the measurement characteristics and information uncertainty of ultrasonic sensors, a novel ultrasonic sensor model is introduced. In this model, the reliability of the grid information within the sensor's sector-shaped measurement region is calculated using D-S evidence theory, so that obstacle positions within the sensor's range can be described accurately. Simulation results demonstrate that the method is effective and practical.
Quiver gauge theories and integrable lattice models
Yagi, Junya
2015-01-01
We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d $\mathcal{N} = 1$ theories known as brane box and brane tiling models, 3d $\mathcal{N} = 2$ and 2d $\mathcal{N} = (2,2)$ theories obtained from them by compactification, and 2d $\mathcal{N} = (0,2)$ theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators and, as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
Enterprise Modelling supported by Manufacturing Systems Theory
MYKLEBUST, Odd
2002-01-01
There exist today a large number of enterprise models and enterprise modelling approaches. A study of standards and project-developed models identifies two approaches, CIMOSA ('The Open Systems Architecture for CIM') and GERAM ('Generalised Enterprise Reference Architecture'), which show a system orientation that can be followed further as an interesting research topic for a system-theory-oriented approach to enterprise models. In the selection of system theories, manufacturing system theory...
Evidence accumulation as a model for lexical selection.
Anders, R; Riès, S; van Maanen, L; Alario, F X
2015-11-01
We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process of selecting a lexical target from a number of alternatives, each with varying activations (or signal supports) largely resulting from initial stimulus recognition. We present a thorough case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related to, or combined with, conventional psycholinguistic theory and its simulatory instantiations (generally, neural network models). With a demonstrative application on a large new real data set, we then establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory and that motivate future theoretical development.
On Dimer Models and Closed String Theories
Sarkar, Tapobrata
2007-01-01
We study some aspects of the recently discovered connection between dimer models and D-brane gauge theories. We argue that dimer models are also naturally related to closed string theories on non-compact orbifolds of $\mathbb{C}^2$ and $\mathbb{C}^3$, via their twisted sector R-charges, and show that perfect matchings in dimer models correspond to twisted sector states in the closed string theory. We also use this formalism to study the combinatorics of some unstable orbifolds of $\mathbb{C}^2$.
New Pathways between Group Theory and Model Theory
Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz
2017-01-01
This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
Domain Theory, Its Models and Concepts
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt
2014-01-01
Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contributions...... and industrial applications especially for the DFX areas (not reported here) and for product modelling. The theory therefore contains a rich ontology of interrelated concepts. The Domain Theory is not aiming to create normative methods but the creation of a collection of concepts related to design phenomena......, which can support design work and to form elements of designers’ mindsets and thereby their practice. The theory is a model-based theory, which means it is composed of concepts and models, which explains certain design phenomena. Many similar theories are described in the literature with differences...
Quantum field theory competitive models
Tolksdorf, Jürgen; Zeidler, Eberhard
2009-01-01
For more than 70 years, quantum field theory (QFT) has been a driving force in the development of theoretical physics. Equally fascinating is the fruitful impact which QFT has had in rather remote areas of mathematics. The present book features some of the different approaches, physical viewpoints and techniques used to make the notion of quantum field theory more precise. For example, the present book contains a discussion including general considerations, stochastic methods, deformation theory and the holographic AdS/CFT correspondence. It also contains a discussion of more recent developments like the use of category theory and topos theoretic methods to describe QFT. The present volume emerged from the 3rd 'Blaubeuren Workshop: Recent Developments in Quantum Field Theory', held in July 2007 at the Max Planck Institute of Mathematics in the Sciences in Leipzig/Germany. All of the contributions are committed to the idea of this workshop series: 'To bring together outstanding experts working in...
Global Sourcing of Heterogeneous Firms: Theory and Evidence
DEFF Research Database (Denmark)
Kohler, Wilhelm; Smolka, Marcel
), as well as the location of intermediate input production (offshore vs. domestic). Unlike previous work, we allow for a fully flexible productivity effect with varying magnitude and sign across different industries. Our estimation strategy is motivated by the canonical economic model of sourcing due...... to Antràs & Helpman (2004). This model invokes the property rights theory of the firm in order to pin down firm boundaries as the outcome of an interaction between firm heterogeneity and the industry's sourcing intensity (i.e. the importance of inputs sourced from suppliers relative to headquarter inputs...
Evidence accumulation as a model for lexical selection
Anders, R.; Riès, S.; van Maanen, L.; Alario, F.-X.
2015-01-01
We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the w
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state...... such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet......, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Indian Academy of Sciences (India)
Hesheng Tang; Yu Su; Jiao Wang
2015-08-01
The paper describes a procedure for the uncertainty quantification (UQ) using evidence theory in buckling analysis of semi-rigid jointed frame structures under mixed epistemic–aleatory uncertainty. The design uncertainties (geometrical, material, strength, and manufacturing) are often prevalent in engineering applications. Due to lack of knowledge or incomplete, inaccurate, unclear information in the modeling, simulation, measurement, and design, there are limitations in using only one framework (probability theory) to quantify uncertainty in a system because of the impreciseness of data or knowledge. Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. Unfortunately, propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than propagation of a probabilistic representation for uncertainty. In order to alleviate the computational difficulties in the evidence theory based UQ analysis, a differential evolution-based computational strategy for propagation of epistemic uncertainty in a system with evidence theory is presented here. A UQ analysis for the buckling load of steel-plane frames with semi-rigid connections is given herein to demonstrate accuracy and efficiency of the proposed method.
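To make the evidence-theory quantities used in such a UQ analysis concrete: given a basic belief assignment (BBA) m over a finite frame, the belief and plausibility of an event A bound its probability, Bel(A) ≤ P(A) ≤ Pl(A). The sketch below is illustrative only, not code from the paper; the three-cell discretization of a stiffness parameter and the mass values are hypothetical.

```python
def belief(m, A):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(m, A):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

# Hypothetical BBA over a discretized stiffness parameter (three cells).
# The imprecise focal element {"low", "mid"} reflects evidence that cannot
# distinguish between the two cells.
m = {
    frozenset({"low"}): 0.2,
    frozenset({"low", "mid"}): 0.5,
    frozenset({"low", "mid", "high"}): 0.3,
}
A = frozenset({"low", "mid"})
print(belief(m, A), plausibility(m, A))  # Bel(A) <= P(A) <= Pl(A)
```

Bel(A) sums the masses of focal elements contained in A, while Pl(A) sums those that merely intersect it, so the gap Pl − Bel measures the epistemic imprecision of the available evidence.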
Sentiment Prediction Based on Dempster-Shafer Theory of Evidence
Directory of Open Access Journals (Sweden)
Mohammad Ehsan Basiri
2014-01-01
Full Text Available Sentiment prediction techniques are often used to assign numerical scores to free-text format reviews written by people in online review websites. In order to exploit the fine-grained structural information of textual content, a review may be considered as a collection of sentences, each with its own sentiment orientation and score. In this manner, a score aggregation method is needed to combine sentence-level scores into an overall review rating. While recent work has concentrated on designing effective sentence-level prediction methods, there remains the problem of finding efficient algorithms for score aggregation. In this study, we investigate different aggregation methods, as well as the cases in which they perform poorly. According to the analysis of existing methods, we propose a new score aggregation method based on the Dempster-Shafer theory of evidence. In the proposed method, we first detect the polarity of reviews using a machine learning approach and then, consider sentence scores as evidence for the overall review rating. The results from two public social web datasets show the higher performance of our method in comparison with existing score aggregation methods and state-of-the-art machine learning approaches.
Mean field theory, topological field theory, and multi-matrix models
Energy Technology Data Exchange (ETDEWEB)
Dijkgraaf, R. (Princeton Univ., NJ (USA). Joseph Henry Labs.); Witten, E. (Institute for Advanced Study, Princeton, NJ (USA). School of Natural Sciences)
1990-10-08
We show that the genus zero correlation functions of an arbitrary topological field theory coupled to two-dimensional topological gravity are determined by an appropriate Landau-Ginzburg potential. We determine the potentials that arise for topological sigma models with CP^1 or a Calabi-Yau manifold for target space. We present substantial evidence that the multi-matrix models that have been studied recently are equivalent to certain topological field theories coupled to topological gravity. We also describe a topological version of the general 'string equation'. (orig.).
Mean field theory, topological field theory, and multi-matrix models
Dijkgraaf, Robbert; Witten, Edward
1990-10-01
We show that the genus zero correlation functions of an arbitrary topological field theory coupled to two-dimensional topological gravity are determined by an appropriate Landau-Ginzburg potential. We determine the potentials that arise for topological sigma models with CP^1 or a Calabi-Yau manifold for target space. We present substantial evidence that the multi-matrix models that have been studied recently are equivalent to certain topological field theories coupled to topological gravity. We also describe a topological version of the general "string equation".
Connective tissue manipulation: a review of theory and clinical evidence.
Holey, Liz A; Dixon, John
2014-01-01
Connective tissue manipulation or connective tissue massage (bindegewebsmassage) is a manual reflex therapy in that it is applied with the therapist's hands which are in contact with the patient's skin. The assessment of the patient and the clinical decision-making that directs treatment is based on a theoretical model that assumes a reflex effect on the autonomic nervous system which is induced by manipulating the fascial layers within and beneath the skin to stimulate cutaneo-visceral reflexes. This paper reviews the literature and current research findings to establish the theoretical framework for CTM and the evidence for its clinical effects. The rationale for the principles of treatment are discussed and the evidence for the clinical effectiveness assessed through an analytical review of the clinical research.
The examination of signaling theory versus pecking order theory: Evidence from Tehran Stock Exchange
Directory of Open Access Journals (Sweden)
Elahe Mahdavi Sabet
2013-01-01
Full Text Available This study investigates the explanatory power of leverage and cash flows in predicting future cash flows on the Tehran Stock Exchange, in light of signaling theory and pecking order theory. Based on theoretical foundations, regression models of leverage and cash flow with a set of control variables were developed. The statistical sample consists of companies listed on the Tehran Stock Exchange over the period 2005-2011. The results show a negative contemporaneous relationship between cash flow and leverage levels, which is consistent with pecking order behavior. At the intertemporal level, by contrast, there is a positive relationship between current leverage and the firm's future cash flows, which is consistent with signaling theory.
Application of arrangement theory to unfolding models
Kamiya, Hidehiko; Tokushige, Norihide
2010-01-01
Arrangement theory plays an essential role in the study of the unfolding model used in many fields. This paper describes how arrangement theory can be usefully employed in solving the problems of counting (i) the number of admissible rankings in an unfolding model and (ii) the number of ranking patterns generated by unfolding models. The paper is mostly expository but also contains some new results such as simple upper and lower bounds for the number of ranking patterns in the unidimensional case.
Institute of Scientific and Technical Information of China (English)
杨娟; 邱江; 张庆林
2006-01-01
OBJECTIVE: To introduce the mental model theory and the probabilistic theory of conditional reasoning, together with the debate between them and attempts at their integration. DATA SOURCES: A computer search of the Academic Source Premier database on EBSCOhost was conducted for articles published between 2000 and 2005, in English, using the search terms "mental model, probabilistic theory". STUDY SELECTION: After a preliminary screening, articles giving a comprehensive account of the mental model theory and the probabilistic theory were selected and their full texts retrieved, especially those addressing the debate between and integration of the two theories. DATA EXTRACTION: Of the 75 articles collected, 5 met the above criteria; searching the references of those articles yielded a final total of 11 relevant articles. DATA SYNTHESIS: Of the 11 articles, 2 described the mental model theory and the probabilistic theory of conditional reasoning in detail, 2 discussed the debate and divergence between them, and 1 addressed their integration with respect to working memory. CONCLUSION: Although supporters of the two theories hold different views of the mental mechanism of conditional reasoning, recent research combining working memory with conditional reasoning has found that the reasoning of participants with high working-memory capacity better fits the predictions of the model theory, whereas that of participants with low working-memory capacity better fits the predictions of the probabilistic theory. In other words, the model theory and the probabilistic theory appear to be reconcilable.
Network Attack Classification and Recognition Using HMM and Improved Evidence Theory
Directory of Open Access Journals (Sweden)
Gang Luo
2016-04-01
Full Text Available In this paper, a fusion classification decision model based on HMM-DS is proposed, and the training and recognition methods of the model are given. A pure HMM classifier cannot achieve an ideal balance between each model's strong ability to identify its own target and the maximum difference between models. Therefore, in this paper, the results of the HMMs are integrated into the DS framework, with the HMMs providing state probabilities for DS: the output of each hidden Markov model is used as a body of evidence. An improved evidence theory method is proposed to fuse the results and overcome the drawbacks of the pure HMM, improving the classification accuracy of the system. We compare our approach with the traditional evidence theory method, other representative improved DS methods, the pure HMM method and common classification methods. The experimental results show that our proposed method has a significant practical effect, classifying network attacks with high accuracy.
Scientific Theories, Models and the Semantic Approach
Directory of Open Access Journals (Sweden)
Décio Krause
2007-12-01
Full Text Available According to the semantic view, a theory is characterized by a class of models. In this paper, we examine critically some of the assumptions that underlie this approach. First, we recall that models are models of something. Thus we cannot leave completely aside the axiomatization of the theories under consideration, nor can we ignore the metamathematics used to elaborate these models, for changes in the metamathematics often impose restrictions on the resulting models. Second, based on a parallel between van Fraassen's modal interpretation of quantum mechanics and Skolem's relativism regarding set-theoretic concepts, we introduce a distinction between relative and absolute concepts in the context of the models of a scientific theory. And we discuss the significance of that distinction. Finally, by focusing on contemporary particle physics, we raise the question: since there is no generally accepted unification of the parts of the standard model (namely, QED and QCD), we have no theory, in the usual sense of the term. This poses a difficulty: if there is no theory, how can we speak of its models? What are the latter models of? We conclude by noting that it is unclear that the semantic view can be applied to contemporary physical theories.
Institute of Scientific and Technical Information of China (English)
毛松平; 邹祖绪
2014-01-01
In multiple-objective decision problems, indicator weights are difficult to measure accurately, and traditional weighting methods fail to achieve an effective unification of subjective and objective considerations. To address this, the combination rule of evidence theory is introduced, and a coupled model is constructed to fuse the weights produced by the traditional subjective and objective weighting models. The resulting weights are not only objective but also reflect the subjective preferences of decision makers. Compared with a traditional linear combination, the weights obtained by the combination rule better preserve the objectivity of quantitative indicators and the subjectivity of qualitative indicators. Finally, a worked example demonstrates that the coupled model can be applied effectively to multiple-objective decision problems.
Applying learning theories and instructional design models for effective instruction.
Khalil, Mohammed K; Elkhider, Ihsan A
2016-06-01
Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. Copyright © 2016 The American Physiological Society.
An imprecise probability approach for squeal instability analysis based on evidence theory
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-01-01
An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters are usually involved with imprecise data such as incomplete information and conflict information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to the squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
Epistemic uncertainty quantification in flutter analysis using evidence theory
Institute of Scientific and Technical Information of China (English)
Tang Jian; Wu Zhigang; Yang Chao
2015-01-01
Aimed at evaluating the structural stability and flutter risk of the system, this paper quantifies epistemic uncertainty in flutter analysis using evidence theory, including both parametric uncertainty and method-selection uncertainty, on the basis of information from limited experimental data of uncertain parameters. Two uncertain variables of the actuator coupling system with unknown probability distributions, the bending and torsional stiffness, are taken into account; both are described with multiple intervals and the basic belief assignment (BBA) extracted from the modal test of actuator coupling systems. Considering the differences in how experimental data are handled by different analysts and the reliability of various information sources, a new combination rule of evidence, the generalized lower triangular matrices method, is formed to acquire the combined BBA. Finally, the parametric uncertainty and the epistemic uncertainty of flutter-analysis method selection are considered in the same system to realize quantification. A typical missile rudder is selected to examine the present method, and the dangerous range of velocity as well as the relevant belief and plausibility functions is obtained. The results suggest that the present method is effective in obtaining the lower and upper bounds of flutter probability and assessing the flutter risk of structures with limited experimental data of uncertain parameters and the beliefs of different methods.
Constraint theory multidimensional mathematical model management
Friedman, George J
2017-01-01
Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...
Modeling Forensic Evidence Systems Using Design Science
Armstrong, Colin; Armstrong, Helen
2010-01-01
This paper presents an overview of the application of design science research to the tactical management of forensic evidence processing. The opening discussion addresses the application of design science techniques to specific socio-technical information systems research in regard to processing forensic evidence. The discussion then presents the current problems faced by those dealing with evidence and a conceptual meta-model for a unified approach to forensic evidenc...
The Nomad Model: Theory, Developments and Applications
Campanella, M.; Hoogendoorn, S.P.; Daamen, W.
2014-01-01
This paper presents details of the developments of the Nomad model after being introduced more than 12 years ago. The model is derived from a normative theory of pedestrian behavior making it unique under microscopic models. Nomad has been successfully applied in several cases indicating that it ful
Integrable Models, SUSY Gauge Theories, and String Theory
Nam, S
1996-01-01
We consider the close relation between duality in N=2 SUSY gauge theories and integrable models. Various integrable models ranging from Toda lattices, Calogero models, spinning tops, and spin chains are related to the quantum moduli space of vacua of N=2 SUSY gauge theories. In particular, SU(3) gauge theories with two flavors of massless quarks in the fundamental representation can be related to the spectral curve of the Goryachev-Chaplygin top, which is a Nahm's equation in disguise. This can be generalized to the cases with massive quarks, and N_f = 0,1,2, where a system with seven dimensional phase space has the relevant hyperelliptic curve appear in the Painlevé test. To understand the stringy origin of the integrability of these theories we obtain exact nonperturbative point particle limit of type II string compactified on a Calabi-Yau manifold, which gives the hyperelliptic curve of SU(2) QCD with N_f = 1 hypermultiplet.
The Free-Rider Paradox: Theory, Evidence, and Teaching.
Asch, Peter; Gigliotti, Gary A.
1991-01-01
Discusses the conventional theory of free riding as discussed in economic textbooks. Argues the theory is empirically invalid, and reviews various scholarly viewpoints on this issue. Suggests alternatives to teaching current economic theory and argues that the concept of self-interest neglects the ethical issues in behavior. (NL)
A course on basic model theory
Sarbadhikari, Haimanti
2017-01-01
This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute the first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.
Lattice Gauge Theories and Spin Models
Mathur, Manu
2016-01-01
The Wegner $Z_2$ gauge theory-$Z_2$ Ising spin model duality in $(2+1)$ dimensions is revisited and derived through a series of canonical transformations. These $Z_2$ results are directly generalized to SU(N) lattice gauge theory in $(2+1)$ dimensions to obtain a dual SU(N) spin model in terms of the SU(N) magnetic fields and electric scalar potentials. The gauge-spin duality naturally leads to a new gauge invariant disorder operator for SU(N) lattice gauge theory. A variational ground state of the dual SU(2) spin model with only nearest neighbour interactions is constructed to analyze SU(2) lattice gauge theory.
Gauge theories and integrable lattice models
Witten, Edward
1989-08-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question — previously considered in both the knot theory and statistical mechanics — are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be presented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory.
Improvement method for the combining rule of Dempster-Shafer evidence theory based on reliability
Institute of Scientific and Technical Information of China (English)
Wang Ping; Yang Genqing
2005-01-01
An improvement to the combining rule of Dempster-Shafer evidence theory is proposed. Unlike the original Dempster theory, the reliability of the pieces of evidence is not assumed to be identical; it varies with the event. By weighting each piece of evidence according to its reliability, the effect of unreliable evidence is reduced, and the fusion result is closer to the truth. An example illustrating the advantage of this method is given, showing that the method helps to reach a correct result.
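A minimal sketch of the idea (not the authors' code): each body of evidence is first discounted by a reliability factor, with the discounted mass moved to total ignorance, and the discounted masses are then fused with Dempster's rule. The two-fault frame, the mass values and the reliability factor below are hypothetical.

```python
from itertools import product

def discount(m, alpha, frame):
    """Shafer discounting: scale masses by reliability alpha and
    move the remaining mass (1 - alpha) to the full frame (ignorance)."""
    out = {A: alpha * v for A, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def combine(m1, m2):
    """Dempster's rule of combination for mass functions over frozensets."""
    fused, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            fused[C] = fused.get(C, 0.0) + v1 * v2
        else:
            conflict += v1 * v2  # mass assigned to disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in fused.items()}

# Hypothetical two-fault frame and sensor reports.
frame = frozenset({"F1", "F2"})
m1 = {frozenset({"F1"}): 0.9, frame: 0.1}   # report from a reliable sensor
m2 = {frozenset({"F2"}): 0.9, frame: 0.1}   # conflicting, unreliable report
m2_d = discount(m2, 0.3, frame)             # weight it down before fusing
fused = combine(m1, m2_d)
print(fused)
```

After discounting, the conflicting but unreliable report barely dents the reliable one; without discounting, the same two reports would be in near-total conflict and the fused result would be indecisive.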
Modeling Techniques: Theory and Practice
Odd A. Asbjørnsen
1985-01-01
A survey is given of some crucial concepts in chemical process modeling. Those are the concepts of physical unit invariance, of reaction invariance and stoichiometry, the chromatographic effect in heterogeneous systems, the conservation and balance principles and the fundamental structures of cause and effect relationships. As an example, it is shown how the concept of reaction invariance may simplify the homogeneous reactor modeling to a large extent by an orthogonal decomposition of the pro...
Theory of chaotic orbital variations confirmed by Cretaceous geological evidence.
Ma, Chao; Meyers, Stephen R; Sageman, Bradley B
2017-02-22
Variations in the Earth's orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.
Theory of chaotic orbital variations confirmed by Cretaceous geological evidence
Ma, Chao; Meyers, Stephen R.; Sageman, Bradley B.
2017-02-01
Variations in the Earth’s orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.
Supplier-induced demand: reconsidering the theories and new Australian evidence.
Richardson, Jeffrey R J; Peacock, Stuart J
2006-01-01
This paper reconsiders the evidence and several of the key arguments associated with the theory of supplier-induced demand (SID). It proposes a new theory to explain how ethical behaviour is consistent with SID. The purpose of a theory of demand, and one criterion for its evaluation, is the provision of a plausible explanation for the observed variability in service use. We argue that Australian data are not easily explained by the orthodox theory of demand. We also argue that, having revisited the theory of SID, the agency relationship between doctors and patients arises not simply from asymmetrical information but from an asymmetrical ability and willingness to exercise judgement in the face of uncertainty. It is also argued that the incomplete demand shift that must occur following an increase in the doctor supply is readily explained by the dynamics of market adjustment when market information is incomplete and there is non-collusive professional (and ethical) behaviour by doctors. Empirical evidence of SID from six Australian data sets is presented and discussed. It is argued that these data are more easily explained by SID than by conventional demand-side variables. We conclude that once the uncertainty of medical decision making and the complexity of medical judgements are taken into account, SID is a more plausible theory of patient and doctor behaviour than the orthodox model of demand and supply. More importantly, SID provides a satisfactory explanation of the observed pattern of, and change in, the demand for Australian medical services, which are not easily explained in the absence of SID.
Modeling Techniques: Theory and Practice
Directory of Open Access Journals (Sweden)
Odd A. Asbjørnsen
1985-07-01
A survey is given of some crucial concepts in chemical process modeling: physical unit invariance, reaction invariance and stoichiometry, the chromatographic effect in heterogeneous systems, the conservation and balance principles, and the fundamental structures of cause-and-effect relationships. As an example, it is shown how the concept of reaction invariance may greatly simplify homogeneous reactor modeling through an orthogonal decomposition of the process variables. This allows residence time distribution parameters to be estimated with the reaction in situ, but without any correlation between the estimated residence time distribution parameters and the estimated reaction kinetic parameters. A general warning is given against choosing the wrong mathematical structure for a model.
Grey-theory based intrusion detection model
Institute of Scientific and Technical Information of China (English)
Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi
2006-01-01
To address the problem that current intrusion detection models require large-scale data for model formulation in real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of requiring less original data, imposing fewer restrictions on the distribution pattern, and using a simpler modeling algorithm. With these merits, GTIDS constructs its model from a partial time sequence to rapidly detect intrusive acts in a secure system. In this detection model the false drop and false retrieval rates are effectively reduced through two rounds of modeling and repeated detection of the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through simulated experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
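The modeling step grey theory typically contributes is the GM(1,1) model: accumulate the observed series, fit a first-order grey equation by least squares, and forecast by inverting the accumulation. A minimal sketch with our own illustrative data (the function name and series are ours, not from GTIDS):

```python
import numpy as np

def gm11_fit_predict(x0, n_forecast=1):
    """Fit a GM(1,1) grey model to a short positive series and forecast
    n_forecast further values. Illustrative sketch, not the GTIDS code."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])          # background values
    # Least squares for the grey equation x0(k) + a*z1(k) = b
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)    # inverse AGO recovers the series

series = [2.0, 2.4, 2.88, 3.456, 4.1472]   # illustrative traffic metric
print(gm11_fit_predict(series, n_forecast=2))
```

For roughly exponential sequences the fit is close even with five observations, which is what makes grey models attractive when large-scale training data are unavailable.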
Graphical Model Theory for Wireless Sensor Networks
Energy Technology Data Exchange (ETDEWEB)
Davis, William B.
2002-12-08
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
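At its core, the validation-and-fusion inference described here multiplies local evidence factors and normalizes; that is the elementary step the junction tree algorithm schedules across a network. A toy sketch under our own illustrative numbers, not the paper's model:

```python
def fuse_sensor_reports(prior, likelihoods):
    """Posterior over a binary hidden state given independent sensor
    reports, computed by multiplying local factors and normalizing.
    Probabilities below are made up for illustration."""
    post = dict(prior)
    for lik in likelihoods:            # each sensor contributes one local factor
        post = {s: post[s] * lik[s] for s in post}
    z = sum(post.values())             # local normalization
    return {s: p / z for s, p in post.items()}

prior = {"ok": 0.9, "faulty": 0.1}
sensor_a = {"ok": 0.2, "faulty": 0.7}   # P(report_a | state), assumed values
sensor_b = {"ok": 0.3, "faulty": 0.8}
print(fuse_sensor_reports(prior, [sensor_a, sensor_b]))
```

In a real network the junction tree organizes exactly such factor products so that each node only exchanges small messages with its neighbors.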
F-theory and linear sigma models
Bershadsky, M; Greene, Brian R; Johansen, A; Lazaroiu, C I
1998-01-01
We present an explicit method for translating between the linear sigma model and the spectral cover description of SU(r) stable bundles over an elliptically fibered Calabi-Yau manifold. We use this to investigate the 4-dimensional duality between (0,2) heterotic and F-theory compactifications. We indirectly find that much interesting heterotic information must be contained in the `spectral bundle' and in its dual description as a gauge theory on multiple F-theory 7-branes. A by-product of these efforts is a method for analyzing semistability and the splitting type of vector bundles over an elliptic curve given as the sheaf cohomology of a monad.
Spreading Models in Banach Space Theory
Argyros, S A; Tyros, K
2010-01-01
We extend the classical Brunel-Sucheston definition of the spreading model by introducing the $\\mathcal{F}$-sequences $(x_s)_{s\\in\\mathcal{F}}$ in a Banach space and the plegma families in $\\mathcal{F}$ where $\\mathcal{F}$ is a regular thin family. The new concept yields a transfinite increasing hierarchy of classes of 1-subsymmetric sequences. We explore the corresponding theory and we present examples establishing this hierarchy and illustrating the limitation of the theory.
Integrable Lattice Models From Gauge Theory
Witten, Edward
2016-01-01
These notes provide an introduction to recent work by Kevin Costello in which integrable lattice models of classical statistical mechanics in two dimensions are understood in terms of quantum gauge theory in four dimensions. This construction will be compared to the more familiar relationship between quantum knot invariants in three dimensions and Chern-Simons gauge theory. (Based on a Whittaker Colloquium at the University of Edinburgh and a lecture at Strings 2016 in Beijing.)
Security Theorems via Model Theory
Directory of Open Access Journals (Sweden)
Joshua Guttman
2009-11-01
A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications, for all xs. (phi implies for some ys. psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces "for all xs. (phi implies for some ys. psi)" iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.
J. Brug (Hans); A. Oenema (Anke); A. Ferreira (Isabel)
2005-01-01
textabstractBACKGROUND: The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. DISCUSSION: Since behavior theory is a reflection of the compiled evidence of behavior research, theory is th
Semantic Modelling of Digital Forensic Evidence
Kahvedžić, Damir; Kechadi, Tahar
The reporting of digital investigation results is traditionally carried out in prose, and a large investigation may require successive communication of findings between different parties. Popular forensic suites aid the reporting process by storing provenance and positional data, but they do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application independent and automatically organised. The encoded descriptions aim to help the investigation in the task of report writing and evidence communication, and can be used in addition to existing evidence management techniques.
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process, governed by a vacation policy - that can be characterized by three aspects: 1) the vacation start-up rule; 2) the vacation termination rule; and 3) the vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes queueing models more realistic and flexible in studying real-world waiting line systems. Integrated into the book's discussion are a variety of typical vacation model applications, including call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
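The effect of a vacation policy can be seen in a few lines of simulation: an M/M/1 queue whose server, on emptying the queue, takes repeated exponential vacations (a "multiple vacations" start-up/termination rule). A hedged sketch with parameter names and values of our own choosing, not from the book:

```python
import random

def mean_wait(lam, mu, vacation_rate=None, n_customers=20000, seed=7):
    """Lindley-style simulation of an M/M/1 queue, optionally with
    multiple exponential vacations: when the server goes idle it leaves,
    and keeps leaving until it returns to find a waiting customer."""
    rng = random.Random(seed)
    t, free_at, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        t += rng.expovariate(lam)                  # next arrival
        if vacation_rate is not None:
            while free_at <= t:                    # server away on vacations
                free_at += rng.expovariate(vacation_rate)
        start = max(t, free_at)
        total_wait += start - t
        free_at = start + rng.expovariate(mu)
    return total_wait / n_customers

base = mean_wait(0.5, 1.0)                         # plain M/M/1: E[Wq] = 1.0
with_vac = mean_wait(0.5, 1.0, vacation_rate=1.0)  # vacations lengthen waits
print(base, with_vac)
```

With these rates the plain queue's mean wait is about 1.0, and exponential vacations add roughly their mean residual time, consistent with the known decomposition results for vacation models.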
Some Remarks on the Model Theory of Epistemic Plausibility Models
Demey, Lorenz
2010-01-01
Classical logics of knowledge and belief are usually interpreted on Kripke models, for which a mathematically well-developed model theory is available. However, such models are inadequate to capture dynamic phenomena. Therefore, epistemic plausibility models have been introduced. Because these are much richer structures than Kripke models, they do not straightforwardly inherit the model-theoretical results of modal logic. Therefore, while epistemic plausibility structures are well-suited for modeling purposes, an extensive investigation of their model theory has been lacking so far. The aim of the present paper is to fill exactly this gap, by initiating a systematic exploration of the model theory of epistemic plausibility models. Like in 'ordinary' modal logic, the focus will be on the notion of bisimulation. We define various notions of bisimulations (parametrized by a language L) and show that L-bisimilarity implies L-equivalence. We prove a Hennessy-Milner type result, and also two undefinability results. ...
Thimble regularization at work besides toy models: from Random Matrix Theory to Gauge Theories
Eruzzi, G
2015-01-01
Thimble regularization as a solution to the sign problem has been successfully put to work for a few toy models. Given the non-trivial nature of the method (also from the algorithmic point of view), it is compelling to provide evidence that it works for realistic models. A chiral random matrix theory has been studied in detail. The known analytical solution shows that the model is non-trivial with respect to the sign problem (in particular, phase-quenched results can be very far from the exact solution). This study gave us the chance to address a couple of key issues: how many thimbles contribute to the solution of a realistic problem? Can one devise algorithms which are robust at staying on the correct manifold? The obvious step forward consists of applications to gauge theories.
Spahn, Joanne M; Reeves, Rebecca S; Keim, Kathryn S; Laquatra, Ida; Kellogg, Molly; Jortberg, Bonnie; Clark, Nicole A
2010-06-01
Behavior change theories and models, validated within the field of dietetics, offer systematic explanations for nutrition-related behavior change. They are integral to the nutrition care process, guiding nutrition assessment, intervention, and outcome evaluation. The American Dietetic Association Evidence Analysis Library Nutrition Counseling Workgroup conducted a systematic review of peer-reviewed literature related to behavior change theories and strategies used in nutrition counseling. Two hundred fourteen articles were reviewed between July 2007 and March 2008, and 87 studies met the inclusion criteria. The workgroup systematically evaluated these articles and formulated conclusion statements and grades based upon the available evidence. Strong evidence exists to support the use of a combination of behavioral theory and cognitive behavioral theory, the foundation for cognitive behavioral therapy (CBT), in facilitating modification of targeted dietary habits, weight, and cardiovascular and diabetes risk factors. Evidence is particularly strong in patients with type 2 diabetes receiving intensive, intermediate-duration (6 to 12 months) CBT, and long-term (>12 months duration) CBT targeting prevention or delay in onset of type 2 diabetes and hypertension. Few studies have assessed the application of the transtheoretical model on nutrition-related behavior change. Little research was available documenting the effectiveness of nutrition counseling utilizing social cognitive theory. Motivational interviewing was shown to be a highly effective counseling strategy, particularly when combined with CBT. Strong evidence substantiates the effectiveness of self-monitoring and meal replacements and/or structured meal plans. Compelling evidence exists to demonstrate that financial reward strategies are not effective. Goal setting, problem solving, and social support are effective strategies, but additional research is needed in more diverse populations. Routine documentation
Introducing Evidence Through Research "Push": Using Theory and Qualitative Methods.
Morden, Andrew; Ong, Bie Nio; Brooks, Lauren; Jinks, Clare; Porcheret, Mark; Edwards, John J; Dziedzic, Krysia S
2015-11-01
A multitude of factors can influence the uptake and implementation of complex interventions in health care. A plethora of theories and frameworks recognize the need to establish relationships, understand organizational dynamics, address context and contingency, and engage key decision makers. Less attention is paid to how theories that emphasize relational contexts can actually be deployed to guide the implementation of an intervention. The purpose of the article is to demonstrate the potential role of qualitative research aligned with theory to inform complex interventions. We detail a study underpinned by theory and qualitative research that (a) ensured key actors made sense of the complex intervention at the earliest stage of adoption and (b) aided initial engagement with the intervention. We conclude that using theoretical approaches aligned with qualitative research can provide insights into the context and dynamics of health care settings that in turn can be used to aid intervention implementation.
Racial Threat Theory: Assessing the Evidence, Requesting Redesign
Directory of Open Access Journals (Sweden)
Cindy Brooks Dollar
2014-01-01
Racial threat theory was developed to explain how population composition influences discriminatory social control practices, and it has become one of the most acknowledged frameworks for explaining racial disparity in criminal justice outcomes. This paper provides a thorough review of racial threat theory and its empirical assessments, and demonstrates that while scholars often cite inconsistent support for the theory, empirical discrepancies may be due to insufficient attention to the conceptual complexity of racial threat. I organize the review around four forms of state-sanctioned control mechanisms: police expenditures, arrests, sentencing, and capital punishment. Arguing that the pervasiveness of racialization in state controls warrants continued inquiry, I provide suggestions for future scholarship that will help develop an enhanced understanding of how racial threat may be operating.
Dissecting Practical Intelligence Theory: Its Claims and Evidence.
Gottfredson, Linda S.
2003-01-01
The two key theoretical propositions of "Practical Intelligence in Everyday Life" are made plausible only if one ignores considerable evidence contradicting them. The six key empirical claims rest primarily on the illusion of evidence enhanced by selective reporting of results. (SLD)
Dai, Shengli; Zhang, Hailin
2014-01-01
Based on Theory of Evidence and reviewing research papers concerned, a concept model of knowledge sharing network among industrial cluster firms, which can be applied to assess knowledge sharing capacity, has been built. Next, the authors create a set of assessment index systems including twelve subindexes under four principle indexes. In this study, ten experts in the same field were invited to score all the indexes of knowledge sharing capacity concerning one certain industrial cluster. The research result shows relatively high knowledge network sharing capacity among the certain industrial cluster firms. Another conclusion is that the assessment method with Theory of Evidence is feasible to conduct such a research.
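The evidence-combination step behind such an assessment is usually Dempster's rule: multiply the masses of every pair of focal elements, collect the products on their intersections, and renormalize by the unconflicted mass. A minimal sketch; the frame of discernment and the numbers are our own illustration, not the paper's expert data:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions given as
    dicts mapping frozenset focal elements to masses."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb              # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two hypothetical expert reports on sharing capacity {high, medium}
m1 = {frozenset({"high"}): 0.6, frozenset({"high", "medium"}): 0.4}
m2 = {frozenset({"high"}): 0.5, frozenset({"medium"}): 0.3,
      frozenset({"high", "medium"}): 0.2}
print(dempster_combine(m1, m2))
```

Note the renormalization by 1 - conflict: this is the step that misbehaves under highly conflicting evidence, the problem the sensor-reliability paper above sets out to address.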
van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie
2017-09-01
This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
Supersymmetric Microscopic Theory of the Standard Model
Ter-Kazarian, G T
2000-01-01
We promote the microscopic theory of the standard model (MSM, hep-ph/0007077) into a supersymmetric framework in order to solve its technical problems of vacuum zero-point energy and hierarchy, and attempt, further, to develop a realistic, viable minimal SUSY extension. Beyond the features of the MSM (it provides a natural unification of geometry and field theory, clarifies the physical conditions in which geometry and particles come into being, and in a microscopic sense gives insight into key problems of particle phenomenology and answers to some of its nagging questions), the present approach also leads to quite a new realization of SUSY, yielding a physically realistic particle spectrum. It stems from the special subquark algebra, from which the nilpotent supercharge operators are derived. The resulting theory makes plausible the following testable implications for the current experiments at LEP2, at the Tevatron and at the LHC, drastically different from those of the conventional MSSM models: 1. All t...
Engaging Theories and Models to Inform Practice
Kraus, Amanda
2012-01-01
Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…
Recursive renormalization group theory based subgrid modeling
Zhou, YE
1991-01-01
The advancement of knowledge and understanding of turbulence theory is addressed. Specific problems include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion; if successful, such models might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Evidence for an expectancy-based theory of avoidance behaviour.
Declercq, Mieke; De Houwer, Jan; Baeyens, Frank
2008-01-01
In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli.
The danger model: questioning an unconvincing theory.
Józefowski, Szczepan
2016-02-01
Janeway's pattern recognition theory holds that the immune system detects infection through a limited number of so-called pattern recognition receptors (PRRs). These receptors bind specific chemical compounds expressed by entire groups of related pathogens, but not by host cells (pathogen-associated molecular patterns, PAMPs). In contrast, Matzinger's danger hypothesis postulates that products released from stressed or damaged cells have a more important role in the activation of the immune system than the recognition of nonself. These products, named by analogy to PAMPs as danger-associated molecular patterns (DAMPs), are proposed to act through the same receptors (PRRs) as PAMPs and, consequently, to stimulate largely similar responses. Herein, I review direct and indirect evidence that contradicts the widely accepted danger theory, and suggest that it may be false.
Assessing landslide susceptibility by applying fuzzy sets, possibility evidence-based theories
Directory of Open Access Journals (Sweden)
Ibsen Chivatá Cárdenas
2010-04-01
A landslide susceptibility model was developed for the city of Manizales, Colombia, where landslides have been the city's main environmental problem. Fuzzy sets and possibility- and evidence-based theories were used to construct the model due to the set of circumstances and uncertainty involved in the modelling; uncertainty particularly concerned the lack of representative data and the need for systematically coordinating subjective information. Susceptibility and uncertainty were estimated via data processing; the model contained data concerning mass vulnerability and uncertainty. Output data were expressed on a map defined by linguistic categories or uncertain labels of low, medium, high and very high susceptibility; this was considered appropriate for representing susceptibility. A fuzzy spectrum was developed for classifying susceptibility levels according to perception and expert opinion. The model showed levels of susceptibility in the study area ranging from low to high (medium susceptibility being the most frequent). This article shows the details concerning systematic data processing by presenting theories and tools regarding uncertainty. The concept of fuzzy parameters is introduced; this is useful in modelling phenomena involving uncertainty, complexity and nonlinear performance, showing that susceptibility modelling can be feasible. The paper also shows the great convenience of incorporating uncertainty into modelling and decision-making. However, quantifying susceptibility is not suitable when modelling identified uncertainty, because model output information cannot be reduced to exact or real numerical quantities when the nature of the variables is particularly uncertain. The latter concept is applicable to risk assessment.
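The "uncertain labels" such a model emits can be realized with overlapping fuzzy membership functions; a toy sketch mapping a susceptibility score to the linguistic categories low/medium/high/very high (the breakpoints are our own illustration, not the Manizales model):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b; a basic fuzzy-set
    building block for linguistic labels."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

LABELS = {   # susceptibility score in [0, 1] -> linguistic category
    "low":       lambda x: tri(x, -0.25, 0.0, 0.4),
    "medium":    lambda x: tri(x, 0.2, 0.5, 0.8),
    "high":      lambda x: tri(x, 0.6, 0.85, 1.0),
    "very high": lambda x: tri(x, 0.8, 1.0, 1.25),
}

def classify(score):
    """Return the best-fitting label and all membership grades."""
    grades = {name: f(score) for name, f in LABELS.items()}
    return max(grades, key=grades.get), grades

print(classify(0.55))
```

Because the categories overlap, a score near a boundary carries non-zero membership in two labels, which is exactly how the map preserves the modelling uncertainty instead of forcing a crisp class.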
Directory of Open Access Journals (Sweden)
Carol A. Gordon
2009-06-01
Objective – Part I of this paper aims to create a framework for an emerging theory of evidence based information literacy instruction. In order to ground this framework in existing theory, a holistic perspective views inquiry as a learning process that synthesizes information searching and knowledge building. An interdisciplinary approach is taken to relate user-centric information behavior theory and constructivist learning theory that supports this synthesis. The substantive theories that emerge serve as a springboard for emerging theory. A second objective of this paper is to define evidence based information literacy instruction by assessing the suitability of performance based assessment and action research as tools of evidence based practice. Methods – An historical review of research grounded in user-centered information behavior theory and constructivist learning theory establishes a body of existing substantive theory that supports emerging theory for evidence based information literacy instruction within an information-to-knowledge approach. A focused review of the literature presents supporting research for an evidence based pedagogy that is performance assessment based, i.e., information users are immersed in real-world tasks that include formative assessments. An analysis of the meaning of action research in terms of its purpose and methodology establishes its suitability for structuring an evidence based pedagogy. Supporting research tests a training model for school librarians and educators which integrates performance based assessment, as well as action research. Results – Findings of an historical analysis of information behavior theory and constructivist teaching practices, and a literature review that explores teaching models for evidence based information literacy instruction, point to two elements of evidence based information literacy instruction: the micro level of information searching behavior and the macro level of
On the Role of Theory and Evidence in Macroeconomics
DEFF Research Database (Denmark)
Juselius, Katarina
This paper, which was prepared for the Inaugural Conference of the Institute for New Economic Thinking at King's College, Cambridge, 8-11 April 2010, questions the preeminence of theory over empirics in economics and argues that empirical econometrics needs to be given a more important and independ...
Evidence for the Validity of Situational Leadership Theory.
Walter, James E.; And Others
1980-01-01
Preliminary findings from the administration of the Leader Effectiveness and Adaptability Description (LEAD) instrument provide some support for situational leadership theory--the idea that flexible and balanced use of task and relationship behaviors is beneficial for both organizational productivity and personal satisfaction. (Author/MLF)
Lattice gauge theories and spin models
Mathur, Manu; Sreeraj, T. P.
2016-10-01
The Wegner Z2 gauge theory-Z2 Ising spin model duality in (2+1) dimensions is revisited and derived through a series of canonical transformations. The Kramers-Wannier duality is similarly obtained. The Wegner Z2 gauge-spin duality is directly generalized to SU(N) lattice gauge theory in (2+1) dimensions to obtain the SU(N) spin model in terms of the SU(N) magnetic fields and their conjugate SU(N) electric scalar potentials. The exact and complete solutions of the Z2, U(1), SU(N) Gauss law constraints in terms of the corresponding spin or dual potential operators are given. The gauge-spin duality naturally leads to a new gauge-invariant magnetic disorder operator for SU(N) lattice gauge theory which produces a magnetic vortex on the plaquette. A variational ground state of the SU(2) spin model with nearest-neighbor interactions is constructed to analyze SU(2) gauge theory.
Introducing AORN's new model for evidence rating.
Spruce, Lisa; Van Wicklin, Sharon A; Hicks, Rodney W; Conner, Ramona; Dunn, Debra
2014-02-01
Nurses today are expected to implement evidence-based practices in the perioperative setting to assess and implement practice changes. All evidence-based practice begins with a question, a practice problem to address, or a needed change that is identified. To assess the question, a literature search is performed and relevant literature is identified and appraised. The types of evidence used to inform practice can be scientific research (eg, randomized controlled trials, systematic reviews) or nonresearch evidence (eg, regulatory and accrediting agency requirements, professional association practice standards and guidelines, quality improvement project reports). The AORN recommended practices are a synthesis of related knowledge on a given topic, and the authorship process begins with a systematic review of the literature conducted in collaboration with a medical librarian. At least two appraisers independently evaluate the applicable literature for quality and strength by using the AORN Research Appraisal Tool and AORN Non-Research Appraisal Tool. To collectively appraise the evidence supporting particular practice recommendations, the AORN recommended practices authors have implemented a new evidence rating model that is appropriate for research and nonresearch literature and that is relevant to the perioperative setting.
F-theory and linear sigma models
Energy Technology Data Exchange (ETDEWEB)
Bershadsky, M.; Johansen, A. [Harvard Univ., Cambridge, MA (United States). Lyman Lab. of Physics; Chiang, T.M. [Newman Laboratory of Nuclear Studies, Cornell University, Ithaca, NY 14850 (United States); Greene, B.R.; Lazaroiu, C.I. [Departments of Physics and Mathematics, Columbia University, New York, NY 10027 (United States)
1998-09-07
We present an explicit method for translating between the linear sigma model and the spectral cover description of SU(r) stable bundles over an elliptically fibered Calabi-Yau manifold. We use this to investigate the four-dimensional duality between (0,2) heterotic and F-theory compactifications. We indirectly find that much interesting heterotic information must be contained in the 'spectral bundle' and in its dual description as a gauge theory on multiple F-theory 7-branes. A by-product of these efforts is a method for analyzing semistability and the splitting type of vector bundles over an elliptic curve given as the sheaf cohomology of a monad. (orig.) 24 refs.
Microscopic Theory of the Standard Model
Ter-Kazarian, G T
2000-01-01
The operator manifold formalism (part I) enables the unification of geometry and field theory, and yields the quantization of geometry. This is the mathematical framework for our physical outlook that geometry and fields, with the internal symmetries and all interactions, as well as the four major principles of relativity (special and general), quantum, gauge and colour confinement, are derivative, and come into being simultaneously in the stable system of the underlying ``primordial structures''. In part II we attempt to develop further the microscopic approach to the Standard Model of particle physics, which enables an insight into the key problems of particle phenomenology. We suggest a microscopic theory of the unified electroweak interactions. The Higgs bosons arise in analogy to the Cooper pairs in superconductivity. Besides the microscopic interpretation of all physical parameters, the resulting theory also makes plausible the following testable implications for current experiments: 1...
Directory of Open Access Journals (Sweden)
Maclennan Graeme
2010-04-01
Full Text Available Abstract Background Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Methods Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making) and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Results Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%; CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. Summary The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for
Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne
2010-04-08
Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour
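The "variance explained" figures reported in the record above come from multiple regression. A minimal sketch of computing such an R² on synthetic data; all construct names, coefficients, and sample sizes here are hypothetical, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical TPB-style constructs: attitude, subjective norm, perceived control
X = rng.normal(size=(n, 3))
# Simulated behavioural intention driven by the constructs plus noise
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(size=n)

# Ordinary least squares with an intercept column added manually
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r2 = 1 - resid.var() / y.var()          # proportion of variance explained
print(f"variance explained (R^2): {r2:.2f}")
```

Stepwise "cross theory" analyses as described in the abstract repeat this fit while adding or dropping significant constructs.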
Crack propagation modeling using Peridynamic theory
Hafezi, M. H.; Alebrahim, R.; Kundu, T.
2016-04-01
Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory based analysis tool is the unifying approach towards material behavior modeling - irrespective of whether the crack is formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling, the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
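The bond-based variant of the theory described above can be sketched in one dimension: bonds within a horizon carry a force linear in their stretch and break irreversibly past a critical stretch. All parameters below are illustrative, not taken from the paper:

```python
import numpy as np

# Minimal 1D bond-based peridynamic bar with critical-stretch bond breaking.
n, dx = 51, 1.0e-3              # nodes and grid spacing [m]
horizon = 3.015 * dx            # peridynamic horizon (common choice: ~3*dx)
c = 1.0e15                      # bond micro-modulus (illustrative value)
s0 = 1.0e-4                     # critical bond stretch (brittle failure)
rho, dt = 8000.0, 1.0e-8        # density [kg/m^3] and explicit time step [s]
load = 1.0e12                   # large end load chosen to drive bonds past s0

x = np.arange(n) * dx
u = np.zeros(n)                 # displacements
v = np.zeros(n)                 # velocities

# Bond list: every node pair within the horizon
bonds = [(i, j) for i in range(n) for j in range(i + 1, n)
         if x[j] - x[i] <= horizon]
alive = {b: True for b in bonds}

for step in range(200):
    f = np.zeros(n)
    for (i, j) in bonds:
        if not alive[(i, j)]:
            continue                      # broken bonds carry no force
        xi = x[j] - x[i]                  # reference bond length
        s = (u[j] - u[i]) / xi            # bond stretch
        if s > s0:
            alive[(i, j)] = False         # irreversible breaking in tension
            continue
        fb = c * s * dx                   # pairwise bond force, linear in stretch
        f[i] += fb
        f[j] -= fb
    f[-1] += load                         # pull the right end
    v += dt * f / rho                     # explicit time integration
    u += dt * v
```

Two- and three-dimensional codes follow the same loop over a (much larger) bond list, which is where the computational cost noted in the abstract comes from.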
Strategic behavior and marriage payments: theory and evidence from Senegal.
Gaspart, Frederic; Platteau, Jean-Philippe
2010-01-01
This article proposes an original theory of marriage payments based on insights gained from firsthand information collected in the Senegal River valley. This theory postulates that decisions about the bride-price, which are made by the bride's father, take into account the likely effects of the amount set on the risk of ill-treatment of the wife and the risk of marriage failure. Based on a sequential game with three players (the bride's father, the husband, and the wife) and a matching process, it leads to a number of important predictions that are tested against Senegalese data relating to bride-prices and various characteristics of women. The empirical results confirm that parents behave strategically by keeping bride-prices down so as to reduce the risk of marriage failure for their daughters. Other interesting effects on marriage payments and the probability of separation are also highlighted, stressing the role of the bride's bargaining power in her own family.
Blue Ocean versus Competitive Strategy: Theory and Evidence
A.E. Burke (Andrew); A.J. van Stel (André); A.R. Thurik (Roy)
2009-01-01
textabstractBlue ocean strategy seeks to turn strategic management on its head by replacing ‘competitive advantage’ with ‘value innovation’ as the primary goal where firms must create consumer demand and exploit untapped markets. Empirical analysis has been focused on case study evidence and so
Topos models for physics and topos theory
Wolters, Sander
2014-08-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a "quantum logic" in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.
International Migration with Heterogeneous Agents: Theory and Evidence
DEFF Research Database (Denmark)
Schröder, Philipp J.H.; Brücker, Herbert
Temporary migration, though empirically relevant, is often ignored in formal models. This paper proposes a migration model with heterogeneous agents and persistent cross country income differentials that features temporary migration. In equilibrium there exists a positive relation between the stock...
A Membrane Model from Implicit Elasticity Theory
Freed, A. D.; Liao, J.; Einstein, D. R.
2014-01-01
A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic
DEFF Research Database (Denmark)
Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe
2008-01-01
Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results......
Theory, Modeling and Simulation Annual Report 2000
Energy Technology Data Exchange (ETDEWEB)
Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.
2001-11-01
This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.
Global Sourcing of Heterogeneous Firms: Theory and Evidence
DEFF Research Database (Denmark)
Kohler, Wilhelm; Smolka, Marcel
2015-01-01
The share of international trade within firm boundaries varies greatly across countries. This column presents new evidence on how the productivity of a firm affects the choice between vertical integration and outsourcing, as well as between foreign and domestic sourcing. The productivity effects found in Spanish firm-level data suggest that contractual imperfections distort the sourcing of inputs in the global economy, and that firm boundaries emerge in response to mitigate this distortion.
Greenslade, Kathryn J.; Coggins, Truman E.
2016-01-01
This study presents an independent replication and extension of psychometric evidence supporting the "Theory of Mind Inventory" ("ToMI"). Parents of 20 children with ASD (ages 4;1-6;7 [years;months]) and 20 with typical development (3;1-6;5) rated their child's theory of mind abilities in everyday situations. Other parent report…
What is the Role of Legal Systems in Financial Intermediation? Theory and Evidence
Bottazzi, L.; Da Rin, M.; Hellmann, T.
2008-01-01
We develop a theory and empirical test of how the legal system affects the relationship between venture capitalists and entrepreneurs. The theory uses a double moral hazard framework to show how optimal contracts and investor actions depend on the quality of the legal system. The empirical evidence
[Evidence in support of Florence Nightingale's theories. 100 years after her death].
Zapico Yáñez, Florentina
2010-05-01
This article has been written to pay homage to Florence Nightingale as a pioneer and beacon for scientific curiosity which should serve as the nursing professional's guide for fine praxis. For this purpose, the author studied the evidence regarding Ms Nightingale's supposed theories; these have been arranged into four large groups in order to make those findings which support her theories easier to understand.
A survey of economic theories and field evidence on pro-social behavior
2006-01-01
In recent years, a large number of economic theories have evolved to explain people’s pro-social behavior and the variation in their respective behavior. This paper surveys economic theories on pro-social behavior and presents evidence — mainly from the field — testing these theories. In addition, the survey emphasizes that institutional environment might significantly interact with pro-social preferences and explain some of the variation in observed pro-social behavior.
Directory of Open Access Journals (Sweden)
Glidewell Elizabeth
2007-08-01
Full Text Available Abstract Background Psychological models can be used to understand and predict behaviour in a wide range of settings. However, they have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. The aim of this study was to explore the usefulness of a range of psychological theories to predict health professional behaviour relating to management of upper respiratory tract infections (URTIs) without antibiotics. Methods Psychological measures were collected by postal questionnaire survey from a random sample of general practitioners (GPs) in Scotland. The outcome measures were clinical behaviour (using antibiotic prescription rates as a proxy indicator), behavioural simulation (scenario-based decisions to managing URTI with or without antibiotics) and behavioural intention (general intention to managing URTI without antibiotics). Explanatory variables were the constructs within the following theories: Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model (SM), and knowledge (a non-theoretical construct). For each outcome measure, multiple regression analysis was used to examine the predictive value of each theoretical model individually. Following this 'theory level' analysis, a 'cross theory' analysis was conducted to investigate the combined predictive value of all significant individual constructs across theories. Results All theories were tested, but only significant results are presented. When predicting behaviour, at the theory level, OLT explained 6% of the variance and, in a cross theory analysis, OLT 'evidence of habitual behaviour' also explained 6%. When predicting behavioural simulation, at the theory level, the proportion of variance explained was: TPB, 31%; SCT, 26%; II, 6%; OLT, 24%. GPs who reported having already decided to change their management to
Sparse modeling theory, algorithms, and applications
Rish, Irina
2014-01-01
"A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come." -Ricardo Vilalta, Department of Computer Science, University of Houston. "This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course." -Francis Bach, INRIA - École Normale Supérieure, Paris
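A core algorithm in the sparse-modeling literature the book covers is l1-regularized least squares (the Lasso), solvable by iterative soft-thresholding (ISTA). A self-contained toy example; the problem sizes and penalty weight below are hand-picked for illustration:

```python
import numpy as np

# Sparse recovery via ISTA: minimize 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(42)
n_samples, n_features, k = 60, 100, 5

A = rng.normal(size=(n_samples, n_features)) / np.sqrt(n_samples)
x_true = np.zeros(n_features)
x_true[rng.choice(n_features, k, replace=False)] = 3 * rng.normal(size=k)
b = A @ x_true + 0.01 * rng.normal(size=n_samples)

lam = 0.05                          # l1 penalty weight (hand-picked)
L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
x = np.zeros(n_features)
for _ in range(500):
    g = A.T @ (A @ x - b)           # gradient of the least-squares term
    z = x - g / L                   # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

print("nonzeros recovered:", np.count_nonzero(np.abs(x) > 1e-3))
```

The soft-thresholding step is what produces exact zeros, i.e. the sparsity that distinguishes the l1 penalty from ridge regression.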
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
International Migration with Heterogeneous Agents: Theory and Evidence
DEFF Research Database (Denmark)
Schröder, Philipp J.H.; Brücker, Herbert
Two puzzling facts of international migration are that only a small share of a sending country's population emigrates and that net migration rates tend to cease over time. This paper addresses these issues in a migration model with heterogeneous agents that features temporary migration. In equilibrium a positive relation exists between the stock of migrants and the income differential, while the net migration flow becomes zero. Consequently, empirical migration models, estimating net migration flows instead of stocks, may be misspecified. This suspicion appears to be confirmed by our empirical investigation of cointegration relationships of flow and stock migration models.
Preschoolers perform more informative experiments after observing theory-violating evidence.
van Schijndel, Tessa J P; Visser, Ingmar; van Bers, Bianca M C W; Raijmakers, Maartje E J
2015-03-01
This study investigated the effect of evidence conflicting with preschoolers' naive theory on the patterns of their free exploratory play. The domain of shadow size was used--a relatively complex, ecologically valid domain that allows for reliable assessment of children's knowledge. Results showed that all children who observed conflicting evidence performed an unconfounded informative experiment in the beginning of their play, compared with half of the children who observed confirming evidence. Mainly, these experiments were directed at investigating a dimension that was at the core of children's initial theory. Thus, preschoolers were flexible in the type of experiments they performed, but they were less flexible in the content of their investigations.
The theories underpinning rational emotive behaviour therapy: where's the supportive evidence?
MacInnes, Douglas
2004-08-01
This paper examines the underlying theoretical philosophy of one of the most widely used cognitive behaviour therapies, rational emotive behaviour therapy. It examines whether two central theoretical principles are supported by research evidence: firstly, that irrational beliefs lead to dysfunctional emotions and inferences and that rational beliefs lead to functional emotions and inferences and, secondly, that demand beliefs are the primary core irrational belief. The established criteria for evaluating the efficacy of the theories are detailed and used to evaluate the strength of evidence supporting these two assumptions. The findings indicate there is limited evidence to support these theories. Copyright 2004 Elsevier Ltd.
Directory of Open Access Journals (Sweden)
Ferreira Isabel
2005-04-01
Full Text Available Abstract Background The present paper intends to contribute to the debate on the usefulness and barriers in applying theories in diet and physical activity behavior-change interventions. Discussion Since behavior theory is a reflection of the compiled evidence of behavior research, theory is the only foothold we have for the development of behavioral nutrition and physical activity interventions. Application of theory should improve the effectiveness of interventions. However, some of the theories we use lack a strong empirical foundation, and the available theories are not always used in the most effective way. Furthermore, many of the commonly-used theories provide at best information on what needs to be changed to promote healthy behavior, but not on how changes can be induced. Finally, many theories explain behavioral intentions or motivation rather well, but are less well-suited to explaining or predicting actual behavior or behavior change. For more effective interventions, behavior change theory needs to be further developed in stronger research designs and such change-theory should especially focus on how to promote action rather than mere motivation. Since voluntary behavior change requires motivation, ability as well as the opportunity to change, further development of behavior change theory should incorporate environmental change strategies. Conclusion Intervention Mapping may help to further improve the application of theories in nutrition and physical activity behavior change.
An Optimization Model Based on Game Theory
Directory of Open Access Journals (Sweden)
Yang Shi
2014-04-01
Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in computer science, especially in optimization algorithms. In this paper, we integrate the thinking of game theory into an optimization algorithm and propose a new optimization model that can be widely used in optimization processing. This optimization model is divided into two types, called "complete consistency" and "partial consistency". The partial-consistency type adds a disturbance strategy on the basis of the complete-consistency type. When the model's consistency is satisfied, the Nash equilibrium of the optimization model is globally optimal; when the consistency is not met, the presence of the perturbation strategy improves the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and better performance, and it offers a new idea for some intractable problems in the field of artificial intelligence.
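The equilibrium concept the abstract leans on can be made concrete with a small sketch that enumerates pure-strategy Nash equilibria of a two-player game; the payoffs below are a Prisoner's-Dilemma-style example, not the paper's model:

```python
import itertools

# Payoff matrices for the row and column player (hypothetical game).
R = [[3, 0],
     [5, 1]]
C = [[3, 5],
     [0, 1]]

def pure_nash(R, C):
    """Return all pure-strategy profiles where neither player gains by deviating alone."""
    eqs = []
    rows, cols = len(R), len(R[0])
    for i, j in itertools.product(range(rows), range(cols)):
        row_best = all(R[i][j] >= R[k][j] for k in range(rows))
        col_best = all(C[i][j] >= C[i][k] for k in range(cols))
        if row_best and col_best:
            eqs.append((i, j))
    return eqs

print(pure_nash(R, C))  # → [(1, 1)]: mutual defection, the unique equilibrium
```

In the paper's terms, the "consistency" conditions are what guarantee that such an equilibrium coincides with the global optimum of the underlying optimization problem.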
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the {mu} problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Liao, Jing; Bi, Yaxin; Nugent, Chris
2011-01-01
This paper explores a sensor fusion method applied within smart homes used for the purposes of monitoring human activities in addition to managing uncertainty in sensor-based readings. A three-layer lattice structure has been proposed, which can be used to combine the mass functions derived from sensors along with sensor context. The proposed model can be used to infer activities. Following evaluation of the proposed methodology it has been demonstrated that the Dempster-Shafer theory of evidence can incorporate the uncertainty derived from the sensor errors and the sensor context and subsequently infer the activity using the proposed lattice structure. The results from this study show that this method can detect a toileting activity within a smart home environment with an accuracy of 88.2%.
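The Dempster-Shafer combination step that such sensor-fusion systems build on can be sketched as follows. This is the standard rule of combination, not the paper's lattice structure, and the activity labels and mass values are hypothetical:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # mass falling on empty intersections
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # Normalize by 1 - K, redistributing the conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical sensor reports over the frame {toileting, cooking}
m1 = {frozenset({"toileting"}): 0.7,
      frozenset({"toileting", "cooking"}): 0.3}
m2 = {frozenset({"toileting"}): 0.6,
      frozenset({"cooking"}): 0.2,
      frozenset({"toileting", "cooking"}): 0.2}

fused = combine(m1, m2)
```

The normalization by 1 - K is exactly the step that misbehaves under the highly conflicting evidence discussed in the head of this collection, which is why weighted-averaging modifications are proposed there.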
International Migration with Heterogeneous Agents: Theory and Evidence
DEFF Research Database (Denmark)
Schröder, Philipp J.H.; Brücker, Herbert
Temporary migration, though empirically relevant, is often ignored in formal models. This paper proposes a migration model with heterogeneous agents and persistent cross-country income differentials that features temporary migration. In equilibrium there exists a positive relation between the stock of migrants and the income differential, while the net migration flow becomes zero. Consequently, existing empirical migration models, estimating net migration flows instead of stocks, may be misspecified. This suspicion appears to be confirmed by our investigation of the cointegration relationships of German migration stocks and flows since 1967. We find that (i) panel-unit root tests reject the hypothesis that migration flows and the explanatory variables are integrated of the same order, while migration stocks and the explanatory variables are all I(1) variables, and (ii) the hypothesis...
Implications of Neuroscientific Evidence for the Cognitive Models of Post-Traumatic Stress Disorder
Cruwys, Tegan; O'Kearney, Richard
2008-01-01
Brewin's dual representation theory, Ehlers and Clark's cognitive appraisal model, and Dalgleish's schematic, propositional, analogue and associative representational systems model are considered in the light of recent evidence on the neural substrates of post-traumatic stress disorder (PTSD). The models' proposals about the cognitive mechanism of…
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the econo... The parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.
Delagran, Louise; Vihstadt, Corrie; Evans, Roni
2015-09-01
Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.
Political autonomy and independence: theory and experimental evidence
Abbink, K.; Brandts, J.
2007-01-01
We study the process by which subordinated regions of a country can obtain a more favourable political status. In our theoretical model a dominant and a dominated region first interact through a voting process that can lead to different degrees of autonomy. If this process fails, then both regions en...
Partner choice and cooperation in networks: Theory and experimental evidence
Ule, A.
2008-01-01
Cooperation is beneficial but may be hard to achieve in situations where the selfish interests of individuals conflict with their common goal, such as in sharing of goods, help, knowledge or information, in trade and pollution negotiations, and in exploitation of common resources. The standard model...
Mobile Money, Trade Credit and Economic Development : Theory and Evidence
Beck, T.H.L.; Pamuk, H.; Uras, R.B.; Ramrattan, R.
2015-01-01
Using a novel enterprise survey from Kenya (FinAccess Business), we document a strong positive association between the use of mobile money as a method to pay suppliers and access to trade credit. We develop a dynamic general equilibrium model with heterogeneous entrepreneurs, imperfect credit market...
Queuing theory models for computer networks
Galant, David C.
1989-01-01
A set of simple queuing theory models, which can model the average response of a network of computers to a given traffic load, has been implemented using a spreadsheet. Because the models omit fine detail about network traffic rates, traffic patterns, and the hardware used to implement the networks, the impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed quickly. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
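The average-response calculation underlying such models can be sketched with the basic M/M/1 queue formula for the mean time a message spends in the system. This is a generic illustration, not the spreadsheet model from the paper; the channel capacity and message rates below are invented for the example.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean time in system (waiting + service) for an M/M/1 queue, in seconds."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilisation >= 1")
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical channel: a 10 Mbit/s link carrying 8000-bit messages can serve
# 1250 messages/s; at an offered load of 1000 messages/s the mean response
# time is 1 / (1250 - 1000) s = 4 ms.
t = mm1_response_time(arrival_rate=1000.0, service_rate=1250.0)
print(f"mean response time: {t * 1000:.1f} ms")  # → mean response time: 4.0 ms
```

The same formula, evaluated per channel and summed along a message's path, is the kind of coarse-grained estimate a spreadsheet model of this sort can produce without detailed traffic traces.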
Social capital: theory, evidence, and implications for oral health.
Rouxel, Patrick L; Heilmann, Anja; Aida, Jun; Tsakos, Georgios; Watt, Richard G
2015-04-01
In the last two decades, there has been increasing application of the concept of social capital in various fields of public health, including oral health. However, social capital is a contested concept with debates on its definition, measurement, and application. This study provides an overview of the concept of social capital, highlights the various pathways linking social capital to health, and discusses the potential implication of this concept for health policy. An extensive and diverse international literature has examined the relationship between social capital and a range of general health outcomes across the life course. A more limited but expanding literature has also demonstrated the potential influence of social capital on oral health. Much of the evidence in relation to oral health is limited by methodological shortcomings mainly related to the measurement of social capital, cross-sectional study designs, and inadequate controls for confounding factors. Further research using stronger methodological designs should explore the role of social capital in oral health and assess its potential application in the development of oral health improvement interventions.
Singing for respiratory health: theory, evidence and challenges.
Gick, Mary L; Nicol, Jennifer J
2016-09-01
The premise that singing is a health promoting activity for people with respiratory conditions of chronic obstructive pulmonary disease (COPD) and asthma is a growing area of interest being investigated by researchers from various disciplines. The preliminary evidence, a theoretical framework and identification of methodological challenges are discussed in this perspective article with an eye to recommendations for further research to advance knowledge. After a brief summary of main research findings on singing in healthy people to provide background context, research is reviewed on singing in people with COPD and asthma. Studies include published research and as yet unpublished work by the authors. Methodological challenges arising from the reviewed studies are identified such as attrition from singing or control groups based on weak and strong, respectively, beliefs about singing's effectiveness. Potential solutions for these problems are considered with further recommendations made for other singing research. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows the construction of necessary-conditions statistical models of body-mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-01
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
Intangible Capital and Corporate Cash Holdings: Theory and Evidence
Dalida Kadyrzhanova; Antonio Falato; Jae Sim
2012-01-01
The rise in intangible capital is a fundamental driver of the secular trend in US corporate cash holdings over the last decades. We construct a new measure of intangible capital and show that intangible capital is the most important firm-level determinant of corporate cash holdings. Our measure accounts for almost as much of the secular increase in cash since the 1980s as all other standard determinants together. We then develop a new model of corporate cash holdings that introduces intangible...
Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy
2017-02-23
This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). Causal
A matrix model from string field theory
Directory of Open Access Journals (Sweden)
Syoji Zeze
2016-09-01
Full Text Available We demonstrate that a Hermitian matrix model can be derived from level truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N. An increase of the potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.
Polarimetric clutter modeling: Theory and application
Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.
1988-01-01
The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by the comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.
Elson, Edward
2009-01-01
A theory of control of cellular proliferation and differentiation in the early development of metazoan systems, postulating a system of electrical controls "parallel" to the processes of molecular biochemistry, is presented. It is argued that the processes of molecular biochemistry alone cannot explain how a developing organism defies a stochastic universe. The demonstration of current flow (charge transfer) along the long axis of DNA through the base-pairs (the "pi-way") in vitro raises the question of whether nature may employ such current flows for biological purposes. Such currents might be too small to be accessible to direct measurement in vivo, but conduction has been measured in vitro, and the methods might well be extended to living systems. This has not been done because there is no reasonable model which could stimulate experimentation. We suggest several related, but detachable or independent, models for the biological utility of charge transfer, whose scope admittedly outruns current concepts of thinking about organization, growth, and development in eukaryotic, metazoan systems. The ideas are related to explanations proposed to explain the effects demonstrated on tumors and normal tissues described in Article I (this issue). Microscopic and mesoscopic potential fields and currents are well known at sub-cellular, cellular, and organ-system levels. Not only are such phenomena associated with internal cellular membranes in bioenergetics and information flow, but remarkable long-range fields over tissue interfaces and organs appear to play a role in embryonic development (Nuccitelli, 1992). The origin of the fields remains unclear and is the subject of active investigation. We are proposing that similar processes could play a vital role at a "sub-microscopic level," at the level of the chromosomes themselves, and could play a role in organizing and directing fundamental processes of growth and development, in parallel with the more discernible fields and...
Pharmacological management of anticholinergic delirium - theory, evidence and practice.
Dawson, Andrew H; Buckley, Nicholas A
2016-03-01
The spectrum of anticholinergic delirium is a common complication following drug overdose. Patients with severe toxicity can have significant distress and behavioural problems that often require pharmacological management. Cholinesterase inhibitors, such as physostigmine, are effective, but widespread use has been limited by concerns about safety, optimal dosing and variable supply. Case series support efficacy in reversal of anticholinergic delirium; however, doses vary widely and higher doses commonly lead to cholinergic toxicity. Seizures are reported in up to 2.5% of patients and occasional cardiotoxic effects are also recorded. This article reviews the serendipitous path whereby physostigmine evolved into the preferred anticholinesterase antidote, largely without any research to indicate the optimal dosing strategy. Adverse events observed in case series should be considered in the context of pharmacokinetic/pharmacodynamic studies of physostigmine, which suggest a much longer latency before the maximal increase in brain acetylcholine than had been previously assumed. This would favour protocols that use lower doses and longer re-dosing intervals. We propose, based on the evidence reviewed, that the use of cholinesterase inhibitors should be considered in anticholinergic delirium that has not responded to non-pharmacological delirium management. The optimal risk/benefit would be with a titrated dose of 0.5 to 1 mg physostigmine (0.01-0.02 mg kg(-1) in children) with a minimum delay of 10-15 min before re-dosing. Slower-onset and longer-acting agents such as rivastigmine would also be logical, but more research is needed to guide the appropriate dose in this setting.
Quantum Model Theory (QMod): Modeling Contextual Emergent Entangled Interfering Entities
Aerts, Diederik
2012-01-01
In this paper we present 'Quantum Model Theory' (QMod), a theory we developed to model entities that entail the typical quantum effects of 'contextuality', 'superposition', 'interference', 'entanglement' and 'emergence'. The aim of QMod is to put forward a theoretical framework that has the technical power of standard quantum mechanics, namely it makes explicit use of the standard complex Hilbert space and its quantum mechanical calculus, but is also more general than standard quantum mechanics, in the sense that it only uses this quantum calculus locally, i.e. for each context corresponding to a measurement. In this sense, QMod is a generalization of quantum mechanics, similar to how the general relativity manifold mathematical formalism is a generalization of special relativity and classical physics. We prove by means of a representation theorem that QMod can be used for any entity entailing the typical quantum effects mentioned above. Some examples of application of QMod in concept theory and macroscopic...
Estimating Model Evidence Using Data Assimilation
Carrassi, Alberto; Bocquet, Marc; Hannart, Alexis; Ghil, Michael
2017-04-01
We review the field of data assimilation (DA) from a Bayesian perspective and show that, in addition to its by now common application to state estimation, DA may be used for model selection. An important special case of the latter is the discrimination between a factual model - which corresponds, to the best of the modeller's knowledge, to the situation in the actual world in which a sequence of events has occurred - and a counterfactual model, in which a particular forcing or process might be absent or just quantitatively different from the actual world. Three different ensemble-DA methods are reviewed for this purpose: the ensemble Kalman filter (EnKF), the ensemble four-dimensional variational smoother (En-4D-Var), and the iterative ensemble Kalman smoother (IEnKS). An original contextual formulation of model evidence (CME) is introduced. It is shown how to apply these three methods to compute CME, using the approximated time-dependent probability distribution functions (pdfs) each of them provides in the process of state estimation. The theoretical formulae so derived are applied to two simplified nonlinear and chaotic models: (i) the Lorenz three-variable convection model (L63), and (ii) the Lorenz 40-variable midlatitude atmospheric dynamics model (L95). The numerical results of these three DA-based methods and those of an integration based on importance sampling are compared. It is found that better CME estimates are obtained by using DA, and the IEnKS method appears to be the best among the DA methods. Differences among the performance of the three DA-based methods are discussed as a function of model properties. Finally, the methodology is implemented for parameter estimation and for event attribution.
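The idea of using a filter's prediction errors to score model evidence can be shown with a toy example. This sketch is not the paper's CME formulation or an ensemble method; it uses a plain one-dimensional Kalman filter with invented parameters to show how the evidence p(y_1:T | model) separates a factual from a counterfactual model.

```python
import math
import random

def log_evidence(ys, a, q, r, m0=0.0, p0=1.0):
    """Kalman-filter log-likelihood of y_1:T under x_{t+1} = a*x_t + N(0, q),
    observed as y_t = x_t + N(0, r), via the prediction-error decomposition."""
    m, p, ll = m0, p0, 0.0
    for y in ys:
        m, p = a * m, a * a * p + q          # predict
        s = p + r                            # innovation variance
        ll += -0.5 * (math.log(2 * math.pi * s) + (y - m) ** 2 / s)
        k = p / s                            # Kalman gain
        m, p = m + k * (y - m), (1 - k) * p  # update
    return ll

random.seed(0)
x, ys = 0.0, []
for _ in range(200):
    x = 0.9 * x + random.gauss(0, 1.0)       # "factual" dynamics: a = 0.9
    ys.append(x + random.gauss(0, 0.5))      # noisy observations

factual = log_evidence(ys, a=0.9, q=1.0, r=0.25)
counterfactual = log_evidence(ys, a=0.0, q=1.0, r=0.25)
print(factual > counterfactual)              # the factual model should score higher
```

The same comparison, done with ensemble filters and smoothers on L63/L95, is what the DA-based CME estimates in the paper deliver at scale.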
Moral Distress Model Reconstructed Using Grounded Theory.
Ko, Hsun-Kuei; Chin, Chi-Chun; Hsu, Min-Tao
2016-12-29
The problems of nurse burnout and manpower shortage relate to moral distress. Thus, having a good understanding of moral distress is critical to developing strategies that effectively improve the clinical ethical climate and improve nursing retention in Taiwan. The aim of this study was to reconstruct the model of moral distress using grounded theory. Twenty-five staff nurses at work units who attend to the needs of adult, pediatric, acute, and critical-disease or end-of-life-care patients were recruited as participants using theoretical sampling from three teaching hospitals in Taiwan. Data were collected using intensive, 2- to 3-hour interviews with each participant. Audio recordings of the interviews were made and then converted into transcripts. The data were analyzed using grounded theory. In the clinical setting, the perspective that nurses take toward clinical moral events reflects their moral values, which trigger moral cognition, provocation, and appraisal. The moral barriers that form when moral events that occur in clinical settings contradict personal moral values may later develop into moral distress. In handling moral barriers in the clinical environment, nurses make moral judgments and determine what is morally correct. Influenced by moral efficacy, the consequence may be either a moral action or an expression of personal emotion. Wasting National Health Insurance resources and Chinese culture are key sources of moral distress for nurses in Taiwan. The role of self-confidence in promoting moral efficacy and the role of heterodox skills in promoting moral actions represent findings that are unique to this study. The moral distress model was used in this study to facilitate the development of future nursing theories. On the basis of our findings, we suggested that nursing students be encouraged to use case studies to establish proper moral values, improve moral cognition and judgment capabilities, and promote moral actions to better handle the...
The Economic Importance of Financial Literacy: Theory and Evidence
Lusardi, Annamaria; Mitchell, Olivia S.
2017-01-01
This paper undertakes an assessment of a rapidly growing body of economic research on financial literacy. We start with an overview of theoretical research which casts financial knowledge as a form of investment in human capital. Endogenizing financial knowledge has important implications for welfare as well as policies intended to enhance levels of financial knowledge in the larger population. Next, we draw on recent surveys to establish how much (or how little) people know and identify the least financially savvy population subgroups. This is followed by an examination of the impact of financial literacy on economic decision-making in the United States and elsewhere. While the literature is still young, conclusions may be drawn about the effects and consequences of financial illiteracy and what works to remedy these gaps. A final section offers thoughts on what remains to be learned if researchers are to better inform theoretical and empirical models as well as public policy. PMID:28579637
Application of Chaos Theory to Psychological Models
Blackerby, Rae Fortunato
This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
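The convergent and divergent behaviour described above can be demonstrated with the logistic map, a standard textbook example of a nonlinear feedback equation. This is a generic illustration of chaotic dynamics, not the dissertation's moral-behavior model; the parameter values are chosen purely for exposition.

```python
def logistic_orbit(r, x0=0.5, burn=1000, keep=4):
    """Iterate x_{t+1} = r * x_t * (1 - x_t); return a few post-transient values."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

# Depending on the feedback strength r, the same equation converges,
# oscillates, or wanders chaotically:
print(logistic_orbit(2.8))   # converges to the fixed point 1 - 1/r ≈ 0.6429
print(logistic_orbit(3.2))   # settles into a two-value cycle
print(logistic_orbit(3.9))   # chaotic: keeps fluctuating without repeating
```

The feedback loop here plays the role the dissertation assigns to self-centeredness, family, and community influences: the same deterministic rule yields convergence for some parameter values and divergence for others.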
An Inflationary Model in String Theory
Iizuka, N; Iizuka, Norihiro; Trivedi, Sandip P.
2004-01-01
We construct a model of inflation in string theory after carefully taking into account moduli stabilization. The setting is a warped compactification of Type IIB string theory in the presence of D3 and anti-D3-branes. The inflaton is the position of a D3-brane in the internal space. By suitably adjusting fluxes and the location of symmetrically placed anti-D3-branes, we show that at a point of enhanced symmetry, the inflaton potential V can have a broad maximum, satisfying the condition V''/V << 1 in Planck units. On starting close to the top of this potential the slow-roll conditions can be met. Observational constraints impose significant restrictions. As a first pass we show that these can be satisfied and determine the important scales in the compactification to within an order of magnitude. One robust feature is that the scale of inflation is low, H = O(10^{10}) GeV. Removing the observational constraints makes it much easier to construct a slow-roll inflationary model. Generalizations and conseque...
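For reference, the condition quoted in the abstract is conventionally expressed through the standard potential slow-roll parameters (in units of the reduced Planck mass):

```latex
\epsilon \equiv \frac{M_{\mathrm{Pl}}^{2}}{2}\left(\frac{V'}{V}\right)^{2} \ll 1,
\qquad
\eta \equiv M_{\mathrm{Pl}}^{2}\,\frac{V''}{V}, \quad \lvert\eta\rvert \ll 1 .
```

The abstract's requirement V''/V << 1 in Planck units is the second of these conditions, evaluated near the broad maximum of the potential.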
Modified perturbation theory for the Yukawa model
Poluektov, Yu M
2016-01-01
A new formulation of perturbation theory for a description of the Dirac and scalar fields (the Yukawa model) is suggested. As the main approximation the self-consistent field model is chosen, which allows in a certain degree to account for the effects caused by the interaction of fields. Such choice of the main approximation leads to a normally ordered form of the interaction Hamiltonian. Generation of the fermion mass due to the interaction with exchange of the scalar boson is investigated. It is demonstrated that, for zero bare mass, the fermion can acquire mass only if the coupling constant exceeds the critical value determined by the boson mass. In this connection, the problem of the neutrino mass is discussed.
PARFUME Theory and Model basis Report
Energy Technology Data Exchange (ETDEWEB)
Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
Stochastic linear programming models, theory, and computation
Kall, Peter
2011-01-01
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang
2017-08-28
Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods do not take into account the processing of uncertain information. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which can deal with uncertain information in the process of specific emitter identification. In this paper, each radar generates a group of evidence based on the information it obtains, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and obtain a reasonable recognition result.
Côté, Françoise; Gagnon, Johanne; Houme, Philippe Kouffé; Abdeljelil, Anis Ben; Gagnon, Marie-Pierre
2012-10-01
Using an extended theory of planned behaviour, this article is a report of a study to identify the factors that influence nurses' intention to integrate research evidence into their clinical decision-making. Health professionals are increasingly asked to adopt evidence-based practice. The integration of research evidence in nurses' clinical decision-making would have an important impact on the quality of care provided for patients. Despite evidence supporting this practice and the availability of high quality research in the field of nursing, the gap between research and practice is still present. A predictive correlational study. A total of 336 nurses working in a university hospital participated in this research. Data were collected in February and March 2008 by means of a questionnaire based on an extension of the theory of planned behaviour. Descriptive statistics of the model variables, Pearson correlations between all the variables and multiple linear regression analysis were performed. Nurses' intention to integrate research findings into clinical decision-making can be predicted by moral norm, normative beliefs, perceived behavioural control and past behaviour. The moral norm is the most important predictor. Overall, the final model explains 70% of the variance in nurses' intention. The present study supports the use of an extended psychosocial theory for identifying the determinants of nurses' intention to integrate research evidence into their clinical decision-making. Interventions that focus on increasing nurses' perceptions that using research is their responsibility for ensuring good patient care and providing a supportive environment could promote an evidence-based nursing practice. © 2012 Blackwell Publishing Ltd.
Decision-Making Algorithm for Multisensor Fusion Based on Grey Relation and DS Evidence Theory
Directory of Open Access Journals (Sweden)
Fang Ye
2016-01-01
Full Text Available A decision-making algorithm, as the key technology for uncertain data fusion, is the core of obtaining reasonable multisensor information fusion results. DS evidence theory is a typical and widely applicable decision-making method. However, DS evidence theory makes decisions without considering differences between sensors, which may lead to illogical results. In this paper, we present a novel decision-making algorithm for uncertain fusion based on grey relation and DS evidence theory. The proposed algorithm comprehensively takes into consideration the sensor's credibility and the evidence's overall discriminability, which can solve the uncertainty problems caused by inconsistency among the sensors themselves and the complexity of the monitoring environment, while ensuring the validity and accuracy of fusion results. The algorithm first obtains each sensor's credibility through the introduction of grey relation theory, then defines two impact factors, the sensor's credibility and the evidence's overall discriminability, according to focal element analyses and evidence distance analysis, respectively; after that, it uses the impact factors to modify the evidence and finally obtains more reasonable and effective results through the DS combination rule. Simulation results and analyses demonstrate that the proposed algorithm can overcome the problems caused by large evidence conflict and one-vote veto, indicating that it can improve the ability of target judgment and enhance the precision of uncertain data fusion. Thus the novel decision-making method has a certain application value.
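As a rough sketch of the grey-relation step, Deng's grey relational grade scores how closely a candidate sequence tracks a reference sequence; the example sequences and the distinguishing coefficient rho = 0.5 below are illustrative assumptions, not values from the paper.

```python
# Sketch of Deng's grey relational grade, the "grey relation" ingredient the
# abstract uses to score each sensor's credibility.  rho = 0.5 is the usual
# distinguishing coefficient; all numbers here are illustrative.

def grey_relational_grade(reference, candidate, rho=0.5):
    """Mean grey relational coefficient between a reference and a candidate."""
    deltas = [abs(r - c) for r, c in zip(reference, candidate)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0.0:          # identical sequences: maximal relation
        return 1.0
    coefficients = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coefficients) / len(coefficients)

# A sensor report closer to the reference gets a higher grade, and would
# therefore receive a larger credibility weight before DS combination.
```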
Theory and modeling of electron fishbones
Vlad, G.; Fusco, V.; Briguglio, S.; Fogaccia, G.; Zonca, F.; Wang, X.
2016-10-01
Internal kink instabilities exhibiting fishbone-like behavior have been observed in a variety of experiments in which a high-energy electron population, generated by strong auxiliary heating and/or current drive systems, was present. After briefly reviewing the experimental evidence for energetic-electron-driven fishbones and the main results of the linear and nonlinear theory of electron fishbones, the results of global, self-consistent, nonlinear hybrid MHD-gyrokinetic simulations will be presented. To this purpose, the extended/hybrid MHD-gyrokinetic code XHMGC will be used. Linear dynamics analysis will highlight the effect of considering kinetic thermal ion compressibility and diamagnetic response, and kinetic thermal electron compressibility, in addition to the energetic electron contribution. Nonlinear saturation and energetic electron transport will also be addressed, making extensive use of Hamiltonian mapping techniques and discussing both centrally peaked and off-axis peaked energetic electron profiles. It will be shown that centrally peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of deeply trapped energetic electrons. In contrast, off-axis peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of barely circulating energetic electrons, which experience toroidal precession reversal of their motion.
Improved information fusion approach based on D-S evidence theory
Energy Technology Data Exchange (ETDEWEB)
Sun, Rui; Huang, Hong Zhong; Miao, Qiang [University of Electronic Science and Technology of China, Chengdu (China)
2008-12-15
Conventional D-S evidence theory has an unavoidable disadvantage: it gives counter-intuitive results when fusing highly conflicting information. This paper proposes an improved method to solve this problem. By reassigning weight factors before fusing, the method can give reasonable results, especially when the initial weight factors of the conflicting bodies of evidence are almost equal. It also provides an adjustable factor to tune the reassignment strength. An example is given to illustrate these advantages.
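A minimal sketch of the weight-reassignment idea: average the bodies of evidence (here with equal weights, Murphy-style) before applying Dempster's rule, so highly conflicting reports no longer force a counter-intuitive result. The frame {A, B, C} and the mass values (Zadeh's classic conflicting-evidence example) are illustrative, and the rule below is restricted to singleton focal elements for brevity.

```python
# Sketch: reweight/average the mass functions first, then combine with
# Dempster's rule.  Frame and masses are illustrative (Zadeh's example);
# only singleton focal elements are handled, for brevity.

def dempster(m1, m2):
    """Dempster's combination rule for masses on singleton hypotheses."""
    conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
    return {a: m1[a] * m2[a] / (1.0 - conflict) for a in m1}

def weighted_average(masses, weights):
    """Weight-factor reassignment step: a weighted mean of the mass functions."""
    total = sum(weights)
    return {a: sum(w * m[a] for m, w in zip(masses, weights)) / total
            for a in masses[0]}

m1 = {"A": 0.99, "B": 0.01, "C": 0.0}    # sensor 1: almost certainly A
m2 = {"A": 0.0, "B": 0.01, "C": 0.99}    # sensor 2: almost certainly C
direct = dempster(m1, m2)                 # counter-intuitive: all mass on B
avg = weighted_average([m1, m2], [0.5, 0.5])
fused = dempster(avg, avg)                # belief shared sensibly between A and C
```

Direct combination assigns essentially all belief to the hypothesis both sensors consider nearly impossible, while averaging first splits belief between A and C.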
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2015-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Theory and modelling of nanocarbon phase stability.
Energy Technology Data Exchange (ETDEWEB)
Barnard, A. S.
2006-01-01
The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.
Galaxy alignments: Theory, modelling and simulations
Kiessling, Alina; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L; Rassat, Anais
2015-01-01
The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in large-scale structure tend to align the shapes and angular momenta of nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both $N$-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the ...
Modeling missing data in knowledge space theory.
de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio
2015-12-01
Missing data are a well known issue in statistical inference, because some responses may be missing, even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, the missingness is analyzed in knowledge space theory, and in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: The former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. The IMBLIM and the MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: If the missing data-generating process is of type missing completely at random, then either IMBLIM or MissBLIM provide adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data.
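A hedged sketch of the local-independence computation underlying the BLIM: the likelihood of one response pattern given a knowledge state, with per-item careless-error (beta) and lucky-guess (eta) rates. The item names and parameter values below are illustrative assumptions.

```python
# Sketch of the basic local independence model (BLIM) likelihood for one
# response pattern given a knowledge state.  Items, state and parameter
# values are illustrative assumptions.

def blim_pattern_prob(pattern, state, beta, eta):
    """P(observed pattern | knowledge state) under local independence."""
    prob = 1.0
    for item, correct in pattern.items():
        if item in state:                       # item is mastered
            prob *= (1.0 - beta[item]) if correct else beta[item]
        else:                                   # item is not mastered
            prob *= eta[item] if correct else (1.0 - eta[item])
    return prob
```

The extensions described in the abstract then either treat missing responses as ignorable (IMBLIM) or give the missingness its own dependence on the knowledge state (MissBLIM).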
A Mathematical Theory of the Gauged Linear Sigma Model
Fan, Huijun; Ruan, Yongbin
2015-01-01
We construct a rigorous mathematical theory of Witten's Gauged Linear Sigma Model (GLSM). Our theory applies to a wide range of examples, including many cases with non-Abelian gauge group. Both the Gromov-Witten theory of a Calabi-Yau complete intersection X and the Landau-Ginzburg dual (FJRW-theory) of X can be expressed as gauged linear sigma models. Furthermore, the Landau-Ginzburg/Calabi-Yau correspondence can be interpreted as a variation of the moment map or a deformation of GIT in the GLSM. This paper focuses primarily on the algebraic theory, while a companion article will treat the analytic theory.
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x{t}, is embedded within the larger set of m+k candidate variables, (x{t},w{t}), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w{t} are relevant.
Gravothermal Star Clusters - Theory and Computer Modelling
Spurzem, Rainer
2010-11-01
In the George Darwin lecture delivered to the British Royal Astronomical Society in 1960 by Viktor A. Ambartsumian, he wrote that the evolution of stellar systems can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend toward local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as was discovered later. Here the state of the art in modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (the Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also been used very successfully to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in the centres of galaxies (here again briefly touching on one of the many research fields of V.A. Ambartsumian). In the present era of high-speed supercomputing, in which direct N-body simulations of star clusters are being tackled, we will show that such direct modeling supports and confirms the concept of the statistical models based on Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and to make scientific progress in the study of star cluster evolution.
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
Venture Theory: A Model of Decision Weights.
1988-01-01
Nonadditive decision weights can be used to "explain" many anomalies of standard choice theory.
Directory of Open Access Journals (Sweden)
Kaner Eileen FS
2008-01-01
Full Text Available Abstract Background Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). Methods The design was a 2 × 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Results Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention, GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). Conclusion GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected
Hrisos, Susan; Eccles, Martin; Johnston, Marie; Francis, Jill; Kaner, Eileen FS; Steen, Nick; Grimshaw, Jeremy
2008-01-01
Background Psychological theories of behaviour may provide a framework to guide the design of interventions to change professional behaviour. Behaviour change interventions, designed using psychological theory and targeting important motivational beliefs, were experimentally evaluated for effects on the behavioural intention and simulated behaviour of GPs in the management of uncomplicated upper respiratory tract infection (URTI). Methods The design was a 2 × 2 factorial randomised controlled trial. A postal questionnaire was developed based on three theories of human behaviour: Theory of Planned Behaviour; Social Cognitive Theory and Operant Learning Theory. The beliefs and attitudes of GPs regarding the management of URTI without antibiotics and rates of prescribing on eight patient scenarios were measured at baseline and post-intervention. Two theory-based interventions, a "graded task" with "action planning" and a "persuasive communication", were incorporated into the post-intervention questionnaire. Trial groups were compared using co-variate analyses. Results Post-intervention questionnaires were returned for 340/397 (86%) GPs who responded to the baseline survey. Each intervention had a significant effect on its targeted behavioural belief: compared to those not receiving the intervention GPs completing Intervention 1 reported stronger self-efficacy scores (Beta = 1.41, 95% CI: 0.64 to 2.25) and GPs completing Intervention 2 had more positive anticipated consequences scores (Beta = 0.98, 95% CI = 0.46 to 1.98). Intervention 2 had a significant effect on intention (Beta = 0.90, 95% CI = 0.41 to 1.38) and simulated behaviour (Beta = 0.47, 95% CI = 0.19 to 0.74). Conclusion GPs' intended management of URTI was significantly influenced by their confidence in their ability to manage URTI without antibiotics and the consequences they anticipated as a result of doing so. Two targeted behaviour change interventions differentially affected these beliefs. One
Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP.
Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong
2015-07-01
Dependence assessment among human errors in human reliability analysis (HRA) is an important issue. Many of the dependence assessment methods in HRA rely heavily on the expert's opinion, thus are subjective and may sometimes cause inconsistency. In this article, we propose a computational model based on the Dempster-Shafer evidence theory (DSET) and the analytic hierarchy process (AHP) method to handle dependence in HRA. First, dependence influencing factors among human tasks are identified and the weights of the factors are determined by experts using the AHP method. Second, judgment on each factor is given by the analyst referring to anchors and linguistic labels. Third, the judgments are represented as basic belief assignments (BBAs) and are integrated into a fused BBA by weighted average combination in DSET. Finally, the CHEP is calculated based on the fused BBA. The proposed model can deal with ambiguity and the degree of confidence in the judgments, and is able to reduce the subjectivity and improve the consistency in the evaluation process.
On torsion-free vacuum solutions of the model of de Sitter gauge theory of gravity
Institute of Scientific and Technical Information of China (English)
2008-01-01
It is shown that all vacuum solutions of the Einstein field equation with a positive cosmological constant are solutions of a model of dS gauge theory of gravity. Therefore, the model is expected to pass the observational tests on the scale of the solar system and to explain the indirect evidence of gravitational waves from the binary pulsar PSR1913+16.
A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators
Bowen, Danny Ray
2012-01-01
The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…
Model of Polyakov duality: String field theory Hamiltonians from Yang-Mills theories
Periwal, Vipul
2000-08-01
Polyakov has conjectured that Yang-Mills theory should be equivalent to a noncritical string theory. I point out, based on the work of Marchesini, Ishibashi, Kawai and collaborators, and Jevicki and Rodrigues, that the loop operator of the Yang-Mills theory is the temporal gauge string field theory Hamiltonian of a noncritical string theory. The consistency condition of the string interpretation is the zig-zag symmetry emphasized by Polyakov. I explicitly show how this works for the one-plaquette model, providing a consistent direct string interpretation of the unitary matrix model for the first time.
Evidence for discrete chiral symmetry breaking in $N = 1$ supersymmetric Yang-Mills theory
Kirchner, R; Montvay, István; Spanderen, K; Westphalen, J
1999-01-01
In a numerical Monte Carlo simulation of SU(2) Yang-Mills theory with dynamical gauginos we find evidence for two degenerate ground states at the supersymmetry point corresponding to zero gaugino mass. This is consistent with the expected pattern of spontaneous discrete chiral symmetry breaking $Z_4 \to Z_2$ caused by gaugino condensation.
Evidence for discrete chiral symmetry breaking in N=1 supersymmetric Yang-Mills theory
Desy-Münster Collaboration; Kirchner, R.; Montvay, I.; Westphalen, J.; Luckmann, S.; Spanderen, K.
1999-01-01
In a numerical Monte Carlo simulation of SU(2) Yang-Mills theory with dynamical gauginos we find evidence for two degenerate ground states at the supersymmetry point corresponding to zero gaugino mass. This is consistent with the expected pattern of spontaneous discrete chiral symmetry breaking Z4-->Z2 caused by gaugino condensation.
Parenthood and Happiness: A Review of Folk Theories versus Empirical Evidence
Hansen, Thomas
2012-01-01
This paper reviews and compares folk theories and empirical evidence about the influence of parenthood on happiness and life satisfaction. The review of attitudes toward parenthood and childlessness reveals that people tend to believe that parenthood is central to a meaningful and fulfilling life, and that the lives of childless people are…
Density functional theory and multiscale materials modeling
Indian Academy of Sciences (India)
Swapan K Ghosh
2003-01-01
One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids. In the intermediate mesoscopic length scale, an appropriate picture of the equilibrium and dynamical processes has been obtained through the single particle number density of the constituent atoms or molecules. A wide class of problems involving nanomaterials, interfacial science and soft condensed matter has been addressed using the density based theoretical formalism as well as atomistic simulation in this regime. In the macroscopic length scale, however, matter is usually treated as a continuous medium and a description using local mass density, energy density and other related density functions has been found to be quite appropriate. A unique single unified theoretical framework that emerges through the density concept at these diverse length scales and is applicable to both quantum and classical systems is the so called density functional theory (DFT) which essentially provides a vehicle to project the many-particle picture to a single particle one. Thus, the central equation for quantum DFT is a one-particle Schrödinger-like Kohn–Sham equation, while the same for classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential. Selected illustrative applications of quantum DFT to microscopic modeling of intermolecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are presented.
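For reference, the one-particle Schrödinger-like Kohn-Sham equation mentioned above can be written, together with the density it determines, as:

```latex
\Bigl[-\tfrac{\hbar^{2}}{2m}\nabla^{2}
      + v_{\mathrm{eff}}[\rho](\mathbf{r})\Bigr]\psi_{i}(\mathbf{r})
  = \varepsilon_{i}\,\psi_{i}(\mathbf{r}),
\qquad
\rho(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} \lvert\psi_{i}(\mathbf{r})\rvert^{2},
\qquad
v_{\mathrm{eff}} = v_{\mathrm{ext}} + v_{\mathrm{H}} + v_{\mathrm{xc}} .
```

This is the density-dependent effective potential picture the abstract describes: a system of noninteracting particles whose single-particle orbitals reproduce the interacting density.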
Models of Particle Physics from Type IIB String Theory and F-theory: A Review
Maharana, Anshuman
2012-01-01
We review particle physics model building in type IIB string theory and F-theory. This is a region in the landscape where in principle many of the key ingredients required for a realistic model of particle physics can be combined successfully. We begin by reviewing moduli stabilisation within this framework and its implications for supersymmetry breaking. We then review model building tools and developments in the weakly coupled type IIB limit, for both local D3-branes at singularities and global models of intersecting D7-branes. Much of recent model building work has been in the strongly coupled regime of F-theory due to the presence of exceptional symmetries which allow for the construction of phenomenologically appealing Grand Unified Theories. We review both local and global F-theory model building starting from the fundamental concepts and tools regarding how the gauge group, matter sector and operators arise, and ranging to detailed phenomenological properties explored in the literature.
String-like dual models for scalar theories
Baadsgaard, Christian; Bjerrum-Bohr, N. E. J.; Bourjaily, Jacob; Damgaard, Poul H.
2016-12-01
We show that all tree-level amplitudes in φ p scalar field theory can be represented as the α ' → 0 limit of an SL(2, ℝ)-invariant, string-theory-like dual model integral. These dual models are constructed according to constraints that admit families of solutions. We derive these dual models, and give closed formulae for all tree-level amplitudes of any φ p scalar field theory.
String-Like Dual Models for Scalar Theories
Baadsgaard, Christian; Bourjaily, Jacob L; Damgaard, Poul H
2016-01-01
We show that all tree-level amplitudes in $\varphi^p$ scalar field theory can be represented as the $\alpha'\to 0$ limit of an $SL(2,R)$-invariant, string-theory-like dual model integral. These dual models are constructed according to constraints that admit families of solutions. We derive these dual models, and give closed formulae for all tree-level amplitudes of any $\varphi^p$ scalar field theory.
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
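The chaos-theoretic premise of sensitive dependence on initial conditions, which underpins the volatility of public perceptions discussed here, can be illustrated with the standard logistic map (an illustrative sketch only; the map and its parameters are not from the article):

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), a canonical
    chaotic system for r = 4, starting from x0 in [0, 1]."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories whose starting points differ by one part in a billion
a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)
```

Within a few dozen iterations the two trajectories become macroscopically different, the hallmark of chaos that the article uses as an analogy for rumor and crisis dynamics.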
Conceptual change and preschoolers' theory of mind: evidence from load-force adaptation.
Sabbagh, Mark A; Hopkins, Sydney F R; Benson, Jeannette E; Flanagan, J Randall
2010-01-01
Prominent theories of preschoolers' theory of mind development have included a central role for changing or adapting existing conceptual structures in response to experiences. Because of the relatively protracted timetable of theory of mind development, it has been difficult to test this assumption about the role of adaptation directly. To gain evidence that cognitive adaptation is particularly important for theory of mind development, we sought to determine whether individual differences in cognitive adaptation in a non-social domain predicted preschoolers' theory of mind development. Twenty-five preschoolers were tested on batteries of theory of mind tasks, executive functioning tasks, and on their ability to adapt their lifting behavior to smoothly lift an unexpectedly heavy object. Results showed that children who adapted their lifting behavior more rapidly performed better on theory of mind tasks than those who adapted more slowly. These findings held up when age and performance on the executive functioning battery were statistically controlled. Although preliminary, we argue that this relation is attributable to individual differences in children's domain general abilities to efficiently change existing conceptual structures in response to experience.
The Theory of Finite Models without Equal Sign
Institute of Scientific and Technical Information of China (English)
Li Bo LUO
2006-01-01
In this paper we suggest, for the first time, studying the model theory of all finite structures and putting the equal sign in the same situation as the other relations. Using formulas of infinite length we obtain new theorems for the preservation of model extensions, submodels, model homomorphisms and inverse homomorphisms. These kinds of theorems were discussed systematically for general models in Chang and Keisler's Model Theory, but Gurevich obtained some different theorems in this direction for finite models. In our paper the old theorems manage to survive in finite model theory. There are also differences between into homomorphisms and onto homomorphisms in the preservation theorems. We also study reduced models and minimum models. The characterization sentence of a model is given, which yields a general result that any theory T is equivalent to a set of existential-universal sentences. Some results about completeness and model completeness are also given.
Making sense of implementation theories, models and frameworks
National Research Council Canada - National Science Library
Nilsen, Per
2015-01-01
.... The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection...
Pluralistic and stochastic gene regulation: examples, models and consistent theory.
Salas, Elisa N; Shu, Jiang; Cserhati, Matyas F; Weeks, Donald P; Ladunga, Istvan
2016-06-01
We present a theory of pluralistic and stochastic gene regulation. To bridge the gap between empirical studies and mathematical models, we integrate pre-existing observations with our meta-analyses of the ENCODE ChIP-Seq experiments. Earlier evidence includes fluctuations in levels, location, activity, and binding of transcription factors, variable DNA motifs, and bursts in gene expression. Stochastic regulation is also indicated by frequently subdued effects of knockout mutants of regulators, their evolutionary losses/gains and massive rewiring of regulatory sites. We report widespread pluralistic regulation in ≈800 000 tightly co-expressed pairs of diverse human genes. Typically, half of ≈50 observed regulators bind to both genes reproducibly, twice as many as in independently expressed gene pairs. We also examine the largest set of co-expressed genes, which code for cytoplasmic ribosomal proteins. Numerous regulatory complexes are highly significantly enriched in ribosomal genes compared to highly expressed non-ribosomal genes. We could not find any DNA-associated, strict-sense master regulator. Despite major fluctuations in transcription factor binding, our machine learning model accurately predicted transcript levels using binding sites of 20+ regulators. Our pluralistic and stochastic theory is consistent with partially random binding patterns, redundancy, stochastic regulator binding, burst-like expression, degeneracy of binding motifs and massive regulatory rewiring during evolution.
Theory and modeling of active brazing.
Energy Technology Data Exchange (ETDEWEB)
van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.
2013-09-01
Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.
Information Theory: a Multifaceted Model of Information
Directory of Open Access Journals (Sweden)
Mark Burgin
2003-06-01
Full Text Available A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new information approach, which is called the general theory of information. The main achievement of the general theory of information is explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and sociological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to construction and utilization of information measures. Examples of such measures are given by Shannon's quantity of information, algorithmic quantity of information or volume of information. It is demonstrated that all other known directions of information theory may be treated inside the general theory of information as its particular cases.
Hypergame Theory: A Model for Conflict, Misperception, and Deception
Directory of Open Access Journals (Sweden)
Nicholas S. Kovach
2015-01-01
Full Text Available When dealing with conflicts, game theory and decision theory can be used to model the interactions of the decision-makers. To date, game theory and decision theory have received considerable modeling focus, while hypergame theory has not. A metagame, known as a hypergame, occurs when one player does not know or fully understand all the strategies of a game. Hypergame theory extends the advantages of game theory by allowing a player to outmaneuver an opponent and obtain a more preferred outcome with a higher utility. The ability to outmaneuver an opponent arises in a hypergame because the differing views (perception or deception) of opponents are captured in the model, through the incorporation of information unknown to other players (misperception or intentional deception). The hypergame model provides solutions for complex theoretic modeling of conflicts more accurately than game theory, and excels where perception or information differences exist between players. This paper explores the current research in hypergame theory and presents a broad overview of the historical literature on hypergame theory.
Modeling Multisource-heterogeneous Information Based on Random Set and Fuzzy Set Theory
Institute of Scientific and Technical Information of China (English)
WEN Cheng-lin; XU Xiao-bin
2006-01-01
This paper presents a new idea, named modeling of multisensor-heterogeneous information, to incorporate fuzzy logic methodologies with multisensor-multitarget systems under the framework of random set theory. Firstly, based on strong random sets and weak random sets, the unified form to describe both data (unambiguous information) and fuzzy evidence (uncertain information) is introduced. Secondly, according to signatures of fuzzy evidence, two Bayesian-Markov nonlinear measurement models are proposed to effectively fuse data and fuzzy evidence. Thirdly, by use of the models-based signature-matching scheme, the operation on the statistics of fuzzy evidence defined as random sets can be translated into that on the membership functions of relative point state variables. These works are the basis for constructing qualitative measurement models and fusing data and fuzzy evidence.
QUANTUM THEORY FOR THE BINOMIAL MODEL IN FINANCE THEORY
Institute of Scientific and Technical Information of China (English)
CHEN Zeqian
2004-01-01
In this paper, a quantum model for the binomial market in finance is proposed. We show that its risk-neutral world exhibits an intriguing structure as a disk in the unit ball of R^3, whose radius is a function of the risk-free interest rate with two thresholds which prevent arbitrage opportunities from this quantum market. Furthermore, from the quantum mechanical point of view we re-deduce the Cox-Ross-Rubinstein binomial option pricing formula by considering Maxwell-Boltzmann statistics of the system of N distinguishable particles.
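For concreteness, the classical Cox-Ross-Rubinstein binomial pricing formula that the paper re-derives quantum mechanically can be sketched as follows (a standard textbook implementation; the parameter values below are illustrative, not taken from the paper):

```python
from math import comb, exp, sqrt

def crr_call_price(S0, K, r, sigma, T, N):
    """Cox-Ross-Rubinstein binomial price of a European call option.

    S0: spot price, K: strike, r: risk-free rate, sigma: volatility,
    T: maturity in years, N: number of binomial steps.
    """
    dt = T / N
    u = exp(sigma * sqrt(dt))          # up factor per step
    d = 1.0 / u                        # down factor per step
    p = (exp(r * dt) - d) / (u - d)    # risk-neutral up probability
    # Discounted risk-neutral expectation of the payoff over terminal nodes
    return exp(-r * T) * sum(
        comb(N, j) * p**j * (1.0 - p)**(N - j)
        * max(S0 * u**j * d**(N - j) - K, 0.0)
        for j in range(N + 1)
    )
```

As N grows, the price converges to the Black-Scholes value; for S0 = K = 100, r = 5%, sigma = 20%, T = 1, both are about 10.45.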
MULTI-FLEXIBLE SYSTEM DYNAMIC MODELING THEORY AND APPLICATION
Institute of Scientific and Technical Information of China (English)
仲昕; 周兵; 杨汝清
2001-01-01
The flexible body modeling theory was demonstrated with an example in which a kind of automobile front suspension is modeled as a multi-flexible system. The simulation results of the multi-flexible dynamic model match the road test data more closely than those of the multi-rigid dynamic model do. This fully testifies that modeling with multi-flexible body theory is both necessary and effective.
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations.
Modeling self on others: An import theory of subjectivity and selfhood.
Prinz, Wolfgang
2017-03-01
This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it becomes imported from other minds to one's own mind, whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices gives rise to modeling self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power.
Scaling Theory and Modeling of DNA Evolution
Buldyrev, Sergey V.
1998-03-01
We present evidence supporting the possibility that the nucleotide sequence in noncoding DNA is power-law correlated. We do not find such long-range correlation in the coding regions of the gene, so we build a ``coding sequence finder'' to locate the coding regions of an unknown DNA sequence. We also propose a different coding sequence finding algorithm, based on the concept of mutual information(I. Große, S. V. Buldyrev, H. Herzel, H. E. Stanley, (preprint).). We describe our recent work on quantification of DNA patchiness, using long-range correlation measures (G. M. Viswanathan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Biophysical Journal 72), 866-875 (1997).. We also present our recent study of the simple repeat length distributions. We find that the distributions of some simple repeats in noncoding DNA have long power-law tails, while in coding DNA all simple repeat distributions decay exponentially. (N. V. Dokholyan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Phys. Rev. Lett (in press).) We discuss several models based on insertion-deletion and mutation-duplication mechanisms that relate long-range correlations in non-coding DNA to DNA evolution. Specifically, we relate long-range correlations in non-coding DNA to simple repeat expansion, and propose an evolutionary model that reproduces the power law distribution of simple repeat lengths. We argue that the absence of long-range correlations in protein coding sequences is related to their highly conserved primary structure which is necessary to insure protein folding.
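Power-law tails of repeat-length distributions of the kind reported here are commonly quantified by the slope of a log-log plot; a minimal sketch of such an estimate (illustrative only, not the authors' analysis code):

```python
import math

def loglog_slope(lengths, counts):
    """Least-squares slope of log(count) versus log(length).

    For a distribution with a power-law tail, count ~ length**(-alpha),
    the fitted slope estimates -alpha; an exponential tail instead shows
    systematic curvature on the log-log plot.
    """
    xs = [math.log(l) for l in lengths]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A fitted slope that stays roughly constant over decades of repeat length is the signature of the power-law behavior described for noncoding DNA above.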
Bender, Rachel E.; Alloy, Lauren B.
2011-01-01
Most life stress literature in bipolar disorder (BD) fails to account for the possibility of a changing relationship between psychosocial context and episode initiation across the course of the disorder. According to Post’s (1992) influential kindling hypothesis, major life stress is required to trigger initial onsets and recurrences of affective episodes, but successive episodes become progressively less tied to stressors and may eventually occur autonomously. Subsequent research on kindling has largely focused on unipolar depression (UD), and the model has been tested in imprecise and inconsistent ways. The aim of the present paper is to evaluate evidence for the kindling model as it applies to BD. We first outline the origins of the hypothesis, the evidence for the model in UD, and the issues needing further clarification. Next, we review the extant literature on the changing relationship between life stress and bipolar illness over time, and find that evidence from the methodologically strongest studies is inconsistent with the kindling hypothesis. We then integrate this existing body of research with two emerging biopsychosocial models of BD: the Behavioral Approach System dysregulation model, and the circadian and social rhythm theory. Finally, we present therapeutic implications and suggestions for future research. PMID:21334286
Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.
McPhail, Beverly A
2016-07-01
The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. © The Author(s) 2015.
Summary of papers presented in the Theory and Modelling session
Lin-Liu Y.R.; Westerhof E.
2012-01-01
A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17, and one Theory and Modelling paper was included in each of the ITER ECRH and ECE sessions. Three papers were in the area of nonlinear physics, discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive; three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fl...
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes' set theoretical predicate...
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
Matrix models, topological strings, and supersymmetric gauge theories
Dijkgraaf, Robbert; Vafa, Cumrun
2002-11-01
We show that B-model topological strings on local Calabi-Yau threefolds are large- N duals of matrix models, which in the planar limit naturally give rise to special geometry. These matrix models directly compute F-terms in an associated N=1 supersymmetric gauge theory, obtained by deforming N=2 theories by a superpotential term that can be directly identified with the potential of the matrix model. Moreover by tuning some of the parameters of the geometry in a double scaling limit we recover ( p, q) conformal minimal models coupled to 2d gravity, thereby relating non-critical string theories to type II superstrings on Calabi-Yau backgrounds.
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is discussed. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.
A Quantitative Causal Model Theory of Conditional Reasoning
Fernbach, Philip M.; Erb, Christopher D.
2013-01-01
The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables, (x_t, w_t), then selection over the second...
Evidence-Based Theory of Market Manipulation And Application: The Malaysian Case
Heong, Yin Yun
2010-01-01
According to Part IX Division 1 in Securities Industry Act 1983 of Malaysia Law, stock market manipulation is defined as unlawful action taken either direct or indirectly by any person, to affect the price of securities of the corporation on a stock market in Malaysia for the purpose which may include the purpose of inducing other persons. Extending the framework of Allen and Gale (1992), the Author presents a theory based on the empirical evidence from prosecuted stock market manipulation ca...
Application of multidimensional item response theory models to longitudinal data
Marvelde, te Janneke M.; Glas, Cees A.W.; Van Landeghem, Georges; Van Damme, Jan
2006-01-01
The application of multidimensional item response theory (IRT) models to longitudinal educational surveys where students are repeatedly measured is discussed and exemplified. A marginal maximum likelihood (MML) method to estimate the parameters of a multidimensional generalized partial credit model
Item response theory modeling with nonignorable missing data
Pimentel, Jonald L.
2005-01-01
This thesis discusses methods to detect nonignorable missing data and methods to adjust for the bias caused by nonignorable missing data, both by introducing a model for the missing data indicator using item response theory (IRT) models.
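As background on the model family involved, item response theory links a latent ability to response probabilities through a logistic item response function; a minimal two-parameter-logistic (2PL) sketch follows (a generic textbook form, not code from the thesis, which may use other IRT variants):

```python
from math import exp, log

def p_correct(theta, a, b):
    """2PL item response function: probability that a respondent with
    ability theta answers an item with discrimination a and
    difficulty b correctly."""
    return 1.0 / (1.0 + exp(-a * (theta - b)))

def response_loglik(theta, items, responses):
    """Log-likelihood of a 0/1 response pattern at ability theta;
    items is a list of (discrimination, difficulty) pairs."""
    ll = 0.0
    for (a, b), u in zip(items, responses):
        p = p_correct(theta, a, b)
        ll += log(p) if u == 1 else log(1.0 - p)
    return ll
```

In the nonignorable-missingness setting studied in the thesis, a response function of this kind can likewise be attached to the missing-data indicators themselves.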
Why are physicians not persuaded by scientific evidence? A grounded theory interview study.
Sekimoto, Miho; Imanaka, Yuichi; Kitano, Nobuko; Ishizaki, Tatsuro; Takahashi, Osamu
2006-07-27
The government-led "evidence-based guidelines for cataract treatment" labelled pirenoxine and glutathione eye drops, which have been regarded as the standard care for cataracts in Japan, as lacking evidence of effectiveness, causing great upset among ophthalmologists and professional ophthalmology societies. This study investigated the reasons why such "scientific evidence of treatment effectiveness" is not easily accepted by physicians, and thus, why they do not change their clinical practices to reflect such evidence. We conducted a qualitative study based on grounded theory to explore physicians' awareness of "scientific evidence" and evidence-supported treatment in relation to pirenoxine and glutathione eye drops, and to identify current barriers to the implementation of evidence-based policies in clinical practice. Interviews were conducted with 35 ophthalmologists and 3 general practitioners on their prescribing behaviours, perceptions of eye drop effectiveness, attitudes toward the eye drop guideline recommendations, and their perceptions of "scientific evidence." Although few physicians believed that eye drops are remarkably effective, the majority of participants reported that they prescribed eye drops to patients who asked for them, and that such patients accounted for a considerable proportion of those with cataracts. Physicians seldom attempted to explain to patients the limitations of effectiveness or to encourage them to stop taking the eye drops. Physicians also acknowledged the benefits of prescribing such drugs, which ultimately outweighed any uncertainty of their effectiveness. These benefits included economic incentives and a desire to be appreciated by patients. Changes in clinical practice were considered to bring little benefit to physicians or patients. Government approval, rarity of side effects, and low cost of the drops also encouraged prescription. Physicians occasionally provide treatment without expecting remarkable therapeutic
Why are physicians not persuaded by scientific evidence? A grounded theory interview study
Directory of Open Access Journals (Sweden)
Ishizaki Tatsuro
2006-07-01
Full Text Available Abstract Background The government-led "evidence-based guidelines for cataract treatment" labelled pirenoxine and glutathione eye drops, which have been regarded as the standard care for cataracts in Japan, as lacking evidence of effectiveness, causing great upset among ophthalmologists and professional ophthalmology societies. This study investigated the reasons why such "scientific evidence of treatment effectiveness" is not easily accepted by physicians, and thus, why they do not change their clinical practices to reflect such evidence. Methods We conducted a qualitative study based on grounded theory to explore physicians' awareness of "scientific evidence" and evidence-supported treatment in relation to pirenoxine and glutathione eye drops, and to identify current barriers to the implementation of evidence-based policies in clinical practice. Interviews were conducted with 35 ophthalmologists and 3 general practitioners on their prescribing behaviours, perceptions of eye drop effectiveness, attitudes toward the eye drop guideline recommendations, and their perceptions of "scientific evidence." Results Although few physicians believed that eye drops are remarkably effective, the majority of participants reported that they prescribed eye drops to patients who asked for them, and that such patients accounted for a considerable proportion of those with cataracts. Physicians seldom attempted to explain to patients the limitations of effectiveness or to encourage them to stop taking the eye drops. Physicians also acknowledged the benefits of prescribing such drugs, which ultimately outweighed any uncertainty of their effectiveness. These benefits included economic incentives and a desire to be appreciated by patients. Changes in clinical practice were considered to bring little benefit to physicians or patients. Government approval, rarity of side effects, and low cost of the drops also encouraged prescription. Conclusion Physicians occasionally
Theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena
Jin, Yongmei M.; Wang, Yu U.; Ren, Yang
2015-12-01
Pre-martensitic phenomena, also called martensite precursor effects, have been known for decades while yet remain outstanding issues. This paper addresses pre-martensitic phenomena from new theoretical and experimental perspectives. A statistical mechanics-based Grüneisen-type phonon theory is developed. On the basis of deformation-dependent incompletely softened low-energy phonons, the theory predicts a lattice instability and pre-martensitic transition into elastic-phonon domains via 'phonon spinodal decomposition.' The phase transition lifts phonon degeneracy in cubic crystal and has a nature of phonon pseudo-Jahn-Teller lattice instability. The theory and notion of phonon domains consistently explain the ubiquitous pre-martensitic anomalies as natural consequences of incomplete phonon softening. The phonon domains are characterised by broken dynamic symmetry of lattice vibrations and deform through internal phonon relaxation in response to stress (a particular case of Le Chatelier's principle), leading to previously unexplored new domain phenomenon. Experimental evidence of phonon domains is obtained by in situ three-dimensional phonon diffuse scattering and Bragg reflection using high-energy synchrotron X-ray single-crystal diffraction, which observes exotic domain phenomenon fundamentally different from usual ferroelastic domain switching phenomenon. In light of the theory and experimental evidence of phonon domains and their roles in pre-martensitic phenomena, currently existing alternative opinions on martensitic precursor phenomena are revisited.
Theory of stellar convection - II. First stellar models
Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.
2016-07-01
We present here the first stellar models on the Hertzsprung-Russell diagram, in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the `calibrated' ML theory for main-sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.
Flipped classroom model for learning evidence-based medicine
National Research Council Canada - National Science Library
Rucker SY; Ozdogan Z; Al Achkar M
2017-01-01
.... A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians...
Large field inflation models from higher-dimensional gauge theories
Furuuchi, Kazuyuki; Koyama, Yoji
2015-02-01
Motivated by the recent detection of B-mode polarization of CMB by BICEP2 which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness are discussed. Among the models analyzed, Dante's Inferno model turns out to be the most preferred model in this framework.
Large field inflation models from higher-dimensional gauge theories
Energy Technology Data Exchange (ETDEWEB)
Furuuchi, Kazuyuki [Manipal Centre for Natural Sciences, Manipal University, Manipal, Karnataka 576104 (India); Koyama, Yoji [Department of Physics, National Tsing-Hua University, Hsinchu 30013, Taiwan R.O.C. (China)
2015-02-23
Motivated by the recent detection of B-mode polarization of CMB by BICEP2 which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness are discussed. Among the models analyzed, Dante’s Inferno model turns out to be the most preferred model in this framework.
Substandard model? At last, a good reason to opt for a sexier theory of particle physics
Cho, A
2001-01-01
According to experimenters at Brookhaven, a tiny discrepancy in the magnetism of the muon may signal a crack in the Standard Model. The deviation could be the first piece of hard evidence for a more complete theory called supersymmetry (1 page).
Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun
2017-01-01
Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm.
Hoffman, Aubri S; Volk, Robert J; Saarimaki, Anton; Stirling, Christine; Li, Linda C; Härter, Martin; Kamath, Geetanjali R; Llewellyn-Thomas, Hilary
2013-01-01
In 2005, the International Patient Decision Aids Standards Collaboration identified twelve quality dimensions to guide assessment of patient decision aids. One dimension-the delivery of patient decision aids on the Internet-is relevant when the Internet is used to provide some or all components of a patient decision aid. Building on the original background chapter, this paper provides an updated definition for this dimension, outlines a theoretical rationale, describes current evidence, and discusses emerging research areas. An international, multidisciplinary panel of authors examined the relevant theoretical literature and empirical evidence through 2012. The updated definition distinguishes Internet-delivery of patient decision aids from online health information and clinical practice guidelines. Theories in cognitive psychology, decision psychology, communication, and education support the value of Internet features for providing interactive information and deliberative support. Dissemination and implementation theories support Internet-delivery for providing the right information (rapidly updated), to the right person (tailored), at the right time (the appropriate point in the decision making process). Additional efforts are needed to integrate the theoretical rationale and empirical evidence from health technology perspectives, such as consumer health informatics, user experience design, and human-computer interaction. As of 2012, the updated theoretical rationale and emerging evidence suggest potential benefits to delivering patient decision aids on the Internet. However, additional research is needed to identify best practices and quality metrics for Internet-based development, evaluation, and dissemination, particularly in the areas of interactivity, multimedia components, socially-generated information, and implementation strategies.
Theories, models and urban realities. From New York to Kathmandu
Directory of Open Access Journals (Sweden)
Román Rodríguez González
2004-12-01
Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.
Theories, models and urban realities. From New York to Kathmandu
Directory of Open Access Journals (Sweden)
José Somoza Medina
2004-01-01
Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
Full Text Available We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
General autocatalytic theory and simple model of financial markets
Thuy Anh, Chu; Lan, Nguyen Tri; Viet, Nguyen Ai
2015-06-01
The concept of autocatalytic theory has become a powerful tool in understanding evolutionary processes in complex systems. A generalization of autocatalytic theory is obtained by considering the initial element to be some distribution instead of a constant value, as in the traditional theory. This initial condition implies that the final element may have some distribution too. A simple physics model for financial markets is proposed using this general autocatalytic theory. Some general behaviours of the evolution process and the risk moment of a financial market are also investigated in the framework of this simple model.
Screening Masses of Hot SU(2) Gauge Theory from the 3D Adjoint Higgs Model
Karsch, Frithjof; Petreczky, P
1999-01-01
We study the Landau gauge propagators of the lattice SU(2) 3d adjoint Higgs model, considered as an effective theory of high temperature 4d SU(2) gauge theory. From the long distance behaviour of the propagators we extract the screening masses. It is shown that the pole masses extracted from the propagators agree well with the screening masses obtained recently in finite temperature SU(2) theory. The relation of the propagator masses to the masses extracted from gauge invariant correlators is also discussed. In so-called lambda gauges non-perturbative evidence is given for the gauge independence of pole masses within this class of gauges.
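As a generic illustration of the extraction step mentioned above (not the authors' lattice analysis, and with an invented mass value), a screening mass can be read off the long-distance behaviour of a correlator via a local effective mass, which plateaus for a pure exponential decay:

```python
import math

# Synthetic long-distance correlator C(r) ~ A * exp(-m * r), with a
# hypothetical screening mass m = 0.8 (illustrative, not from the paper).
A, m = 2.5, 0.8
C = [A * math.exp(-m * r) for r in range(1, 12)]

def effective_mass(corr):
    """Local effective mass m_eff(r) = ln(C(r) / C(r+1)); for a pure
    exponential decay this plateaus at the screening mass."""
    return [math.log(corr[i] / corr[i + 1]) for i in range(len(corr) - 1)]

m_eff = effective_mass(C)
```

In a real lattice computation the plateau is identified at large separations, where excited-state contamination has died out.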
Optimization models using fuzzy sets and possibility theory
Orlovski, S
1987-01-01
Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2017-01-23
Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
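A minimal sketch of the simplest G-theory analysis the article discusses, a one-facet persons x items design: estimate variance components from mean squares and form a generalizability coefficient. The score matrix below is invented illustrative data, not from the article:

```python
# One-facet (persons x items) generalizability analysis on invented data.
scores = [  # rows: persons, columns: items
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 2, 3],
]
n_p, n_i = len(scores), len(scores[0])
grand = sum(sum(row) for row in scores) / (n_p * n_i)
p_means = [sum(row) / n_i for row in scores]
i_means = [sum(scores[p][i] for p in range(n_p)) / n_p for i in range(n_i)]

# Sums of squares for persons, items, and residual (p x i interaction + error).
ss_p = n_i * sum((pm - grand) ** 2 for pm in p_means)
ss_i = n_p * sum((im - grand) ** 2 for im in i_means)
ss_tot = sum((scores[p][i] - grand) ** 2 for p in range(n_p) for i in range(n_i))
ss_res = ss_tot - ss_p - ss_i

ms_p = ss_p / (n_p - 1)
ms_res = ss_res / ((n_p - 1) * (n_i - 1))

# Expected-mean-square estimates of the variance components.
var_p = max((ms_p - ms_res) / n_i, 0.0)
var_res = ms_res

# Generalizability coefficient for relative decisions over n_i items:
# person variance over person-plus-relative-error variance.
g_coef = var_p / (var_p + var_res / n_i)
```

The dedicated packages the article surveys handle multifaceted and unbalanced designs; this sketch only shows where the variance-component percentages come from.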
Dimensional reduction of Markov state models from renormalization group theory
Orioli, S.; Faccioli, P.
2016-09-01
Renormalization Group (RG) theory provides the theoretical framework to define rigorous effective theories, i.e., systematic low-resolution approximations of arbitrary microscopic models. Markov state models are shown to be rigorous effective theories for Molecular Dynamics (MD). Based on this fact, we use real space RG to vary the resolution of the stochastic model and define an algorithm for clustering microstates into macrostates. The result is a lower dimensional stochastic model which, by construction, provides the optimal coarse-grained Markovian representation of the system's relaxation kinetics. To illustrate and validate our theory, we analyze a number of test systems of increasing complexity, ranging from synthetic toy models to two realistic applications, built from all-atom MD simulations. The computational cost of computing the low-dimensional model remains affordable on a desktop computer even for thousands of microstates.
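The clustering of microstates into macrostates can be illustrated with a toy lumping step (this is a generic stationary-weighted lumping, not the paper's RG algorithm; the transition matrix and the grouping {0,1} -> A, {2,3} -> B are invented):

```python
# Coarse-grain a 4-microstate Markov model into 2 macrostates.
T = [  # row-stochastic microstate transition matrix (invented)
    [0.80, 0.15, 0.04, 0.01],
    [0.20, 0.70, 0.05, 0.05],
    [0.02, 0.08, 0.75, 0.15],
    [0.01, 0.04, 0.25, 0.70],
]
groups = {0: [0, 1], 1: [2, 3]}  # macrostate -> member microstates

# Stationary distribution by power iteration.
pi = [0.25] * 4
for _ in range(2000):
    pi = [sum(pi[i] * T[i][j] for i in range(4)) for j in range(4)]

def lump(T, pi, groups):
    """Stationary-weighted lumping:
    P(A -> B) = sum_{i in A} (pi_i / pi_A) * sum_{j in B} T[i][j]."""
    K = len(groups)
    out = []
    for a in range(K):
        pi_a = sum(pi[i] for i in groups[a])
        row = [sum(pi[i] / pi_a * sum(T[i][j] for j in groups[b])
                   for i in groups[a]) for b in range(K)]
        out.append(row)
    return out

T_macro = lump(T, pi, groups)  # 2x2 row-stochastic macrostate model
```

The lumped matrix is row-stochastic by construction; the paper's contribution is choosing the grouping so that the macrostate model optimally preserves the slow relaxation kinetics.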
Turbulent Boundary Layers - Experiments, Theory and Modelling
1980-01-01
...DEVELOPMENT (ORGANISATION DU TRAITE DE L'ATLANTIQUE NORD), AGARD Conference Proceedings No. 271, Turbulent Boundary Layers - Experiments, Theory and Modelling. ...photographs of Figures 21 and 22. In this case, the photographs are taken with a single-flash strobe and thus yield the instantaneous positions of the...
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
The mathematical theory of reduced MHD models for fusion plasmas
Guillard, Hervé
2015-01-01
The derivation of reduced MHD models for fusion plasmas is here formulated as a special instance of the general theory of singular limits of hyperbolic systems of PDEs with a large operator. This formulation allows us to use the general results of this theory and to prove rigorously that reduced MHD models are valid approximations of the full MHD equations. In particular, it is proven that the solutions of the full MHD system converge to the solutions of an appropriate reduced model.
Teo, Timothy; Tan, Lynde
2012-01-01
This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…
Gutzwiller variational theory for the Hubbard model with attractive interaction.
Bünemann, Jörg; Gebhard, Florian; Radnóczi, Katalin; Fazekas, Patrik
2005-06-29
We investigate the electronic and superconducting properties of a negative-U Hubbard model. For this purpose we evaluate a recently introduced variational theory based on Gutzwiller-correlated BCS wavefunctions. We find significant differences between our approach and standard BCS theory, especially for the superconducting gap. For small values of |U|, we derive analytical expressions for the order parameter and the superconducting gap which we compare to exact results from perturbation theory.
Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models
2015-01-01
We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bos...
Geometry model construction in infrared image theory simulation of buildings
Institute of Scientific and Technical Information of China (English)
谢鸣; 李玉秀; 徐辉; 谈和平
2004-01-01
Geometric model construction is the basis of infrared image theory simulation. Taking the construction of the geometric model of one building in Harbin as an example, this paper analyzes the theoretical groundings of simplification and principles of geometric model construction of buildings. It then discusses some particular treatment methods in calculating the radiation transfer coefficient in geometric model construction using the Monte Carlo Method.
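The Monte Carlo treatment of radiation transfer coefficients mentioned above amounts to launching random rays from a surface and counting hits. A minimal sketch on an invented geometry (a surface element looking at a disk of radius R at height h directly above it, where the analytic view factor is R^2 / (R^2 + h^2)), not the paper's building model:

```python
import math
import random

# Monte Carlo estimate of the view factor from a surface element to a
# centered disk of radius R at height h, via cosine-weighted ray sampling.
random.seed(0)
R, h, N = 1.0, 1.0, 100_000
hits = 0
for _ in range(N):
    u1 = random.random()          # cosine-weighted polar angle: sin^2(theta) = u1
    sin_t = math.sqrt(u1)
    cos_t = math.sqrt(1.0 - u1)
    # A ray from the origin crosses the plane z = h at radius h * tan(theta);
    # it hits the disk iff that radius is within R (azimuth is irrelevant
    # for a centered disk).
    if h * sin_t / cos_t <= R:
        hits += 1

f_mc = hits / N
f_exact = R * R / (R * R + h * h)   # analytic answer: 0.5 for R = h
```

For a building-scale model, the same hit-counting runs between arbitrary surface patches, which is why Monte Carlo is the practical choice there.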
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented.
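A sketch of the approach described above: fit a discrete-time Markov chain to a player's action log and score the observed transitions with an information-theoretic error. The action log and labels are invented for illustration; the paper's actual training and error measure may differ:

```python
import math

# Invented player action log; each letter is one discrete action.
log = list("ABABABABCABABABAB")

# Count first-order transitions.
counts = {}
for a, b in zip(log, log[1:]):
    counts.setdefault(a, {}).setdefault(b, 0)
    counts[a][b] += 1

# Maximum-likelihood transition probabilities of the trained chain.
P = {a: {b: n / sum(row.values()) for b, n in row.items()}
     for a, row in counts.items()}

# Average per-step surprisal (bits) of the data under the model: a simple
# "error" between trained model and player interaction. Highly routinized
# (predictable) play drives this toward zero.
surprisal = sum(-math.log2(P[a][b]) for a, b in zip(log, log[1:]))
avg_bits = surprisal / (len(log) - 1)
```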
Attachment and the Processing of Social Information across the Life Span: Theory and Evidence
Dykas, Matthew J.; Cassidy, Jude
2011-01-01
Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the…
Klaassen, Ger; Nentjes, Andries; Smith, Mark
2005-01-01
Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon emitting industrialized regions. Two experiments use auctions, the first a
Theories and models of globalization ethicizing
Directory of Open Access Journals (Sweden)
Dritan Abazović
2016-05-01
Full Text Available Globalization as a phenomenon is under the magnifying glass of many philosophical discussions and theoretical deliberations. While most theorists deal with issues that are predominantly of an economic or political character, this article follows a different logic. The article presents six theories which in their own way explain the need for movement toward ethicizing globalization. Globalization is a process that affects all and as such has become inevitable, but it is up to the people to determine its course and make it either functional or uncontrolled. The survival and development of any society are measured primarily by the quality of its moral and ethical foundation. Therefore, it is clear that global society can survive and be functional only if it finds a minimum consensus on ethical norms or, as said in theory, if it establishes its ethical system, based on which it would be built and developed.
[Models of economic theory of population growth].
Von Zameck, W
1987-01-01
"The economic theory of population growth applies the opportunity cost approach to the fertility decision. Variations and differentials in fertility are caused by the available resources and relative prices or by the relative production costs of child services. Pure changes in real income raise the demand for children or the total amount spent on children. If relative prices or production costs and real income are affected together the effect on fertility requires separate consideration." (SUMMARY IN ENG)
Measurement Models for Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; FISHBEIN, MARTIN
2012-01-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...
Theory, modeling, and simulation annual report, 1992
Energy Technology Data Exchange (ETDEWEB)
1993-05-01
This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.
A Model of the Economic Theory of Regulation for Undergraduates.
Wilson, Brooks
1995-01-01
Presents a model of the economic theory of regulation and recommends its use in undergraduate economics classes. Describes the use of computer-assisted instruction to teach the theory. Maintains that the approach enables students to gain access to graphs and tables that they produce themselves. (CFR)
A continuum theory for modeling the dynamics of crystalline materials.
Xiong, Liming; Chen, Youping; Lee, James D
2009-02-01
This paper introduces a multiscale field theory for modeling and simulation of the dynamics of crystalline materials. The atomistic formulation of a multiscale field theory is briefly introduced, and its applicability is discussed. A few application examples, including phonon dispersion relations of the ferroelectric material BiScO3 and an MgO nanodot under compression, are presented.
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2016-11-09
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession - client, health/oral health, environment and dental hygiene actions - and includes eleven validated human needs that evolved over time to eight. It is logical and simple, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan
2017-09-01
Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
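The sub-interval perturbation idea can be shown on a toy problem: bound a scalar response over an input interval by splitting it into sub-intervals and applying a first-order Taylor expansion around each midpoint. The function and interval below are invented; the real method operates on FE/SEA responses over evidence-theory focal elements:

```python
def f(x):
    """Invented stand-in 'response' function."""
    return x ** 3 - 2 * x

def df(x):
    """Its exact derivative, used by the first-order expansion."""
    return 3 * x ** 2 - 2

def taylor_bounds(lo, hi, n_sub):
    """Lower/upper response bounds over [lo, hi] from first-order Taylor
    expansions on n_sub equal sub-intervals (the SIP step in miniature)."""
    lower, upper = float("inf"), float("-inf")
    width = (hi - lo) / n_sub
    for k in range(n_sub):
        mid = lo + k * width + width / 2
        half = width / 2
        # First-order extremes on this sub-interval.
        y1 = f(mid) - abs(df(mid)) * half
        y2 = f(mid) + abs(df(mid)) * half
        lower, upper = min(lower, y1), max(upper, y2)
    return lower, upper

lo_est, hi_est = taylor_bounds(0.0, 2.0, 50)
# True range of f on [0, 2] is about [-1.0887, 4.0]; narrowing the
# sub-intervals tightens the first-order bounds toward it.
```

Sub-dividing is what keeps the first-order expansion accurate when the input interval (focal element) is wide, which is exactly the motivation stated in the abstract.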
Extended Nambu models: Their relation to gauge theories
Escobar, C. A.; Urrutia, L. F.
2017-05-01
Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally bring this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss laws constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.
A Reflection on Research, Theory, Evidence-based Practice, and Quality Improvement
Directory of Open Access Journals (Sweden)
Eesa Mohammadi
2016-04-01
While each process is associated with its unique characteristics, overlaps are likely to appear between any two of the processes. For instance, in the EBP process, if one discovers that evidence is inadequate to implement a certain intervention, this highlights the need for research on that specific subject. Similarly, QI may lead to the identification of new questions, which could be used for research purposes. All the discussed processes, as well as their scientific and professional dimensions, are essential to nursing disciplines in healthcare systems.
A Model of PCF in Guarded Type Theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
…of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardó's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
Classical conformality in the Standard Model from Coleman's theory
Kawana, Kiyoharu
2016-01-01
Classical conformality is one of the possible candidates for explaining the gauge hierarchy of the Standard Model. We show that it is naturally obtained from Coleman's theory of baby universes.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
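The linear time-domain (state-space) analysis described in the abstract can be illustrated with a minimal sketch: a two-gene cascade modeled as a pair of coupled linear ODEs and integrated with a simple Euler scheme. This is a generic illustration, not the authors' code; the rate constants `k1`, `k2`, `a1`, `a2` and the function name are arbitrary assumptions.

```python
def simulate_cascade(u, k1=1.0, a1=0.5, k2=1.0, a2=0.5, dt=0.1, steps=200):
    """Euler integration of a linear two-gene cascade:
        dx1/dt = k1*u  - a1*x1   (gene 1 driven by input u)
        dx2/dt = k2*x1 - a2*x2   (gene 2 driven by gene 1)
    Returns the state (x1, x2) after `steps` Euler steps of size `dt`."""
    x1 = x2 = 0.0
    for _ in range(steps):
        x1 += dt * (k1 * u - a1 * x1)
        x2 += dt * (k2 * x1 - a2 * x2)
    return x1, x2

# Step response: constant input u = 1, simulated long enough to settle.
x1, x2 = simulate_cascade(1.0)
```

Setting the derivatives to zero gives the steady-state gains predicted by the transfer-function view: x1* = k1·u/a1 and x2* = k2·x1*/a2, so with these constants a unit input settles near x1 = 2 and x2 = 4.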
Contemporary Cognitive Behavior Therapy: A Review of Theory, History, and Evidence.
Thoma, Nathan; Pilecki, Brian; McKay, Dean
2015-09-01
Cognitive behavior therapy (CBT) has come to be a widely practiced psychotherapy throughout the world. The present article reviews theory, history, and evidence for CBT. It is meant as an effort to summarize the forms and scope of CBT to date for the uninitiated. Elements of CBT such as cognitive therapy, behavior therapy, and so-called "third wave" CBT, such as dialectical behavior therapy (DBT) and acceptance and commitment therapy (ACT) are covered. The evidence for the efficacy of CBT for various disorders is reviewed, including depression, anxiety disorders, personality disorders, eating disorders, substance abuse, schizophrenia, chronic pain, insomnia, and child/adolescent disorders. The relative efficacy of medication and CBT, or their combination, is also briefly considered. Future directions for research and treatment development are proposed.
Kelly, Peter J; Deane, Frank P; Lovett, Megan J
2012-09-01
There is considerable discrepancy between what is considered evidence-based practice (EBP) and what is actually delivered in substance abuse treatment settings. The Theory of Planned Behavior (TpB) is a well-established model that may assist in better understanding clinician's intentions to use EBPs. A total of 106 residential substance abuse workers employed by The Salvation Army participated in the current study. The workers completed an anonymous survey that assessed attitudes toward EBP and examined the constructs within the TpB. A linear regression analysis was used to predict clinicians' intentions to use EBPs. Overall, the model accounted for 41% of the variance in intentions, with attitudes, subjective norms, and perceived behavioral control all significant predictors. The discussion highlights the potential for social reinforcement in the workplace to promote the implementation of EBPs.
A QCD Model Using Generalized Yang-Mills Theory
Institute of Scientific and Technical Information of China (English)
WANG Dian-Fu; SONG He-Shan; KOU Li-Na
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.
Matrix models vs. Seiberg-Witten/Whitham theories
Energy Technology Data Exchange (ETDEWEB)
Chekhov, L.; Mironov, A
2003-01-23
We discuss the relation between matrix models and the Seiberg-Witten type (SW) theories, recently proposed by Dijkgraaf and Vafa. In particular, we prove that the partition function of the Hermitian one-matrix model in the planar (large N) limit coincides with the prepotential of the corresponding SW theory. This partition function is the logarithm of a Whitham {tau}-function. The corresponding Whitham hierarchy is explicitly constructed. The double-point problem is solved.
Bianchi class A models in Sáez-Ballester's theory
Socorro, J.; Espinoza-García, Abraham
2012-08-01
We apply the Sáez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for the Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.
A Dynamic Systems Theory Model of Visual Perception Development
Coté, Carol A.
2015-01-01
This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts with the hierarchical or reductionist models often found in the occupational therapy literature. In this proposed model, vision and ocular motor abilities are not foundational to perception; they are seen…
Mathematical System Theory and System Modeling
1980-01-01
Choosing models related effectively to the questions to be addressed is a central issue in the craft of systems analysis. Since the mathematical description the analyst chooses constrains the types of issues he can deal with, it is important for these models to be selected so as to yield limitations that are acceptable in view of the questions the systems analysis seeks to answer. In this paper, the author gives an overview of the central issues affecting the question of model choice. To ...
Szymanski, David Lawrence
This thesis presents a new approach for classifying Landsat 5 Thematic Mapper (TM) imagery that utilizes digitally represented, non-spectral data in the classification step. A classification algorithm that is based on the Dempster-Shafer theory of evidence is developed and tested for its ability to provide an accurate representation of forest cover on the ground at the Anderson et al (1976) level II. The research focuses on defining an objective, systematic method of gathering and assessing the evidence from digital sources including TM data, the normalized difference vegetation index, soils, slope, aspect, and elevation. The algorithm is implemented using the ESRI ArcView Spatial Analyst software package and the Grid spatial data structure with software coded in both ArcView Avenue and also C. The methodology uses frequency of occurrence information to gather evidence and also introduces measures of evidence quality that quantify the ability of the evidence source to differentiate the Anderson forest cover classes. The measures are derived objectively and empirically and are based on common principles of legal argument. The evidence assessment measures augment the Dempster-Shafer theory and the research will determine if they provide an argument that is mentally sound, credible, and consistent. This research produces a method for identifying, assessing, and combining evidence sources using the Dempster-Shafer theory that results in a classified image containing the Anderson forest cover class. Test results indicate that the new classifier performs with accuracy that is similar to the traditional maximum likelihood approach. However, confusion among the deciduous and mixed classes remains. The utility of the evidence gathering method and also the evidence assessment method is demonstrated and confirmed. The algorithm presents an operational method of using the Dempster-Shafer theory of evidence for forest classification.
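The evidence-combination step at the heart of this approach is Dempster's rule. A minimal Python sketch of the general rule follows; it is an illustration of the textbook combination step, not the thesis's ArcView/Avenue implementation, and the forest-class mass values are made up (the class names echo the deciduous/mixed confusion noted above).

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments, given as dicts mapping frozenset hypotheses to mass."""
    raw, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    # Renormalize by the non-conflicting mass.
    return {h: w / (1.0 - conflict) for h, w in raw.items()}

# Two hypothetical evidence sources over forest classes {deciduous, mixed}:
spectral = {frozenset({"deciduous"}): 0.6,
            frozenset({"deciduous", "mixed"}): 0.4}
terrain  = {frozenset({"deciduous"}): 0.5,
            frozenset({"deciduous", "mixed"}): 0.5}
fused = combine(spectral, terrain)
```

Because every pairwise intersection here is non-empty, the conflict term is zero and the fused mass simply concentrates belief on the singleton class, illustrating how agreeing evidence sources sharpen the classification.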
The Neuman Systems Model Institute: testing middle-range theories.
Gigliotti, Eileen
2003-07-01
The credibility of the Neuman systems model can only be established through the generation and testing of Neuman systems model-derived middle-range theories. However, due to the number and complexity of Neuman systems model concepts/concept interrelations and the diversity of middle-range theory concepts linked to these Neuman systems model concepts by researchers, no explicit middle-range theories have yet been derived from the Neuman systems model. This article describes the development of an organized program for the systematic study of the Neuman systems model. Preliminary work, already accomplished, is detailed, and a tentative plan for the completion of further preliminary work as well as beginning the actual research conduction phase is proposed.
Consumer preference models: fuzzy theory approach
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
Measurement-based load modeling: Theory and application
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China over more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
Modeling in applied sciences a kinetic theory approach
Pulvirenti, Mario
2000-01-01
Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develops the foundations of kinetic models and discusses the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology of the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...
The Family FIRO Model: The Integration of Group Theory and Family Theory.
Colangelo, Nicholas; Doherty, William J.
1988-01-01
Presents the Family Fundamental Interpersonal Relations Orientation (Family FIRO) Model, an integration of small-group theory and family therapy. The model is offered as a framework for organizing family issues. Discusses three fundamental issues of human relatedness and their applicability to group dynamics. (Author/NB)
Modeling acquaintance networks based on balance theory
Directory of Open Access Journals (Sweden)
Vukašinović Vida
2014-09-01
Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
Theory-based Practice: Comparing and Contrasting OT Models
DEFF Research Database (Denmark)
Nielsen, Kristina Tomra; Berg, Brett
2012-01-01
Theory-Based Practice: Comparing and Contrasting OT Models. The workshop will present a critical analysis of the major models of occupational therapy: A Model of Human Occupation, Enabling Occupation II, and the Occupational Therapy Intervention Process Model. Similarities and differences among the models will be discussed, including each model's limitations and unique contributions to the profession. The workshop format will include short lectures and group discussions.
Training evaluation models: Theory and applications
Carbone, V.; MORVILLO, A
2002-01-01
This chapter has the following aims: 1. Compare the various conceptual models for evaluation, identifying their strengths and weaknesses; 2. Define an evaluation model consistent with the aims and constraints of the fit project; 3. Describe, in critical fashion, operative tools for evaluating training which are reliable, flexible and analytical.
Baldrige Theory into Practice: A Generic Model
Arif, Mohammed
2007-01-01
Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer-centric system. The Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull-based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…
Measurement Models for Reasoned Action Theory.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-03-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
Optimal transportation networks models and theory
Bernot, Marc; Morel, Jean-Michel
2009-01-01
The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when applied to the verification of digital evidence.
Scientific evidence of the homeopathic epistemological model
Directory of Open Access Journals (Sweden)
Marcus Zulian Teixeira
2011-03-01
Full Text Available Homeopathy is based on principles and a system of knowledge different from those supporting the conventional biomedical model; this epistemological conflict is the underlying reason why homeopathy is so difficult for present-day scientific reason to accept. To legitimize homeopathy according to the standards of the latter, research must confirm the validity of its basic assumptions: the principle of therapeutic similitude, trials of medicines on healthy individuals, individualized prescriptions, and the use of high dilutions. Correspondingly, basic research must supply experimental data and models to substantiate these principles of homeopathy, whilst clinical trials aim at confirming the efficacy and effectiveness of homeopathy in the treatment of disease. This article discusses the epistemological model of homeopathy, relating its basic assumptions to data resulting from different fields of modern experimental research, and supports its therapeutic use on the outcomes of available clinical trials. In this regard, the principle of individualization of treatment is the sine qua non condition for making therapeutic similitude operative and consequently for homeopathic treatment to exhibit clinical efficacy and effectiveness.
Quantum Field Theory and the Electroweak Standard Model
Boos, E
2015-01-01
The Standard Model is one of the main intellectual achievements of about the last 50 years, a result of many theoretical and experimental studies. In this lecture a brief introduction to the electroweak part of the Standard Model is given. Since the Standard Model is a quantum field theory, some aspects needed for understanding the quantization of abelian and non-abelian gauge theories are also briefly discussed. It is demonstrated how well the electroweak Standard Model works in describing a large variety of precise experimental measurements at lepton and hadron colliders.
Sticker DNA computer model--Part I: Theory
Institute of Scientific and Technical Information of China (English)
XU Jin; DONG Yafei; WEI Xiaopeng
2004-01-01
The sticker model is one of the basic DNA computer models. This model is coded with single-double stranded DNA molecules. It has the advantages that the operations require no strand extension and use no enzymes; what's more, the materials are reusable. Therefore it has aroused the attention and interest of scientists in many fields. In this paper, we systematically analyze the theories and applications of the model, summarize other scientists' contributions in this field, and propose our research results. This paper is the theoretical portion of the sticker model on DNA computers, which includes the introduction of the basic model of sticker computing. Firstly, we systematically introduce the basic theories of classic models of sticker computing; secondly, we discuss the sticker system, an abstract computing model based on the sticker model and formal languages; finally, we extend and perfect the model, presenting two types of models that are more extensive in application and more perfect in theory than past models: one is the so-called k-bit sticker model, the other is the full-message sticker DNA computing model.
Theory of stellar convection II: first stellar models
Pasetto, S; Chiosi, E; Cropper, M; Weiss, A
2015-01-01
We present here the first stellar models on the Hertzsprung-Russell diagram (HRD) in which convection is treated according to the novel scale-free convection theory (SFC theory) of Pasetto et al. (2014). The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few percent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients of the ambient and of the convective element, and energy fluxes that are very similar to those derived from the "calibrated" ML theory for main s...
Measuring Convergence using Dynamic Equilibrium Models: Evidence from Chinese Provinces
DEFF Research Database (Denmark)
Pan, Lei; Posch, Olaf; van der Wel, Michel
We propose a model to study economic convergence in the tradition of neoclassical growth theory. We employ a novel stochastic set-up of the Solow (1956) model with shocks to both capital and labor. Our novel approach identifies the speed of convergence directly from estimating the parameters which...
Mixed models theory and applications with R
Demidenko, Eugene
2013-01-01
Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g
Direct evidence for a Coulombic phase in monopole-suppressed SU(2) lattice gauge theory
Grady, Michael
2013-01-01
Further evidence is presented for the existence of a non-confining phase at weak coupling in SU(2) lattice gauge theory. Using Monte Carlo simulations with the standard Wilson action, gauge-invariant SO(3)-Z2 monopoles, which are strong-coupling lattice artifacts, have been seen to undergo a percolation transition exactly at the phase transition previously seen using Coulomb-gauge methods, with an infinite lattice critical point near $\\beta = 3.2$. The theory with both Z2 vortices and monopoles and SO(3)-Z2 monopoles eliminated is simulated in the strong coupling ($\\beta = 0$) limit on lattices up to $60^4$. Here, as in the high-$\\beta$ phase of the Wilson action theory, finite size scaling shows it spontaneously breaks the remnant symmetry left over after Coulomb gauge fixing. Such a symmetry breaking precludes the potential from having a linear term. The monopole restriction appears to prevent the transition to a confining phase at any $\\beta$. Direct measurement of the instantaneous Coulomb potential shows...
Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C.; Escoffery, Cam T.; Herrmann, Alison K.; Thatcher, Esther; Hartman, Marieke A.; Fernandez, Maria
2017-01-01
Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners’ capacity to adopt and implement a variety of EBIs across diverse practice contexts. PMID:26500080
Mothersill, Omar; Tangney, Noreen; Morris, Derek W; McCarthy, Hazel; Frodl, Thomas; Gill, Michael; Corvin, Aiden; Donohoe, Gary
2017-06-01
Resting-state functional magnetic resonance imaging (rs-fMRI) has repeatedly shown evidence of altered functional connectivity of large-scale networks in schizophrenia. The relationship between these connectivity changes and behaviour (e.g. symptoms, neuropsychological performance) remains unclear. Functional connectivity in 27 patients with schizophrenia or schizoaffective disorder and 25 age- and gender-matched healthy controls was examined using rs-fMRI. Based on seed regions from previous studies, we examined functional connectivity of the default, cognitive control, affective and attention networks. Effects of symptom severity and theory of mind performance on functional connectivity were also examined. Patients showed increased connectivity between key nodes of the default network, including the precuneus and medial prefrontal cortex, compared to controls. Symptom severity and theory of mind performance were both associated with altered connectivity of default regions within the patient group. This study provides further evidence of altered default network connectivity in schizophrenia spectrum patients and reveals an association between altered default connectivity and positive symptom severity. As a novel finding, this study also shows that default connectivity is correlated with and predictive of theory of mind performance. Extending these findings by examining the effects of emerging social cognition treatments on both default connectivity and theory of mind performance is now an important goal for research. Copyright © 2016 Elsevier B.V. All rights reserved.
A Model of Federated Evidence Fusion for Real-time Urban Traffic State Estimation
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
In order to make full use of heterogeneous multi-sensor data to serve urban intelligent transportation systems, a real-time urban traffic state fusion model was proposed, named federated evidence fusion model. The model improves conventional D-S evidence theory in temporal domain, such that it can satisfy the requirement of real-time processing and utilize traffic detection information more efficaciously. The model frame and computational procedures are given. In addition, a generalized reliability weight matrix of evidence is also presented to increase the accuracy of estimation. After that, a simulation test is presented to explain the advantage of the proposed method in comparison with conventional D-S evidence theory. Besides, the validity of the model is proven by the use of the data of loop detectors and GPS probe vehicles collected from an urban link in Shanghai. Results of the experiment show that the proposed approach can well embody and track traffic state at character level in real-time conditions.
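The reliability-weighting idea underlying models like this one can be sketched as a discount-and-average step applied to the sensors' mass functions before combination. The sketch below is a generic illustration, not the paper's reliability weight matrix; the traffic states and the weights are made up.

```python
def weighted_average_bpa(bpas, weights):
    """Average a list of basic probability assignments (dicts mapping
    frozenset hypotheses to mass), weighting each source by its
    normalized reliability. The result is again a mass function
    (masses sum to 1 if each input does)."""
    total = sum(weights)
    avg = {}
    for bpa, w in zip(bpas, weights):
        for hyp, mass in bpa.items():
            avg[hyp] = avg.get(hyp, 0.0) + (w / total) * mass
    return avg

# Hypothetical sources: a loop detector (trusted twice as much) and a
# GPS probe vehicle, reporting over traffic states {free, congested}.
loop = {frozenset({"free"}): 0.9,
        frozenset({"free", "congested"}): 0.1}
gps  = {frozenset({"congested"}): 0.6,
        frozenset({"free", "congested"}): 0.4}
averaged = weighted_average_bpa([loop, gps], weights=[2.0, 1.0])
```

Averaging before combining is the standard way to soften highly conflicting reports: the unreliable source's conflicting mass is diluted rather than driving the combination toward a counterintuitive result.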
Application of Adult Learning Theory in Teaching Evidence Based Medicine to Residents
Directory of Open Access Journals (Sweden)
Alexandra Halalau
2016-10-01
Full Text Available Background and Purpose: Adequate evidence based medicine (EBM) knowledge and skills are required to provide up-to-date, high quality medical care to patients. Unfortunately, achieving these skills and knowledge requires a prolonged learning process and constant exposure to EBM concepts. The adult learning theory holds the assumption that adults learn better in a problem-based and collaborative environment, with more equality between teacher and learner. We aimed to evaluate EBM learning outcomes one year after the implementation of a longitudinal EBM curriculum into an internal medicine residency program. Methods: An EBM curriculum based on the adult learning theory was developed. It included five specific components that addressed all five steps of the EBM cycle (ask, acquire, appraise, apply and assess). A voluntary, anonymous, 27-question survey was distributed to all residents prior to and at the end of the one year of EBM training to self-assess competencies in EBM. Results: Of the 60 eligible residents, 10 pre-course and 13 post-course completed the survey with a response rate of 17% and 22%, respectively. Self-reported conceptual understanding improved for: relative risk 14%, odds ratio 14%, confidence intervals 27%, and number needed to treat 12%. Comfort with meta-analysis appraisal improved, from 30% to 38%. Routine appraisal sheet use increased by 31%. A 17% increase in satisfaction with the EBM curriculum was reported. Conclusions: Our intervention successfully increased residents' comfort with EBM concepts and self-reported application of EBM skills and knowledge about patient care. The results of the implementation of the EBM curriculum were promising and suggested valuable implications for EBM faculty to further collaborate with residents in continuous improvement of the EBM learning experiences for advancing the quality of patient care. Keywords: EVIDENCE BASED MEDICINE, ADULT LEARNING THEORY, RESIDENTS, CURRICULUM
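The effect measures the survey asked about (relative risk, odds ratio, number needed to treat) reduce to simple arithmetic on a 2x2 outcome table. A minimal sketch with made-up trial counts, purely for illustration:

```python
def ebm_metrics(a, b, c, d):
    """Effect measures from a 2x2 table:
    a/b = events/non-events in the treatment group,
    c/d = events/non-events in the control group."""
    risk_treated = a / (a + b)
    risk_control = c / (c + d)
    relative_risk = risk_treated / risk_control
    odds_ratio = (a * d) / (b * c)
    arr = risk_control - risk_treated   # absolute risk reduction
    nnt = 1.0 / arr                     # number needed to treat
    return relative_risk, odds_ratio, nnt

# Hypothetical trial: 10/100 events on treatment vs 20/100 on control.
rr, odds, nnt = ebm_metrics(10, 90, 20, 80)
```

With these counts the treatment halves the risk (RR = 0.5), and the absolute risk reduction of 10 percentage points means ten patients must be treated to prevent one event.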
Solid mechanics theory, modeling, and problems
Bertram, Albrecht
2015-01-01
This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.
Matrix Models, Topological Strings, and Supersymmetric Gauge Theories
Dijkgraaf, R; Dijkgraaf, Robbert; Vafa, Cumrun
2002-01-01
We show that B-model topological strings on local Calabi-Yau threefolds are large N duals of matrix models, which in the planar limit naturally give rise to special geometry. These matrix models directly compute F-terms in an associated N=1 supersymmetric gauge theory, obtained by deforming N=2 theories by a superpotential term that can be directly identified with the potential of the matrix model. Moreover by tuning some of the parameters of the geometry in a double scaling limit we recover (p,q) conformal minimal models coupled to 2d gravity, thereby relating non-critical string theories to type II superstrings on Calabi-Yau backgrounds.
Strategic collaborative model for evidence-based nursing practice.
Olade, Rosaline A
2004-01-01
To describe a model that has been developed to guide nurses and other health professionals in collaborative efforts toward evidence-based nursing practice. A review of literature was conducted using MEDLINE and CINAHL to search for articles on research utilization for evidence-based practice in health care delivery. Empirical studies; reviews; and theoretical, opinion, and information articles were included in the review in order to provide a more comprehensive view of the state of evidence-based nursing internationally. Findings revealed a number of barriers to evidence-based nursing practice, which have persisted over the last two decades, including inadequate knowledge of research among practicing nurses, lack of administrative support for research activities in clinical settings, lack of empowerment of nurses, and lack of needed mentoring from nursing research consultants. Barriers in the areas of nursing education and administrative support appear to be major. A need was identified for a pragmatic model that encourages cooperation and collaboration between educators/researchers in academia and the administrative leaders in the clinical facilities if evidence-based nursing practice is to become the norm. FRAMEWORK OF MODEL: The Tyler Collaborative Model is based on an eclectic approach to planned change for creating evidence-based practice. This model identifies a step-by-step process for change, while allowing for the opportunity to integrate any of the previously available methods of critical appraisal to determine the best evidence for practice in each clinical setting.
Evidence for the Multiverse in the Standard Model and Beyond
Hall, Lawrence J
2007-01-01
In any theory it is unnatural if the observed parameters lie very close to special values that determine the existence of complex structures necessary for observers. A naturalness probability, P, is introduced to numerically evaluate the unnaturalness. If P is small in all known theories, there is an observer naturalness problem. In addition to the well-known case of the cosmological constant, we argue that nuclear stability and electroweak symmetry breaking (EWSB) represent significant observer naturalness problems. The naturalness probability associated with nuclear stability is conservatively estimated as P_nuc < 10^{-(3-2)}, and for simple EWSB theories P_EWSB < 10^{-(2-1)}. This pattern of unnaturalness in three different arenas, cosmology, nuclear physics, and EWSB, provides evidence for the multiverse. In the nuclear case the problem is largely solved even with a flat multiverse distribution, and with nontrivial distributions it is possible to understand both the proximity to neutron stability an...
Modeling workplace bullying using catastrophe theory.
Escartin, J; Ceja, L; Navarro, J; Zapf, D
2013-10-01
Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc, and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
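The study's model comparison can be illustrated with a generic information-criterion check between a simple and a more flexible fit. This is a minimal sketch: the variable names and the cubic stand-in for the cusp catastrophe model are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the study's variables: x for psychosocial
# safety climate, y for bullying perpetration, generated here from a
# purely linear relationship plus noise.
x = rng.normal(size=200)
y = 0.6 * x + rng.normal(scale=0.5, size=200)

def gaussian_aic_bic(y, y_hat, k):
    """AIC and BIC for a least-squares fit with k mean parameters
    (the noise-variance parameter is omitted consistently for both models)."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    log_lik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * k - 2 * log_lik, k * np.log(n) - 2 * log_lik

# Model 1: linear.  Model 2: cubic, a simple flexible stand-in for a
# nonlinear alternative (the paper's actual cusp model is more involved).
lin = np.polyval(np.polyfit(x, y, 1), x)
cub = np.polyval(np.polyfit(x, y, 3), x)

aic_lin, bic_lin = gaussian_aic_bic(y, lin, k=2)
aic_cub, bic_cub = gaussian_aic_bic(y, cub, k=4)
# On truly linear data the information criteria should prefer the simpler model.
```

The penalty terms (2k for AIC, k log n for BIC) formalize the trade-off the authors describe: extra flexibility must buy a genuine improvement in fit, not just absorb noise.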
CDMBE: A Case Description Model Based on Evidence
Directory of Open Access Journals (Sweden)
Jianlin Zhu
2015-01-01
Full Text Available By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively. In order to be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions from the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram, and the results calculated from CDMBE are in line with those from a Bayesian model.
Collectivism and coping: current theories, evidence, and measurements of collective coping.
Kuo, Ben C H
2013-01-01
A burgeoning body of cultural coping research has begun to identify the prevalence and the functional importance of collective coping behaviors among culturally diverse populations in North America and internationally. These emerging findings are highly significant as they evidence culture's impact on the stress-coping process via collectivistic values and orientation. They provide a critical counterpoint to the prevailing Western, individualistic stress and coping paradigm. However, current research and understanding about collective coping appear to be piecemeal and not well integrated. To address this issue, this review attempts to comprehensively survey, summarize, and evaluate existing research related to collective coping and its implications for coping research with culturally diverse populations from multiple domains. Specifically, this paper reviews relevant research and knowledge on collective coping in terms of: (a) operational definitions; (b) theories; (c) empirical evidence based on studies of specific cultural groups and broad cultural values/dimensions; (d) measurements; and (e) implications for future cultural coping research. Overall, collective coping behaviors are conceived as a product of the communal/relational norms and values of a cultural group across studies. They also encompass a wide array of stress responses ranging from value-driven to interpersonally based to culturally conditioned emotional/cognitive to religion- and spirituality-grounded coping strategies. In addition, this review highlights: (a) the relevance and the potential of cultural coping theories to guide future collective coping research; (b) growing evidence for the prominence of collective coping behaviors particularly among Asian nationals, Asian Americans/Canadians and African Americans/Canadians; (c) preference for collective coping behaviors as a function of collectivism and interdependent cultural value and orientation; and (d) six cultural coping scales.
Spatial interaction models facility location using game theory
D'Amato, Egidio; Pardalos, Panos
2017-01-01
Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus an approach to facility location theory through game theoretical tools highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding the facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.
Directory of Open Access Journals (Sweden)
Mauro Adenzato
Full Text Available BACKGROUND: The findings of the few studies that have to date investigated the way in which individuals with Anorexia Nervosa (AN) navigate their social environment are somewhat contradictory. We undertook this study to shed new light on the social-cognitive profile of patients with AN, analysing Theory of Mind and emotional functioning. Starting from previous evidence on the role of the amygdala in the neurobiology of AN and in social cognition, we hypothesise preserved Theory of Mind and impaired emotional functioning in patients with AN. METHODOLOGY: Thirty women diagnosed with AN and thirty-two women matched for education and age were involved in the study. Theory of Mind and emotional functioning were assessed with a set of validated experimental tasks. A measure of perceived social support was also used to test the correlations between this dimension and the social-cognitive profile of AN patients. PRINCIPAL FINDINGS: The performance of patients with AN is significantly worse than that of healthy controls on tasks assessing emotional functioning, whereas patients' performance is comparable to that of healthy controls on the Theory of Mind task. Correlation analyses showed no relationship between scores on any of the social-cognition tasks and either age of onset or duration of illness. A correlation between social support and emotional functioning was found. This latter result seems to suggest a potential role of social support in the treatment and recovery of AN. CONCLUSIONS: The pattern of results followed the experimental hypothesis. These findings may help us better understand the social-cognitive profile of patients with AN and contribute to the development of effective interventions based on the ways in which patients with AN actually perceive their social environment.
Electrorheological fluids modeling and mathematical theory
Růžička, Michael
2000-01-01
This is the first book to present a model, based on rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDS systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and with a nontrivial nonlinear r.h.s. and the first ever results for parabolic systems with a non-standard growth conditions are given for the first time. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.
A catastrophe theory model of the conflict helix, with tests.
Rummel, R J
1987-10-01
Macro social field theory has undergone extensive development and testing since the 1960s. One of these has been the articulation of an appropriate conceptual micro model--called the conflict helix--for understanding the process from conflict to cooperation and vice versa. Conflict and cooperation are viewed as distinct equilibria of forces in a social field; the movement between these equilibria is a jump, energized by a gap between social expectations and power, and triggered by some minor event. Quite independently, there also has been much recent application of catastrophe theory to social behavior, but usually without a clear substantive theory and lacking empirical testing. This paper uses catastrophe theory--namely, the butterfly model--mathematically to structure the conflict helix. The social field framework and helix provide the substantive interpretation for the catastrophe theory; and catastrophe theory provides a suitable mathematical model for the conflict helix. The model is tested on the annual conflict and cooperation between India and Pakistan, 1948 to 1973. The results are generally positive and encouraging.
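For reference, the butterfly catastrophe invoked here has the canonical potential below (standard catastrophe-theory notation, not taken from the paper):

```latex
% Butterfly catastrophe: one behavior variable x, four control parameters a, b, c, d.
V(x) = x^{6} + a\,x^{4} + b\,x^{3} + c\,x^{2} + d\,x ,
\qquad
\frac{\mathrm{d}V}{\mathrm{d}x} = 6x^{5} + 4a\,x^{3} + 3b\,x^{2} + 2c\,x + d = 0 .
```

Equilibria are the real roots of dV/dx = 0; for suitable control values up to three stable equilibria coexist, which is what allows conflict and cooperation to appear as distinct equilibria with sudden jumps between them.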
2013-01-01
Background In 2005, the International Patient Decision Aids Standards Collaboration identified twelve quality dimensions to guide assessment of patient decision aids. One dimension—the delivery of patient decision aids on the Internet—is relevant when the Internet is used to provide some or all components of a patient decision aid. Building on the original background chapter, this paper provides an updated definition for this dimension, outlines a theoretical rationale, describes current evidence, and discusses emerging research areas. Methods An international, multidisciplinary panel of authors examined the relevant theoretical literature and empirical evidence through 2012. Results The updated definition distinguishes Internet-delivery of patient decision aids from online health information and clinical practice guidelines. Theories in cognitive psychology, decision psychology, communication, and education support the value of Internet features for providing interactive information and deliberative support. Dissemination and implementation theories support Internet-delivery for providing the right information (rapidly updated), to the right person (tailored), at the right time (the appropriate point in the decision making process). Additional efforts are needed to integrate the theoretical rationale and empirical evidence from health technology perspectives, such as consumer health informatics, user experience design, and human-computer interaction. Despite Internet usage ranging from 74% to 85% in developed countries and 80% of users searching for health information, it is unknown how many individuals specifically seek patient decision aids on the Internet. Among the 86 randomized controlled trials in the 2011 Cochrane Collaboration’s review of patient decision aids, only four studies focused on Internet-delivery. Given the limited number of published studies, this paper particularly focused on identifying gaps in the empirical evidence base.
Directory of Open Access Journals (Sweden)
Nasser Al-Horais
2012-11-01
Full Text Available The Minimalist Program is a major line of inquiry that has been developing inside Generative Grammar since the early nineties, when it was proposed by Chomsky (1993, 1995). At that time, Chomsky (1998: 5) presented the Minimalist Program as a program, not as a theory, but today the Minimalist Program lays out a very specific view of the basis of syntactic grammar that, when compared to other formalisms, is often taken to look very much like a theory. The prime concern of this paper, however, is to provide a comprehensive and accessible introduction to the minimalist approach to the theory of grammar. In this regard, the paper discusses some new ideas articulated recently by Chomsky that have led to several fundamental improvements in syntactic theory, such as changing the function of movement and the Extended Projection Principle (EPP) feature, or proposing new theories such as Phases and Feature Inheritance. In order to evidence the significance of these fundamental improvements, the paper provides a minimalist analysis to account for agreement and word-order asymmetry in Standard Arabic. This fresh minimalist account meets the challenges to the basic tenets of syntactic theory.
Theory of mind impairment in right hemisphere damage: A review of the evidence.
Weed, Ethan
2008-01-01
Among the hypothesized causes of communication impairments in people with damage to the right cerebral hemisphere (RHD) is an underlying impairment in Theory of Mind (ToM) (the ability to make inferences about other peoples' mental states). In this review, evidence is considered for a ToM impairment in adults with RHD by approaching the issue from two directions. First, indirect evidence for impairment of ToM is reviewed by looking at studies on the effects of RHD on the comprehension of indirect requests, that is, requests in which the speaker's wishes are not explicitly stated but must be inferred by the listener. Second, studies that directly investigate the effects of RHD on performance on tasks intended to tap ToM are reviewed. On the basis of the papers reviewed here, it is concluded that although people with RHD do show impairments on a variety of tasks that are thought to involve ToM cognition, evidence for a specific ToM impairment is still inconclusive. It is recommended that future studies take care to distinguish individual differences in participants' linguistic production and lesion location, that more care is taken to control for task difficulty, and that well-controlled studies are combined with more naturalistic, ecologically valid tasks.
Theory and Model for Martensitic Transformations
DEFF Research Database (Denmark)
Lindgård, Per-Anker; Mouritsen, Ole G.
1986-01-01
Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry...
Markov models of aging: theory and practice.
Steinsaltz, David; Mohan, Gurjinder; Kolb, Martin
2012-10-01
We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the lifecourse? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging. Copyright © 2012 Elsevier Inc. All rights reserved.
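A finite-state Markov chain of the kind surveyed here can be sketched in a few lines. The states and per-step transition probabilities below are hypothetical, chosen only to show how a survival curve falls out of repeated matrix-vector products:

```python
import numpy as np

# A minimal illustrative Markov model of aging; states are
# 0 = healthy, 1 = frail, 2 = dead (absorbing). Rates are made up.
P = np.array([
    [0.90, 0.08, 0.02],  # from healthy
    [0.00, 0.80, 0.20],  # from frail (no recovery in this toy model)
    [0.00, 0.00, 1.00],  # dead is absorbing
])

def survival_curve(P, steps):
    """P(alive at each step), starting from 'healthy'."""
    dist = np.array([1.0, 0.0, 0.0])  # initial state distribution
    surv = []
    for _ in range(steps):
        dist = dist @ P               # one step of the chain
        surv.append(1.0 - dist[2])    # probability of not being dead
    return np.array(surv)

s = survival_curve(P, 50)
# The curve is monotone non-increasing and decays geometrically, governed
# by the largest eigenvalue of the transient sub-matrix (0.9 here).
```

The "partitioning of randomness" discussed in the abstract corresponds to choices about whether these transition probabilities are fixed, vary between individuals (frailty), or fluctuate over the life course.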
Study on Strand Space Model Theory
Institute of Scientific and Technical Information of China (English)
JI QingGuang(季庆光); QING SiHan(卿斯汉); ZHOU YongBin(周永彬); FENG DengGuo(冯登国)
2003-01-01
The growing interest in the application of formal methods to cryptographic protocol analysis has led to the development of a number of different ways of analyzing protocols. In this paper, it is strictly proved that if for any strand there exists at least one bundle containing it, then an entity authentication protocol is secure in the strand space model (SSM) with some small extensions. Unfortunately, the results of attack scenarios demonstrate that this protocol, as well as the Yahalom protocol and its modification, are de facto insecure. By analyzing the reasons for the failure of formal inference in the strand space model, some deficiencies in the original SSM are pointed out. In order to break through these limitations of the analytic capability of SSM, the generalized strand space model (GSSM) induced by some protocol is proposed. In this model, some new classes of strands, such as oracle strands and high-order oracle strands, are developed, and some notions are formalized strictly in GSSM, such as protocol attacks, valid protocol runs, and successful protocol runs. GSSM can then be used to further analyze the entity authentication protocol. This analysis sheds light on why this protocol would be vulnerable, while it illustrates that GSSM can not only prove a security protocol correct, but can also be efficiently used to construct protocol attacks. It is also pointed out that using another protocol to attack some given protocol is essentially the same as the case of using most of the protocol itself.
Modeling Environmental Concern: Theory and Application.
Hackett, Paul M. W.
1993-01-01
Human concern for the quality and protection of the natural environment forms the basis of successful environmental conservation activities. Considers environmental concern research and proposes a model that incorporates the multiple dimensions of research through which environmental concern may be evaluated. (MDH)
L∞-algebra models and higher Chern-Simons theories
Ritter, Patricia; Sämann, Christian
2016-10-01
We continue our study of zero-dimensional field theories in which the fields take values in a strong homotopy Lie algebra. In the first part, we review in detail how higher Chern-Simons theories arise in the AKSZ-formalism. These theories form a universal starting point for the construction of L∞-algebra models. We then show how to describe superconformal field theories and how to perform dimensional reductions in this context. In the second part, we demonstrate that Nambu-Poisson and multisymplectic manifolds are closely related via their Heisenberg algebras. As a byproduct of our discussion, we find central Lie p-algebra extensions of 𝔰𝔬(p + 2). Finally, we study a number of L∞-algebra models which are physically interesting and which exhibit quantized multisymplectic manifolds as vacuum solutions.
Rock mechanics modeling based on soft granulation theory
Owladeghaffari, H
2008-01-01
This paper describes the application of information granulation theory to the design of rock engineering flowcharts. Firstly, an overall flowchart based on information granulation theory is highlighted. Information granulation theory, in crisp (non-fuzzy) or fuzzy form, can take engineering experience (especially fuzzy, incomplete, or superfluous information) or engineering judgment into account at each step of the design procedure, while suitable modeling instruments are employed. In this manner, and as an extension of soft modeling instruments, crisp and fuzzy granules are obtained from monitored data sets using three combinations of Self-Organizing Maps (SOM), Neuro-Fuzzy Inference Systems (NFIS), and Rough Set Theory (RST). The core of our algorithms is the balancing of crisp (rough, or non-fuzzy) granules and sub-fuzzy granules within non-fuzzy information (initial granulation) over the open-close iterations. Using different criteria on balancing best granules (information pock...
Observations, Modeling and Theory of Debris Disks
Matthews, Brenda C; Wyatt, Mark C; Bryden, Geoff; Eiroa, Carlos
2014-01-01
Main sequence stars, like the Sun, are often found to be orbited by circumstellar material that can be categorized into two groups, planets and debris. The latter is made up of asteroids and comets, as well as the dust and gas derived from them, which makes debris disks observable in thermal emission or scattered light. These disks may persist over Gyrs through steady-state evolution and/or may also experience sporadic stirring and major collisional breakups, rendering them atypically bright for brief periods of time. Most interestingly, they provide direct evidence that the physical processes (whatever they may be) that act to build large oligarchs from micron-sized dust grains in protoplanetary disks have been successful in a given system, at least to the extent of building up a significant planetesimal population comparable to that seen in the Solar System's asteroid and Kuiper belts. Such systems are prime candidates to host even larger planetary bodies as well. The recent growth in interest in debris disks...
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g., cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems-biology computational modules.
From integrable models to gauge theories Festschrift Matinyan (Sergei G)
Gurzadyan, V G
2002-01-01
This collection of twenty articles in honor of the noted physicist and mentor Sergei Matinyan focuses on topics that are of fundamental importance to high-energy physics, field theory and cosmology. The topics range from integrable quantum field theories, three-dimensional Ising models, parton models and tests of the Standard Model, to black holes in loop quantum gravity, the cosmological constant and magnetic fields in cosmology. A pedagogical essay by Lev Okun concentrates on the problem of fundamental units. The articles have been written by well-known experts and are addressed to graduate
Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang
2014-12-01
Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
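The brute-force Monte Carlo reference described above can be sketched for a toy conjugate-Gaussian case where the exact evidence is known in closed form. All values here are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: prior theta ~ N(0, 1), likelihood y | theta ~ N(theta, sigma^2).
sigma = 0.5
y_obs = 0.3

def log_likelihood(theta):
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * (y_obs - theta)**2 / sigma**2

# Brute-force Monte Carlo: BME = E_prior[ p(y | theta) ],
# estimated by averaging the likelihood over samples from the prior.
theta_samples = rng.standard_normal(200_000)
bme_mc = np.mean(np.exp(log_likelihood(theta_samples)))

# Exact marginal for this conjugate case: y ~ N(0, 1 + sigma^2).
bme_exact = np.exp(-0.5 * y_obs**2 / (1 + sigma**2)) / np.sqrt(2 * np.pi * (1 + sigma**2))
# bme_mc should agree with bme_exact to well under 1% at this sample size.
```

The same estimator is what becomes unfeasible for expensive models: each prior sample costs one full model run, and high-dimensional parameter spaces need many samples, which is why the ICs are attractive despite their bias.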
Evidence Feed Forward Hidden Markov Model: A New Type Of Hidden Markov Model
Directory of Open Access Journals (Sweden)
Michael Del Rose
2011-01-01
Full Text Available The ability to predict the intentions of people based solely on their visual actions is a skill performed only by humans and animals. The intelligence of current computer algorithms has not reached this level of complexity, but several research efforts are working towards it. With the number of classification algorithms available, it is hard to determine which algorithm works best for a particular situation. In the classification of visual human-intent data, Hidden Markov Models (HMMs) and their variants are leading candidates. The inability of HMMs to provide a probability for observation-to-observation linkages is a major shortcoming of this classification technique. When a person visually identifies the action of another person, they monitor patterns in the observations. By estimating the next observation, people can summarize the actions and thus determine, with good accuracy, the intention of the person performing the action. These visual cues and linkages are important in creating intelligent algorithms for determining human actions from visual observations. The Evidence Feed Forward Hidden Markov Model is a newly developed algorithm that provides observation-to-observation linkages. The following research addresses the theory behind Evidence Feed Forward HMMs; provides mathematical proofs that learning these parameters optimizes the likelihood of observations under an Evidence Feed Forward HMM, which is important in any computational intelligence algorithm; and gives comparative examples with standard HMMs in the classification of both visual action data and measurement data, thus providing a strong base for Evidence Feed Forward HMMs in the classification of many types of problems.
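A standard HMM scores an observation sequence with the forward algorithm; the Evidence Feed Forward variant extends this baseline by adding observation-to-observation probabilities. As a point of comparison, a minimal sketch of the standard forward algorithm (the example matrices are illustrative, not from the paper):

```python
import numpy as np

def forward_log_prob(obs, pi, A, B):
    """Standard HMM forward algorithm with rescaling: returns log P(obs).
    pi: initial state probabilities (N,); A: transition matrix (N, N);
    B: emission matrix (N, M); obs: sequence of symbol indices.
    Note: a plain HMM has no observation-to-observation links; the
    Evidence Feed Forward variant described above adds exactly those."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_p = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()          # rescale each step to avoid underflow
        log_p += np.log(c)
        alpha = alpha / c
    return log_p

# Illustrative two-state, two-symbol model.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
print(forward_log_prob([0, 1, 1], pi, A, B))
```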
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
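The second-stage EVT step fits a generalized Pareto distribution to losses exceeding a threshold and inverts the tail fit into a VaR estimate. A hedged sketch of the standard peaks-over-threshold calculation, using a plain empirical quantile in place of the paper's wavelet-based threshold and simulated heavy-tailed returns in place of ISE/BUX data:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
# Simulated daily log returns (heavy-tailed stand-in for real market data).
returns = rng.standard_t(df=4, size=2000) * 0.01

def var_pot(returns, p=0.99, threshold_q=0.95):
    """Peaks-over-threshold value-at-risk on losses.
    The paper derives the threshold from wavelets; here a plain
    empirical quantile is used instead (an assumption of this sketch)."""
    losses = -returns
    u = np.quantile(losses, threshold_q)
    exceed = losses[losses > u] - u
    # Fit GPD shape xi and scale beta to the exceedances (location fixed at 0).
    xi, _, beta = genpareto.fit(exceed, floc=0.0)
    n, nu = losses.size, exceed.size
    # Standard EVT tail-quantile formula for VaR at level p.
    return u + (beta / xi) * (((n / nu) * (1.0 - p)) ** (-xi) - 1.0)

print(var_pot(returns))
```

Backtesting such a model "by number of violations" then amounts to counting how often realized losses exceed the predicted VaR.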
Spectral and scattering theory for translation invariant models in quantum field theory
DEFF Research Database (Denmark)
Rasmussen, Morten Grud
This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar … of the essential energy-momentum spectrum and either the two-body threshold, if there are no excited isolated mass shells, or the one-body threshold pertaining to the first excited isolated mass shell, if it exists. For the model restricted to the vacuum and one-particle sectors, the absence of singular continuous … spectrum is proven to hold globally, and scattering theory of the model is studied using time-dependent methods, of which the main result is asymptotic completeness. …
Chiral field theories as models for hadron substructure
Energy Technology Data Exchange (ETDEWEB)
Kahana, S.H.
1987-03-01
A model for the nucleon as soliton of quarks interacting with classical meson fields is described. The theory, based on the linear sigma model, is renormalizable and capable of including sea quarks straightforwardly. Application to nuclear matter is made in a Wigner-Seitz approximation.
Pilot evaluation in TENCompetence: a theory-driven model
J. Schoonenboom; H. Sligte; A. Moghnieh; M. Specht; C. Glahn; K. Stefanov
2008-01-01
This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged infrastructure …
Reciprocal Ontological Models Show Indeterminism Comparable to Quantum Theory
Bandyopadhyay, Somshubhro; Banik, Manik; Bhattacharya, Some Sankar; Ghosh, Sibasish; Kar, Guruprasad; Mukherjee, Amit; Roy, Arup
2016-12-01
We show that within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator for which it is impossible, even in principle, to construct a reciprocal deterministic model.
Kinetic theories for spin models for cooperative relaxation dynamics
Pitts, Steven Jerome
The facilitated kinetic Ising models with asymmetric spin flip constraints introduced by Jäckle and co-workers [J. Jäckle, S. Eisinger, Z. Phys. B 84, 115 (1991); J. Reiter, F. Mauch, J. Jäckle, Physica A 184, 458 (1992)] exhibit complex relaxation behavior in their associated spin density time correlation functions. This includes the growth of relaxation times over many orders of magnitude when the thermodynamic control parameter is varied, and, in some cases, ergodic-nonergodic transitions. Relaxation equations for the time dependence of the spin density autocorrelation function for a set of these models are developed that relate this autocorrelation function to the irreducible memory function of Kawasaki [K. Kawasaki, Physica A 215, 61 (1995)] using a novel diagrammatic series approach. It is shown that the irreducible memory function in a theory of the relaxation of an autocorrelation function in a Markov model with detailed balance plays the same role as the part of the memory function approximated by a polynomial function of the autocorrelation function with positive coefficients in schematic simple mode coupling theories for supercooled liquids [W. Götze, in Liquids, Freezing and the Glass Transition, D. Levesque, J. P. Hansen, J. Zinn-Justin eds., 287 (North Holland, New York, 1991)]. Sets of diagrams in the series for the irreducible memory function are summed which lead to approximations of this type. The behavior of these approximations is compared with known results from previous analytical calculations and from numerical simulations. For the simplest one-dimensional model, relaxation equations that are closely related to schematic extended mode coupling theories [W. Götze, ibid.] are also derived using the diagrammatic series. Comparison of the results of these approximate theories with simulation data shows that these theories improve significantly on the results of the theories of the simple schematic mode coupling theory type. The potential …
The origin of discrete symmetries in F-theory models
2015-01-01
While non-abelian groups are undoubtedly the cornerstone of Grand Unified Theories (GUTs), phenomenology shows that the role of abelian and discrete symmetries is equally important in model building. The latter are the appropriate tool to suppress undesired proton decay operators and various flavour violating interactions, to generate a hierarchical fermion mass spectrum, etc. In F-theory, GUT symmetries are linked to the singularities of the elliptically fibred K3 manifolds; they are of ADE ...
Lenses on reading an introduction to theories and models
Tracey, Diane H
2017-01-01
Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a
Fuzzy Stochastic Optimization Theory, Models and Applications
Wang, Shuming
2012-01-01
Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies. The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...
Nonlinear model predictive control theory and algorithms
Grüne, Lars
2017-01-01
This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
Computational hemodynamics theory, modelling and applications
Tu, Jiyuan; Wong, Kelvin Kian Loong
2015-01-01
This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...
Theory and Model of Agricultural Insurance Subsidy
Institute of Scientific and Technical Information of China (English)
Wan Kailiang; Long Wenjun
2007-01-01
The issue of agricultural insurance subsidy is discussed in this paper, with the aim of making such subsidies more rational and scientifically grounded. The paper starts with the connection between agricultural insurance and financial subsidy. Implementing subsidized agricultural insurance is necessary and crucial given the poor operational performance of agricultural insurance markets, especially in developing countries. But the subsidy should be provided more rationally, because financial subsidies have many negative effects. A model of competitive insurance markets developed by Ahsan et al. (1982) and a farmers' decision model are developed to solve for the optimal subsidy rate, and an equation is derived to calculate it. A quantitative subsidy rate is not given here, because the calculation depends on restrictive conditions that are often absent in developing countries. The government should therefore provide some subsidy for ex ante research and preparation, in order to obtain scientifically grounded probabilities and premium rates.
Bayesian model evidence for order selection and correlation testing.
Johnston, Leigh A; Mareels, Iven M Y; Egan, Gary F
2011-01-01
Model selection is a critical component of data analysis procedures, and is particularly difficult for small numbers of observations such as is typical of functional MRI datasets. In this paper we derive two Bayesian evidence-based model selection procedures that exploit the existence of an analytic form for the linear Gaussian model class. Firstly, an evidence information criterion is proposed as a model order selection procedure for auto-regressive models, outperforming the commonly employed Akaike and Bayesian information criteria in simulated data. Secondly, an evidence-based method for testing change in linear correlation between datasets is proposed, which is demonstrated to outperform both the traditional statistical test of the null hypothesis of no correlation change and the likelihood ratio test.
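Information-criterion-based order selection of the kind this paper benchmarks against can be illustrated for auto-regressive models with the familiar AIC/BIC pair (the paper's evidence information criterion itself is not reproduced here); the least-squares fit and the simulated AR(2) data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: y_t = 0.6 y_{t-1} - 0.3 y_{t-2} + e_t.
n = 500
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

def ar_ic(y, p):
    """Fit AR(p) by least squares and return (AIC, BIC) under a Gaussian
    likelihood. Textbook criteria, not the paper's evidence criterion."""
    n = y.size
    Y = y[p:]
    X = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ coef
    m = Y.size
    sigma2 = resid @ resid / m
    log_l = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * log_l + 2 * p, -2.0 * log_l + p * np.log(m)

best = min(range(1, 7), key=lambda p: ar_ic(y, p)[1])
print(best)  # BIC usually recovers the true order, 2
```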
Network Data: Statistical Theory and New Models
2016-02-17
Using AERONET DRAGON Campaign Data, IEEE Transactions on Geoscience and Remote Sensing (08 2015). doi: 10.1109/TGRS.2015.2395722 Geoffrey … are not viable, i.e. the fruit fly dies after the knock-out of the gene. Further examination of the ftz-stained embryos indicates that the lack of … our approach for spatial gene expression analysis for early-stage fruit fly embryos; we are in the process of extending it to model later-stage gene expression.
Debris Discs: Modeling/theory review
Thébault, P.
2012-03-01
An impressive amount of photometric, spectroscopic and imaging observations of circumstellar debris discs has been accumulated over the past three decades, revealing that they come in all shapes and flavours, from young post-planet-formation systems like Beta Pic to much older ones like Vega. What we see in these systems are small grains, which are probably only the tip of the iceberg of a vast population of larger (undetectable) collisionally-eroding bodies, leftover from the planet-formation process. Understanding the spatial structure, physical properties, origin and evolution of this dust is of crucial importance, as it is our only window into what is going on in these systems. Dust can be used as a tracer of the distribution of its collisional progenitors and of possible hidden massive perturbers, and it also allows valuable information to be derived about the disc's total mass, size distribution and chemical composition. I will review the state of the art in numerical models of debris discs, and present some important issues that are explored by current modelling efforts: planet-disc interactions, the link between cold (i.e. Herschel-observed) and hot discs, the effect of binarity, transient versus continuous processes, etc. I will finally present some possible perspectives for the development of future models.
A class of effective field theory models of cosmic acceleration
Energy Technology Data Exchange (ETDEWEB)
Bloomfield, Jolyon K.; Flanagan, Éanna É., E-mail: jkb84@cornell.edu, E-mail: eef3@cornell.edu [Center for Radiophysics and Space Research, Cornell University, Space Science Building, Ithaca, NY 14853 (United States)
2012-10-01
We explore a class of effective field theory models of cosmic acceleration involving a metric and a single scalar field. These models can be obtained by starting with a set of ultralight pseudo-Nambu-Goldstone bosons whose couplings to matter satisfy the weak equivalence principle, assuming that one boson is lighter than all the others, and integrating out the heavier fields. The result is a quintessence model with matter coupling, together with a series of correction terms in the action in a covariant derivative expansion, with specific scalings for the coefficients. After eliminating higher derivative terms and exploiting the field redefinition freedom, we show that the resulting theory contains nine independent free functions of the scalar field when truncated at four derivatives. This is in contrast to the four free functions found in similar theories of single-field inflation, where matter is not present. We discuss several different representations of the theory that can be obtained using the field redefinition freedom. For perturbations to the quintessence field today on subhorizon lengthscales larger than the Compton wavelength of the heavy fields, the theory is weakly coupled and natural in the sense of 't Hooft. The theory admits a regime where the perturbations become modestly nonlinear, but very strong nonlinearities lie outside its domain of validity.
Nanofluid Drop Evaporation: Experiment, Theory, and Modeling
Gerken, William James
Nanofluids, stable colloidal suspensions of nanoparticles in a base fluid, have potential applications in the heat transfer, combustion and propulsion, manufacturing, and medical fields. Experiments were conducted to determine the evaporation rate of room temperature, millimeter-sized pendant drops of ethanol laden with varying amounts (0-3% by weight) of 40-60 nm aluminum nanoparticles (nAl). Time-resolved high-resolution drop images were collected for the determination of early-time evaporation rate (D^2/D_0^2 > 0.75), shown to exhibit D-square law behavior, and surface tension. Results show an asymptotic decrease in pendant drop evaporation rate with increasing nAl loading. The evaporation rate decreases by approximately 15% at around 1% to 3% nAl loading relative to the evaporation rate of pure ethanol. Surface tension was observed to be unaffected by nAl loading up to 3% by weight. A model was developed to describe the evaporation of the nanofluid pendant drops based on D-square law analysis for the gas domain and a description of the reduction in liquid fraction available for evaporation due to nanoparticle agglomerate packing near the evaporating drop surface. Model predictions are in relatively good agreement with experiment, within a few percent of measured nanofluid pendant drop evaporation rate. The evaporation of pinned nanofluid sessile drops was also considered via modeling. It was found that the same mechanism for nanofluid evaporation rate reduction used to explain pendant drops could be used for sessile drops. That mechanism is a reduction in evaporation rate due to a reduction in available ethanol for evaporation at the drop surface caused by the packing of nanoparticle agglomerates near the drop surface. Comparisons of the present modeling predictions with sessile drop evaporation rate measurements reported for nAl/ethanol nanofluids by Sefiane and Bennacer [11] are in fairly good agreement. Portions of this abstract previously appeared as: W. J
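The D-square law mentioned above gives a back-of-the-envelope lifetime estimate: the squared diameter decreases linearly in time. The rate constants below are assumed for illustration only; they are not the measured values from the experiments.

```python
# D-square law: D(t)^2 = D0^2 - K * t, so the drop lifetime is D0^2 / K.
# All numbers below are assumed for illustration, not measured values.
D0 = 1.0e-3              # initial diameter, m (a millimeter-sized drop)
K_pure = 7.0e-10         # evaporation-rate constant, pure ethanol, m^2/s
K_nano = 0.85 * K_pure   # ~15% lower rate at 1-3% nAl loading (per abstract)

t_life_pure = D0 ** 2 / K_pure
t_life_nano = D0 ** 2 / K_nano
print(t_life_pure, t_life_nano)  # the nanofluid drop lives ~18% longer
```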
Invisible 'glue' bosons in model field theory
Shirokov, M I
2002-01-01
Fermionic psi(x) and bosonic phi(x) fields with vector coupling are discussed. It is shown that the 'clothed' bosons of the model do not interact with fermions or among themselves. If phi(x) does not interact with the other fields of particle physics, then the 'clothed' bosons have the properties of cosmological 'dark matter': they cannot be detected in Earth's laboratories. This cause of boson invisibility contrasts with the origin of the unobservability of isolated gluons in QCD, which is explained by colour confinement.
Inhibitory processes and cognitive flexibility: evidence for the theory of attentional inertia
Directory of Open Access Journals (Sweden)
Isabel Introzzi
2015-07-01
Full Text Available The aim of this study was to discriminate the differential contribution of different inhibitory processes (perceptual, cognitive and behavioral inhibition) to the switching cost effect associated with alternating cognitive tasks. A correlational design was used. Several experimental paradigms (e.g., stop signal, visual search, Sternberg's experimental paradigm and the Simon paradigm) were adapted and included in a computerized program called TAC (Introzzi & Canet Juric, 2014) for the assessment of the different cognitive processes. The final sample consisted of 45 adults (18-50 years). Perceptual and behavioral inhibition show moderate and low correlations with attentional cost, cognitive inhibition shows no relation with flexibility, and only perceptual inhibition predicts switching cost effects, suggesting that different inhibitory processes contribute differentially to switch cost. This can be interpreted as evidence for the main argument of Attentional Inertia Theory, which postulates that inhibition plays an essential role in the ability to flexibly switch between tasks and/or representations.
Chemolli, Emanuela; Gagné, Marylène
2014-06-01
Self-determination theory (SDT) proposes a multidimensional conceptualization of motivation in which the different regulations are said to fall along a continuum of self-determination. The continuum has been used as a basis for using a relative autonomy index as a means to create motivational scores. Rasch analysis was used to verify the continuum structure of the Multidimensional Work Motivation Scale and of the Academic Motivation Scale. We discuss the concept of continuum against SDT's conceptualization of motivation and argue against the use of the relative autonomy index on the grounds that evidence for a continuum structure underlying the regulations is weak and because the index is statistically problematic. We suggest exploiting the full richness of SDT's multidimensional conceptualization of motivation through the use of alternative scoring methods when investigating motivational dynamics across life domains.
Using Both a Probabilistic Evolutionary Graph and the Evidence Theory for Color Scene Analysis
Directory of Open Access Journals (Sweden)
Nassim Ammour
2008-01-01
Full Text Available In this research, we introduce a new color image segmentation algorithm. The color scene analysis method is based on the growth of a probabilistic evolutionary graph. The strategy consists in growing an evolutionary graph that represents the scene elements in an unsupervised segmented image. The development of the graph is based on computing the probabilities that the most recently built region belongs to the existing classes. The spatial composition matrix of the areas in each class is then derived. A spatial delimitation map of the regions is established by a new method of contour localization and refinement. Finally, the segmented image is obtained by classifying the pixels in the conflict regions using the Dempster-Shafer evidence theory. The effectiveness of the method is demonstrated on real images.
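The final classification step relies on Dempster-Shafer combination of evidence from multiple sources. A minimal sketch of Dempster's rule over set-valued focal elements; the two-class example is illustrative and not taken from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets. Conflicting mass (empty intersections) is
    discarded and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sources assigning belief over hypothetical pixel classes:
m1 = {frozenset({'sky'}): 0.7, frozenset({'sky', 'road'}): 0.3}
m2 = {frozenset({'sky'}): 0.6, frozenset({'road'}): 0.4}
print(dempster_combine(m1, m2))
```

With these masses the conflict is 0.28, and after renormalization the combined belief concentrates on 'sky', which is how conflicting region evidence gets resolved pixel by pixel.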
Pan, Dongbo; Lu, Xi; Liu, Juan; Deng, Yong
2014-01-01
Decision-making, as a way to discover preferences through ranking, has been used in various fields. However, owing to the uncertainty in group decision-making, how to rank alternatives from incomplete pairwise comparisons has become an open issue. In this paper, an improved method is proposed for ranking alternatives from incomplete pairwise comparisons using Dempster-Shafer evidence theory and information entropy. Firstly, taking the probability assignment of the chosen preference into consideration, the comparison of alternatives by each group is addressed. Experiments verified that the information entropy of the data itself can objectively determine a different weight for each group's choices. Numerical examples in group decision-making environments are used to test the effectiveness of the proposed method. Moreover, the divergence of the ranking mechanism is analyzed briefly in the conclusion section. PMID:25250393
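The entropy-based weighting idea can be sketched as follows. The monotone transform from entropy to weight below is a common convention assumed for illustration; the paper's exact scheme may differ.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete probability assignment."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_weights(groups):
    """Weight each group's probability assignment by how decisive it is:
    lower entropy -> larger weight (a common convention, assumed here)."""
    h = np.array([shannon_entropy(g) for g in groups])
    w = 1.0 / (1.0 + h)          # simple monotone-decreasing transform
    return w / w.sum()

# Three hypothetical groups comparing two alternatives:
groups = [[0.9, 0.1], [0.5, 0.5], [0.7, 0.3]]
print(entropy_weights(groups))
```

The confident group ([0.9, 0.1]) receives the largest weight, while the maximally uncertain group ([0.5, 0.5]) receives the smallest.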
Directory of Open Access Journals (Sweden)
Francesca Zazzara
2013-11-01
Full Text Available On June 3rd, 2013, in Turin, Italy, the Swiss industrialist Schmidheiny was sentenced to 18 years' imprisonment for intentional disaster, for 3,000 asbestos-linked tumours in Italian workers at the cement multinational Eternit. The indiscriminate use of asbestos, however, continues worldwide. Although many studies have shown that asbestos is associated with an increased risk of mortality and morbidity, denial theories were spread over time, showing how the logic of profit governs asbestos production. We examined, first, the history of the epidemiological evidence of asbestos-related risks and, second, the main sources of exposure in Italy and in the world: occupational, non-occupational, and post-disaster exposure (as occurred after the L'Aquila earthquake in April 2009). The themes of inequality and social justice are all the more pressing in the fight against asbestos and its lobbies.
Evidence for the multiple hits genetic theory for inherited language impairment: a case study
Directory of Open Access Journals (Sweden)
Tracy M Centanni
2015-08-01
Full Text Available Communication disorders have complex genetic origins, with constellations of relevant gene markers that vary across individuals. Some genetic variants are present in healthy individuals as well as in those affected by developmental disorders. Growing evidence suggests that some variants may increase susceptibility to these disorders in the presence of other pathogenic gene mutations. In the current study, we describe eight children with specific language impairment; four of these children had a copy number variant in one such potential susceptibility region, on chromosome 15. Three of these four children also had variants in other genes previously associated with language impairment. Our data support the theory that 15q11.2 is a susceptibility region for developmental disorders, specifically language impairment.
Flipped classroom model for learning evidence-based medicine.
Rucker, Sydney Y; Ozdogan, Zulfukar; Al Achkar, Morhaf
2017-01-01
Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice.
Hannah, David R.; Venkatachary, Ranga
2010-01-01
In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…
Bouncing Model in Brane World Theory
Maier, Rodrigo; Soares, Ivano Damião
2013-01-01
We examine the nonlinear dynamics of a closed Friedmann-Robertson-Walker universe in the framework of Brane World formalism with a timelike extra dimension. In this scenario, the Friedmann equations contain additional terms arising from the bulk-brane interaction which provide a concrete model for nonsingular bounces in the early phase of the Universe. We construct a nonsingular cosmological scenario sourced with dust, radiation and a cosmological constant. The structure of the phase space shows a nonsingular orbit with two accelerated phases, separated by a smooth transition corresponding to a decelerated expansion. Given observational parameters we connect such phases to a primordial accelerated phase, a soft transition to Friedmann (where the classical regime is valid), and a graceful exit to a de Sitter accelerated phase.
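The bounce mechanism can be summarized by the brane-modified Friedmann equation. The form below is a schematic sketch based on the standard brane-world literature, not a formula quoted from the paper; sigma denotes the brane tension, and sign conventions vary between papers.

```latex
% Schematic brane-modified Friedmann equation (timelike extra dimension):
H^2 + \frac{k}{a^2} = \frac{8\pi G}{3}\,\rho\Bigl(1 - \frac{\rho}{2\sigma}\Bigr) + \frac{\Lambda}{3}
% The quadratic correction -\rho^2 / (2\sigma) dominates at high density,
% drives H to zero, and thereby produces the nonsingular bounce.
```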
An Abstraction Theory for Qualitative Models of Biological Systems
Banks, Richard; 10.4204/EPTCS.40.3
2010-01-01
Multi-valued network (MVN) models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and, in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.
Summary of papers presented in the Theory and Modelling session
Directory of Open Access Journals (Sweden)
Lin-Liu Y.R.
2012-09-01
Full Text Available A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17. One Theory and Modelling paper each was included in the ITER ECRH and ECE sessions. Three papers were in the area of nonlinear physics, discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive; three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fluctuations and related phenomena. In this summary, we briefly describe the highlights of these contributions. Finally, the three papers concerning modelling of various aspects of ECE are reported in the ECE session.
Modeling Reusable and Interoperable Faceted Browsing Systems with Category Theory.
Harris, Daniel R
2015-08-01
Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and lightweight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse.
Foundations of reusable and interoperable facet models using category theory.
Harris, Daniel R
2016-10-01
Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse.
M-theory model-building and proton stability
Energy Technology Data Exchange (ETDEWEB)
Ellis, J. [CERN, Geneva (Switzerland). Theory Div.; Faraggi, A.E. [Florida Univ., Gainesville, FL (United States). Inst. for Fundamental Theory; Nanopoulos, D.V. [Texas A and M Univ., College Station, TX (United States)]|[Houston Advanced Research Center, The Woodlands, TX (United States). Astroparticle Physics Group]|[Academy of Athens (Greece). Div. of Natural Sciences
1997-09-01
The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
Adapting evidence-based interventions using a common theory, practices, and principles.
Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D
2014-01-01
Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.
Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping.
Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E
2014-04-01
Fear arousal--vividly showing people the negative health consequences of life-endangering behaviors--is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning adoption and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. © 2013 The Authors. International Journal of Psychology published by John Wiley & Sons Ltd on behalf of International Union of Psychological Science.
Leibovich, Tali; Ansari, Daniel
2016-03-01
How do numerical symbols, such as number words, acquire semantic meaning? This question, also referred to as the "symbol-grounding problem," is a central problem in the field of numerical cognition. Present theories suggest that symbols acquire their meaning by being mapped onto an approximate system for the nonsymbolic representation of number (Approximate Number System or ANS). In the present literature review, we first asked to what extent current behavioural and neuroimaging data support this theory, and second, to what extent the ANS, upon which symbolic numbers are assumed to be grounded, is numerical in nature. We conclude that (a) current evidence that has examined the association between the ANS and number symbols does not support the notion that number symbols are grounded in the ANS and (b) given the strong correlation between numerosity and continuous variables in nonsymbolic number processing tasks, it is next to impossible to measure the pure association between symbolic and nonsymbolic numerosity. Instead, it is clear that significant cognitive control resources are required to disambiguate numerical from continuous variables during nonsymbolic number processing. Thus, if there exists any mapping between the ANS and symbolic number, then this process of association must be mediated by cognitive control. Taken together, we suggest that studying the role of both cognitive control and continuous variables in numerosity comparison tasks will provide a more complete picture of the symbol-grounding problem.
A new non-specificity measure in evidence theory based on belief intervals
Directory of Open Access Journals (Sweden)
Yang Yi
2016-06-01
Full Text Available In the theory of belief functions, the measure of uncertainty is an important concept used for representing some types of uncertainty incorporated in bodies of evidence, such as discord and non-specificity. For the non-specificity part, some traditional measures take the Hartley measure in classical set theory as a reference; others use a simple heuristic function combining mass assignments with the cardinality of focal elements. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision; it therefore has a more intuitive physical meaning. It can be proved that our new measure can be rewritten in a general form for non-specificity, and it is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
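A belief-interval non-specificity score of this general flavour can be sketched as follows. This is one plausible instantiation for illustration, not necessarily the authors' exact definition: from a mass function m over a frame of discernment, compute belief Bel and plausibility Pl for each singleton and total the lengths of the intervals [Bel, Pl].

```python
# Belief intervals from a Dempster-Shafer mass function, and a
# non-specificity score as the total length of singleton intervals.

FRAME = frozenset({"a", "b"})

def bel(m, A):
    """Belief: total mass of focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    """Plausibility: total mass of focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

def interval_nonspecificity(m):
    """Sum of belief-interval lengths Pl({x}) - Bel({x}) over singletons."""
    return sum(pl(m, frozenset({x})) - bel(m, frozenset({x})) for x in FRAME)

# Vacuous evidence (all mass on the whole frame): maximal imprecision.
vacuous = {FRAME: 1.0}
# Bayesian evidence (all mass on singletons): no imprecision at all.
bayesian = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}

assert interval_nonspecificity(vacuous) == 2.0
assert interval_nonspecificity(bayesian) == 0.0
```

The two extreme cases behave as a non-specificity measure should: total ignorance maximises the interval lengths, while a fully Bayesian mass function collapses every interval to a point.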
A new non-specificity measure in evidence theory based on belief intervals
Institute of Scientific and Technical Information of China (English)
Yang Yi; Han Deqiang; Jean Dezert
2016-01-01
In the theory of belief functions, the measure of uncertainty is an important concept used for representing some types of uncertainty incorporated in bodies of evidence, such as discord and non-specificity. For the non-specificity part, some traditional measures take the Hartley measure in classical set theory as a reference; others use a simple heuristic function combining mass assignments with the cardinality of focal elements. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision; it therefore has a more intuitive physical meaning. It can be proved that our new measure can be rewritten in a general form for non-specificity, and it is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
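The Bayesian learning mechanism at the heart of this account can be sketched with a toy example. The scenario and numbers below are hypothetical illustrations of the general scheme, not taken from the studies reviewed: a learner weighs two causal hypotheses about a toy machine and updates beliefs from observed activations via Bayes' rule.

```python
# Bayesian comparison of two causal hypotheses:
#   H1: "the red block makes the machine go"
#   H2: "the red block does not; something else does"

def posterior(prior, likelihoods, data):
    """P(H | data) proportional to P(data | H) * P(H), normalised over hypotheses."""
    unnorm = {h: prior[h] * likelihoods[h](data) for h in prior}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

prior = {"H1": 0.5, "H2": 0.5}
likelihoods = {
    # probability of each observed activation / non-activation with the red block
    "H1": lambda d: 0.9 ** d["red_on"] * 0.1 ** d["red_off"],
    "H2": lambda d: 0.1 ** d["red_on"] * 0.9 ** d["red_off"],
}
data = {"red_on": 3, "red_off": 0}   # machine activated 3 times with the red block

post = posterior(prior, likelihoods, data)
assert abs(sum(post.values()) - 1.0) < 1e-12
assert post["H1"] > 0.95   # evidence strongly favours the red-block hypothesis
```

Three consistent observations are already enough to shift an even prior almost entirely onto H1, which mirrors the rapid causal inferences young children show in the studies cited.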
Direct evidence for a Coulombic phase in monopole-suppressed SU(2) lattice gauge theory
Energy Technology Data Exchange (ETDEWEB)
Grady, Michael, E-mail: grady@fredonia.edu
2013-11-21
Further evidence is presented for the existence of a non-confining phase at weak coupling in SU(2) lattice gauge theory. Using Monte Carlo simulations with the standard Wilson action, gauge-invariant SO(3)–Z2 monopoles, which are strong-coupling lattice artifacts, have been seen to undergo a percolation transition exactly at the phase transition previously seen using Coulomb gauge methods, with an infinite lattice critical point near β=3.2. The theory with both Z2 vortices and monopoles and SO(3)–Z2 monopoles eliminated is simulated in the strong-coupling (β=0) limit on lattices up to 60^4. Here, as in the high-β phase of the Wilson-action theory, finite size scaling shows it spontaneously breaks the remnant symmetry left over after Coulomb gauge fixing. Such a symmetry breaking precludes the potential from having a linear term. The monopole restriction appears to prevent the transition to a confining phase at any β. Direct measurement of the instantaneous Coulomb potential shows a Coulombic form with moderately running coupling possibly approaching an infrared fixed point of α∼1.4. The Coulomb potential is measured to 50 lattice spacings and 2 fm. A short-distance fit to the 2-loop perturbative potential is used to set the scale. High precision at such long distances is made possible through the use of open boundary conditions, which was previously found to cut random and systematic errors of the Coulomb gauge fixing procedure dramatically. The Coulomb potential agrees with the gauge-invariant interquark potential measured with smeared Wilson loops on periodic lattices as far as the latter can be practically measured with similar statistics data.
Direct evidence for a Coulombic phase in monopole-suppressed SU(2) lattice gauge theory
Grady, Michael
2013-11-01
Further evidence is presented for the existence of a non-confining phase at weak coupling in SU(2) lattice gauge theory. Using Monte Carlo simulations with the standard Wilson action, gauge-invariant SO(3)-Z2 monopoles, which are strong-coupling lattice artifacts, have been seen to undergo a percolation transition exactly at the phase transition previously seen using Coulomb gauge methods, with an infinite lattice critical point near β=3.2. The theory with both Z2 vortices and monopoles and SO(3)-Z2 monopoles eliminated is simulated in the strong-coupling (β=0) limit on lattices up to 604. Here, as in the high-β phase of the Wilson-action theory, finite size scaling shows it spontaneously breaks the remnant symmetry left over after Coulomb gauge fixing. Such a symmetry breaking precludes the potential from having a linear term. The monopole restriction appears to prevent the transition to a confining phase at any β. Direct measurement of the instantaneous Coulomb potential shows a Coulombic form with moderately running coupling possibly approaching an infrared fixed point of α˜1.4. The Coulomb potential is measured to 50 lattice spacings and 2 fm. A short-distance fit to the 2-loop perturbative potential is used to set the scale. High precision at such long distances is made possible through the use of open boundary conditions, which was previously found to cut random and systematic errors of the Coulomb gauge fixing procedure dramatically. The Coulomb potential agrees with the gauge-invariant interquark potential measured with smeared Wilson loops on periodic lattices as far as the latter can be practically measured with similar statistics data.
Magnetized cosmological models in bimetric theory of gravitation
Indian Academy of Sciences (India)
S D Katore; R S Rane
2006-08-01
Bianchi type-III magnetized cosmological models, in which the field of gravitation is sourced by either a perfect fluid or cosmic strings, are investigated in Rosen's [1] bimetric theory of gravitation. To obtain a determinate solution, a supplementary condition between the metric potentials, involving a constant, is used. We have assumed different equations of state for the cosmic string [2] to complete the solution of the model. Some physical and geometrical properties of the exhibited model are discussed and studied.
Hydrodynamics Research on Amphibious Vehicle Systems:Modeling Theory
Institute of Scientific and Technical Information of China (English)
JU Nai-jun
2006-01-01
For completing the hydrodynamics software development and the engineering application research on amphibious vehicle systems, the hydrodynamic modeling theory of such systems is elaborated. It includes building up dynamic system models of amphibious vehicle motion on water, gun tracking-aiming-firing, bullet hit and armored target checking, and gunner operating control, as well as a time-domain simulation model for random sea waves.
Dynamics in Nonlocal Cosmological Models Derived from String Field Theory
Joukovskaya, Liudmila
2007-01-01
A general class of nonlocal cosmological models is considered. A new method for solving nonlocal Friedmann equations is proposed, and solutions of the Friedmann equations with a nonlocal operator are presented. The cosmological properties of these solutions are discussed. Special attention is given to the $p$-adic cosmological model, in which we have obtained a nonsingular bouncing solution, and to the string field theory tachyon model, in which we have obtained the full solution of the nonlocal Friedmann equations with $w=...
A Model of Resurgence Based on Behavioral Momentum Theory
Shahan, Timothy A; Sweeney, Mary M.
2011-01-01
Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforc...
Population changes: contemporary models and theories.
Sauvy, A
1981-01-01
In many developing countries rapid population growth has promoted a renewed interest in the study of the effect of population growth on economic development. This research takes either the macroeconomic viewpoint, where the nation is the framework, or the microeconomic perspective, where the family is the framework. For expository purposes, the macroeconomic viewpoint is assumed, and an example of such an investment is presented. Attention is directed to the following: a simplified model--housing; the lessons learned from experience (primitive populations, Spain in the 17th and 18th centuries, comparing development in Spain and Italy, 19th century Western Europe, and underdeveloped countries); the positive factors of population growth; and the concept of the optimal rate of growth. Housing is the typical investment that an individual makes. Hence, the housing per person (roughly 1/3 of the necessary amount of housing per family) is taken as a unit, and the calculations are made using averages. The conclusion is that growth is expensive. A population decrease might be advantageous, for this decrease would enable the entire population to benefit from past capital accumulation. It is also believed, "a priori," that population growth is more expensive for a developed than for a developing country. This belief may be attributable to the fact that the capital per person tends to be high in the developed countries. Any further increase in the population requires additional capital investments, driving this ratio even higher. Yet, investment is not the only factor inhibiting economic development. The literature describes factors regarding population growth, yet this writer prefers to emphasize 2 other factors that have been the subject of less study: a growing population's ease of adaptation and the human factor--behavior. A growing population adapts better to new conditions than does a stationary or declining population, and contrary to "a priori" belief, a growing
Perturbation theory for string sigma models
Bianchi, Lorenzo
2016-01-01
In this thesis we investigate quantum aspects of the Green-Schwarz superstring in various AdS backgrounds relevant for the AdS/CFT correspondence, providing several examples of perturbative computations in the corresponding integrable sigma-models. We start by reviewing in detail the supercoset construction of the superstring action in $AdS_5 \times S^5$, pointing out the limits of this procedure for $AdS_4$ and $AdS_3$ backgrounds. For the $AdS_4 \times CP^3$ case we give a thorough derivation of an alternative action, based on the double-dimensional reduction of eleven-dimensional super-membranes. We then consider the expansion about the BMN vacuum and the S-matrix for the scattering of worldsheet excitations in the decompactification limit. To evaluate its elements efficiently we describe a unitarity-based method resulting in a very compact formula yielding the cut-constructible part of any one-loop two-dimensional S-matrix. In the second part of this review we analyze the superstring action on $AdS_4 \ti...
Lenses on Reading An Introduction to Theories and Models
Tracey, Diane H
2012-01-01
This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition
Effective Field Theory and the No-Core Shell Model
Directory of Open Access Journals (Sweden)
Stetcu I.
2010-04-01
Full Text Available In a finite model space suitable for many-body calculations via the no-core shell model (NCSM), I illustrate the direct application of effective field theory (EFT) principles to solving the many-body Schrödinger equation. Two different avenues for fixing the low-energy constants naturally arising in an EFT approach are discussed. I review results for both nuclear and trapped atomic systems, using effective theories formally similar, albeit describing different underlying physics.
Consistent constraints on the Standard Model Effective Field Theory
Berthier, Laure
2015-01-01
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low-energy precision data. We fit one hundred observables. We develop a theory-error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level or beyond, unless the cut-off scale is assumed to be large, $\Lambda \gtrsim \, 3 \, {\rm TeV}$. We incorporate theoretical errors more consistently in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an $\rm S,T$ analysis is modified by the theory errors we include, as an illustrative example.
Theory of compressive modeling and simulation
Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith
2013-05-01
Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limited computing power and long turnaround times. We suggest a third approach, which we call (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. MFE-based CM&S can generalize LCNN to 2nd order as a nonlinear augmented LCNN. For example, during sunset we can avoid the reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon: with CM&S we can use a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with a filter into 2 vector components (8~10μm and 10~12μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then we up-shifted consistently, according to the de-mixed sources map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, which suffers less blur from scattering by dusty smoke and enjoys the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in spatial Rayleigh resolution, but gains two orders of magnitude in reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, it can reduce unnecessary measurements and their associated cost and computing, in the sense of super-saving CS: measuring one and getting one's neighborhood free.
A genetic program theory of aging using an RNA population model.
Wang, Xiufang; Ma, Zhihong; Cheng, Jianjun; Lv, Zhanjun
2014-01-01
Aging is a common characteristic of multicellular eukaryotes. Copious hypotheses have been proposed to explain the mechanisms of aging, but no single theory is generally acceptable. In this article, we refine the RNA population gene activating model (Lv et al., 2003) based on existing reports as well as on our own latest findings. We propose the RNA population model as a genetic theory of aging. The new model can also be applied to differentiation and tumorigenesis and could explain the biological significance of non-coding DNA, RNA, and repetitive sequence DNA. We provide evidence from the literature as well as from our own findings for the roles of repetitive sequences in gene activation. In addition, we predict several phenomena related to aging and differentiation based on this model.
Theories beyond the standard model, one year before the LHC
Dimopoulos, Savas
2006-04-01
Next year the Large Hadron Collider at CERN will begin what may well be a new golden era of particle physics. I will discuss three theories that will be tested at the LHC. I will begin with the supersymmetric standard model, proposed with Howard Georgi in 1981. This theory made a precise quantitative prediction, the unification of couplings, which was experimentally confirmed in 1991 by experiments at CERN and SLAC. This established it as the leading theory for physics beyond the standard model. Its main prediction, the existence of supersymmetric particles, will be tested at the Large Hadron Collider. I will next review theories with large new dimensions, proposed with Nima Arkani-Hamed and Gia Dvali in 1998. This links the weakness of gravity to the presence of sub-millimeter-size dimensions, which are presently searched for in experiments looking for deviations from Newton's law at short distances. In this framework quantum gravity, string theory, and black holes may be experimentally investigated at the Large Hadron Collider. I will end with the recent proposal of split supersymmetry with Nima Arkani-Hamed. This theory is motivated by the possible existence of an enormous number of ground states in the fundamental theory, as suggested by the cosmological constant problem and recent developments in string theory and cosmology. It can be tested at the Large Hadron Collider and, if confirmed, it will lend support to the idea that our universe and its laws are not unique and that there is an enormous variety of universes, each with its own distinct physical laws.
Higher-Rank Supersymmetric Models and Topological Field Theory
Kawai, T; Yang, S K; Kawai, Toshiya; Uchino, Taku; Yang, Sung-Kil
1993-01-01
In the first part of this paper we investigate the operator aspect of higher-rank supersymmetric model which is introduced as a Lie theoretic extension of the $N=2$ minimal model with the simplest case $su(2)$ corresponding to the $N=2$ minimal model. In particular we identify the analogs of chirality conditions and chiral ring. In the second part we construct a class of topological conformal field theories starting with this higher-rank supersymmetric model. We show the BRST-exactness of the twisted stress-energy tensor, find out physical observables and discuss how to make their correlation functions. It is emphasized that in the case of $su(2)$ the topological field theory constructed in this paper is distinct from the one obtained by twisting the $N=2$ minimal model through the usual procedure.
Bridging emotion theory and neurobiology through dynamic systems modeling.
Lewis, Marc D
2005-04-01
Efforts to bridge emotion theory with neurobiology can be facilitated by dynamic systems (DS) modeling. DS principles stipulate higher-order wholes emerging from lower-order constituents through bidirectional causal processes--offering a common language for psychological and neurobiological models. After identifying some limitations of mainstream emotion theory, I apply DS principles to emotion-cognition relations. I then present a psychological model based on this reconceptualization, identifying trigger, self-amplification, and self-stabilization phases of emotion-appraisal states, leading to consolidating traits. The article goes on to describe neural structures and functions involved in appraisal and emotion, as well as DS mechanisms of integration by which they interact. These mechanisms include nested feedback interactions, global effects of neuromodulation, vertical integration, action-monitoring, and synaptic plasticity, and they are modeled in terms of both functional integration and temporal synchronization. I end by elaborating the psychological model of emotion-appraisal states with reference to neural processes.
A model of PCF in guarded type theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
An introduction to queueing theory modeling and analysis in applications
Bhat, U Narayan
2015-01-01
This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...
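The simplest model treated in any such introduction is the M/M/1 queue (Poisson arrivals, exponential service, one server). A minimal sketch of its closed-form steady-state measures, using the standard textbook formulas rather than anything specific to this book:

```python
def mm1_metrics(lam, mu):
    """Steady-state measures for the M/M/1 queue with arrival rate lam and
    service rate mu; the queue is stable only when lam < mu."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                 # server utilization
    L = rho / (1 - rho)            # mean number of customers in the system
    W = 1 / (mu - lam)             # mean time in system (Little's law: L = lam * W)
    Lq = rho**2 / (1 - rho)        # mean queue length, excluding the one in service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq}

# Illustrative rates: 4 arrivals per hour against a service rate of 5 per hour.
m = mm1_metrics(lam=4.0, mu=5.0)
print(m)
```

Note how sharply congestion grows as utilization approaches one: the same formulas with lam=4.9 give a mean of 49 customers in the system.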
Cosmological Model Based on Gauge Theory of Gravity
Institute of Scientific and Technical Information of China (English)
WU Ning
2005-01-01
A cosmological model based on gauge theory of gravity is proposed in this paper. Combining the cosmological principle with the field equation of the gravitational gauge field, dynamical equations for the scale factor R(t) of our universe can be obtained. This set of equations has three different solutions. A prediction of the present model is that, if the energy density of the universe is not zero and the universe is expanding, the universe must be space-flat, and the total energy density must equal the critical density ρc of the universe. For the space-flat case, this model gives the same solution as the Friedmann model. In other words, though they have different dynamics of gravitational interactions, general relativity and gauge theory of gravity give the same cosmological model.
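For reference, the flatness statement can be phrased with the standard Friedmann relation and critical density (textbook general-relativity formulas, not the paper's gauge-field equations):

```latex
\[
  \left(\frac{\dot R}{R}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k}{R^{2}},
  \qquad
  \rho_{c} \equiv \frac{3H^{2}}{8\pi G},
  \quad H = \frac{\dot R}{R},
\]
% so spatial flatness (k = 0) is equivalent to \rho = \rho_c.
```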
Structural properties of effective potential model by liquid state theories
Institute of Scientific and Technical Information of China (English)
Xiang Yuan-Tao; Andrej Jamnik; Yang Kai-Wei
2010-01-01
This paper investigates the structural properties of a model fluid dictated by an effective inter-particle oscillatory potential, using grand canonical ensemble Monte Carlo (GCEMC) simulation and classical liquid state theories. The chosen oscillatory potential incorporates basic interaction terms used in modeling various complex fluids composed of mesoscopic particles dispersed in a solvent bath. The studied structural properties include the radial distribution function in bulk and the inhomogeneous density distribution profile under the influence of several external fields. The GCEMC results are employed to test the validity of two recently proposed theoretical approaches in the field of atomic fluids: one is an Ornstein-Zernike integral equation theory approach; the other is a third order + second order perturbation density functional theory. Satisfactory agreement between the GCEMC simulation and the pure theories indicates the ready adaptability of the atomic fluid theories to effective model potentials in complex fluids, and classifies the proposed theoretical approaches as convenient tools for the investigation of complex fluids under the single-component macro-fluid approximation.
Twisted gauge theories in 3D Walker-Wang models
Wang, Zitao
2016-01-01
Three dimensional gauge theories with a discrete gauge group can emerge from spin models as a gapped topological phase with fractional point excitations (gauge charges) and loop excitations (gauge fluxes). It is known that 3D gauge theories can be "twisted", in the sense that the gauge flux loops can have nontrivial braiding statistics among themselves, and such twisted gauge theories are realized in models discovered by Dijkgraaf and Witten. A different framework for systematically constructing three dimensional topological phases was proposed by Walker and Wang, and a series of examples have been studied. Can the Walker-Wang construction be used to realize the topological order in twisted gauge theories? This is not immediately clear because the Walker-Wang construction is based on a loop condensation picture while the Dijkgraaf-Witten theory is based on a membrane condensation picture. In this paper, we show that the answer to this question is yes, by presenting an explicit construction of the Walker-Wang models wh...
Theory, modeling and simulation of superconducting qubits
Energy Technology Data Exchange (ETDEWEB)
Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU
2011-01-13
We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, the resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for the phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty, and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier of frequency about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high
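The quoted fidelity and relaxation-time figures are mutually consistent under a simple back-of-envelope assumption (mine, not the authors' model): if energy relaxation dominates the error budget, the single-shot fidelity is bounded roughly by F ~ exp(-t_meas / T1).

```python
import math

# Rough consistency check under the assumption F ~ exp(-t_meas / T1);
# the numbers below are the ones quoted in the abstract.
t_meas = 100e-9      # 100 ns measurement window
T1 = 0.5e-3          # 0.5 ms relaxation time

F = math.exp(-t_meas / T1)
print(round(F, 5))   # on the order of the 0.9999 fidelity quoted above
```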
Traffic Games: Modeling Freeway Traffic with Game Theory
Cortés-Berrueco, Luis E.; Gershenson, Carlos; Stephens, Christopher R.
2016-01-01
We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers’ interactions. PMID:27855176
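A driver-driver interaction of the kind described can be sketched as a two-player game. The payoff numbers below are illustrative (a chicken-style structure, not the paper's actual payoff matrix): two drivers approaching the same gap each choose to Yield or Insist, and mutual insisting forces both to brake.

```python
# (row_action, col_action) -> (row_payoff, col_payoff); higher = better mobility.
PAYOFF = {
    ("yield", "yield"): (3, 3),    # both keep moving, modestly
    ("yield", "insist"): (1, 4),   # yielder loses time, insister gains the gap
    ("insist", "yield"): (4, 1),
    ("insist", "insist"): (0, 0),  # mutual blocking: both brake
}

def best_response(opponent_action):
    """Row driver's best reply to a fixed opponent action."""
    return max(("yield", "insist"),
               key=lambda a: PAYOFF[(a, opponent_action)][0])

print(best_response("yield"))    # insisting exploits a yielding driver
print(best_response("insist"))   # yielding avoids mutual blocking
```

The opposed best responses are what make mobility an emergent coordination problem rather than a fixed optimum, which is the phenomenon the simulations track.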
Directory of Open Access Journals (Sweden)
Reeves Scott
2006-02-01
Full Text Available Abstract The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, which undoubtedly involves the diluting effect of "common sense." Thirdly, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence is available showing the superiority of theory-based interventions, the use of theory should not be used as a basis for assessing the value of implementation studies by research funders, ethics committees, editors or policy decision makers.
Directory of Open Access Journals (Sweden)
Thomas Ruth
2011-05-01
Full Text Available Abstract Background Psychological models predict behaviour in a wide range of settings. The aim of this study was to explore the usefulness of a range of psychological models to predict the health professional behaviour 'referral for lumbar spine x-ray in patients presenting with low back pain' by UK primary care physicians. Methods Psychological measures were collected by postal questionnaire survey from a random sample of primary care physicians in Scotland and north England. The outcome measures were clinical behaviour (referral rates for lumbar spine x-rays), behavioural simulation (lumbar spine x-ray referral decisions based upon scenarios), and behavioural intention (general intention to refer for lumbar spine x-rays in patients with low back pain). Explanatory variables were the constructs within the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-Regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Weinstein's Stage Model termed the Precaution Adoption Process (PAP), and knowledge. For each of the outcome measures, a generalised linear model was used to examine the predictive value of each theory individually. Linear regression was used for the intention and simulation outcomes, and negative binomial regression was used for the behaviour outcome. Following this 'theory level' analysis, a 'cross-theoretical construct' analysis was conducted to investigate the combined predictive value of all individual constructs across theories. Results Constructs from TPB, SCT, CS-SRM, and OLT predicted behaviour; however, the theoretical models did not fit the data well. When predicting behavioural simulation, the proportion of variance explained by individual theories was TPB 11.6%, SCT 12.1%, OLT 8.1%, and II 1.5%, and in the cross-theory analysis constructs from TPB, CS-SRM and II explained 16.5% of the variance in simulated behaviours. When predicting intention, the
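The "proportion of variance explained" reported for each theory is the R² of a regression of the outcome on that theory's constructs. A minimal sketch on synthetic stand-in data (the construct names and effect sizes are invented for illustration, not taken from the survey):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: three construct scores predicting a behavioural-simulation
# score for 100 physicians, with modest true effects plus noise.
n = 100
X = rng.normal(size=(n, 3))
y = 0.4 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(scale=1.0, size=n)

def variance_explained(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid.var() / y.var()

r2 = variance_explained(X, y)
print(round(r2, 3))   # a modest R^2, of the same order as those reported above
```

The study's count-valued behaviour outcome (referral rates) instead used negative binomial regression, for which a dedicated GLM library would be the natural tool.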
Bridging Research, Practice, and Policy: The "Evidence Academy" Conference Model.
Rohweder, Catherine L; Laping, Jane L; Diehl, Sandra J; Moore, Alexis A; Isler, Malika Roman; Scott, Jennifer Elissa; Enga, Zoe Kaori; Black, Molly C; Dave, Gaurav; Corbie-Smith, Giselle; Melvin, Cathy L
2016-01-01
Innovative models to facilitate more rapid uptake of research findings into practice are urgently needed. Community members who engage in research can accelerate this process by acting as adoption agents. We implemented an Evidence Academy conference model bringing together researchers, health care professionals, advocates, and policy makers across North Carolina to discuss high-impact, life-saving study results. The overall goal is to develop dissemination and implementation strategies for translating evidence into practice and policy. Each 1-day, single-theme, regional meeting focuses on a leading community-identified health priority. The model capitalizes on the power of diverse local networks to encourage broad, common awareness of new research findings. Furthermore, it emphasizes critical reflection and active group discussion on how to incorporate new evidence within and across organizations, health care systems, and communities. During the concluding session, participants are asked to articulate action plans relevant to their individual interests, work setting, or area of expertise.
Matrix Factorizations for Local F-Theory Models
Omer, Harun
2016-01-01
I use matrix factorizations to describe branes at simple singularities as they appear in elliptic fibrations of local F-theory models. Each node of the corresponding Dynkin diagrams of the ADE-type singularities is associated with one indecomposable matrix factorization which can be deformed into one or more factorizations of lower rank. Branes with internal fluxes arise naturally as bound states of the indecomposable factorizations. Describing branes in such a way avoids the need to resolve singularities and encodes information which is neglected in conventional F-theory treatments. This paper aims to show how branes arising in local F-theory models around simple singularities can be described in this framework.
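The underlying definition is compact enough to state (standard definition; the A₁ example is illustrative, not taken from the paper): a matrix factorization of a potential W is a pair of matrices (P, Q) whose products reproduce W times the identity.

```latex
\[
  PQ = QP = W \cdot \mathrm{Id},
  \qquad
  \text{e.g. for } W = x^{2}\ (A_{1}):\quad
  P = Q = \begin{pmatrix} x \end{pmatrix}.
\]
```

Deformations of such factorizations, and bound states of the indecomposable ones, are what encode the branes and fluxes in the construction described above.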
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
Excellence in Physics Education Award: Modeling Theory for Physics Instruction
Hestenes, David
2014-03-01
All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.
AIC, BIC, Bayesian evidence against the interacting dark energy model
Szydłowski, Marek; Krawiec, Adam; Kurek, Aleksandra; Kamionka, Michał
2015-01-01
Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative—the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam's razor we are inclined to reject this model.
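The AIC/BIC comparison works from the maximized log-likelihood of each model: AIC = 2k − 2 ln L̂ and BIC = k ln n − 2 ln L̂, and a positive ΔBIC for the more complex model counts against it. A minimal sketch with invented numbers (the log-likelihoods and sample size below are illustrative, not the paper's fits):

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: complexity penalty of 2 per parameter."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: the penalty grows with sample size n."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits: a simpler model vs. an extension whose extra parameter
# buys only a marginal gain in likelihood.
n = 580
ll_simple, k_simple = -272.5, 2
ll_extended, k_extended = -272.1, 3

delta_bic = bic(ll_extended, k_extended, n) - bic(ll_simple, k_simple, n)
print(delta_bic)   # positive: the data do not justify the extra parameter
```

Because the BIC penalty scales with ln n while the likelihood gain here is tiny, BIC penalizes the extended model harder than AIC does, mirroring the pattern reported above for the interacting ΛCDM model.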
Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions
Directory of Open Access Journals (Sweden)
Camaren Peter
2014-03-01
Full Text Available In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability, and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex systems' properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity theory based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems' properties that are required to model transitions to sustainability.
Vocational intervention based on the Model of Human Occupation: a review of evidence.
Lee, Jenica; Kielhofner, Gary
2010-09-01
Work is a growing concern in disability and rehabilitation fields. Specific evidence related to occupational therapy in the area of vocational rehabilitation is somewhat limited. With increased demands for occupation-focused, evidence-based, and theory-informed practice, this review aims to use clinically relevant questions to organize and synthesize evidence regarding work-related interventions specifically related to an occupation-focused theory, the Model of Human Occupation. A total of 45 published works related to both the MOHO and vocational issues were identified and included in the review. The review demonstrates that there is a range of evidence that supports the use of the MOHO and its tools as a basis for work-based clinical interventions. Evidence supports the conclusion that MOHO-based work assessments have good psychometric properties and are useful in evaluating vocational potential and needs. MOHO-based work programs have been shown to have a positive impact on vocational outcomes for a range of clients.
Educational Program Evaluation Model, From the Perspective of the New Theories
Directory of Open Access Journals (Sweden)
Soleiman Ahmady
2014-05-01
Full Text Available Introduction: This study focuses on common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education regarding current evaluation models and theories. We included all study designs. We found 810 articles related to our topic, and finally included 63 with the full text available. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. Common components of a logic model are: situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.
Rational choice theory and Becker's model of random behavior
Directory of Open Access Journals (Sweden)
Krstić Miloš
2015-01-01
Full Text Available According to rational choice theory, rational consumers tend to maximize utility under given budget constraints. This will be achieved if they choose a combination of goods that can satisfy their needs and provide the maximum level of utility. Gary Becker, on the other hand, imagines irrational consumers who choose a bundle on the budget line. As irrational consumers have an equal probability of choosing any bundle on the budget line, on average, we expect that they will pick the bundle lying at the midpoint of the line. The results of research in which artificial Becker's agents choose among more than two commodities and more than two budget/price situations show that the percentage of agents whose behavior violates rational choice theory is small. Adding some factors to Becker's model of random behavior, experimenters can minimize these minor violations. Therefore, rational choice theory is unfalsifiable. The results of our research have confirmed this theory. In addition, in the paper we discussed the explanatory value of rational choice theory in specific circumstances (positive substitution effect), and we concluded that the explanatory value of rational choice theory is significantly reduced in such cases.
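Becker's random-behavior result is easy to reproduce by simulation. A minimal sketch with illustrative prices and income (not parameters from the paper): agents pick a bundle uniformly at random on the budget line p1·x1 + p2·x2 = m, and their average choice lands near the line's midpoint.

```python
import random

random.seed(42)

p1, p2, m = 2.0, 4.0, 100.0      # illustrative prices and income

def random_bundle():
    """An 'irrational' Becker agent: any affordable x1 is equally likely."""
    x1 = random.uniform(0, m / p1)
    x2 = (m - p1 * x1) / p2       # the remainder is spent on good 2
    return x1, x2

bundles = [random_bundle() for _ in range(100_000)]
avg_x1 = sum(b[0] for b in bundles) / len(bundles)

# The average sits near the budget-line midpoint (m/(2*p1), m/(2*p2)).
print(round(avg_x1, 1))   # close to 25.0
```

Because the average random choice obeys the downward-sloping demand that rational choice theory also predicts, aggregate data cannot distinguish the two stories, which is the sense in which the theory resists falsification.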
Pilot evaluation in TENCompetence: a theory-driven model
Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen
2007-01-01
Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong Com
Anisotropic cosmological models and generalized scalar tensor theory
Indian Academy of Sciences (India)
Subenoy Chakraborty; Batul Chandra Santra; Nabajit Chakravarty
2003-10-01
In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-times. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the physical parameters and solutions have been discussed.
Stochastic models in risk theory and management accounting
Brekelmans, R.C.M.
2000-01-01
This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
Conceptualizations of Creativity: Comparing Theories and Models of Giftedness
Miller, Angie L.
2012-01-01
This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…
Classical and Quantum Theory of Perturbations in Inflationary Universe Models
Brandenberger, R H; Mukhanov, V
1993-01-01
A brief introduction to the gauge invariant classical and quantum theory of cosmological perturbations is given. The formalism is applied to inflationary Universe models and yields a consistent and unified description of the generation and evolution of fluctuations. A general formula for the amplitude of cosmological perturbations in inflationary cosmology is derived.
Multilevel Higher-Order Item Response Theory Models
Huang, Hung-Yu; Wang, Wen-Chung
2014-01-01
In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…
Evaluating hydrological model performance using information theory-based metrics
The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...
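One information theory-based metric of the kind this abstract alludes to is the Kullback-Leibler divergence between the binned distributions of observed and simulated flow. The binning and smoothing choices below are assumptions for illustration, not the metrics used in the study:

```python
import math
from collections import Counter

def kl_divergence(observed, simulated, n_bins=10):
    """Kullback-Leibler divergence between the binned empirical
    distributions of observed and simulated flow; zero iff the binned
    distributions match.  Laplace smoothing keeps every bin positive."""
    lo = min(observed + simulated)
    hi = max(observed + simulated)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant series

    def hist(xs):
        counts = Counter(min(int((x - lo) / width), n_bins - 1) for x in xs)
        return [(counts.get(i, 0) + 1) / (len(xs) + n_bins)
                for i in range(n_bins)]

    p, q = hist(observed), hist(simulated)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

obs = [1.0, 1.2, 0.9, 3.5, 2.0, 1.1, 0.8, 4.0, 1.5, 1.0]  # toy flow series
```

A perfect simulation gives a divergence of zero; a biased one gives a positive value, which is the kind of distributional mismatch an accuracy metric like RMSE can understate.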
A Proposed Model of Jazz Theory Knowledge Acquisition
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
[General systems theory, analog models and essential arterial hypertension].
Indovina, I; Bonelli, M
1991-02-15
The application of the General System Theory to the fields of biology and particularly of medicine is fraught with many difficulties deriving from the mathematical complexities of application. The authors suggest that these difficulties can be overcome by applying analogical models, thus opening new prospects for the resolution of the manifold problems involved in connection with the study of arterial hypertension.
Application of Health Promotion Theories and Models for Environmental Health
Parker, Edith A.; Baldwin, Grant T.; Israel, Barbara; Salinas, Maria A.
2004-01-01
The field of environmental health promotion gained new prominence in recent years as awareness of physical environmental stressors and exposures increased in communities across the country and the world. Although many theories and conceptual models are used routinely to guide health promotion and health education interventions, they are rarely…
Using Conceptual Change Theories to Model Position Concepts in Astronomy
Yang, Chih-Chiang; Hung, Jeng-Fung
2012-01-01
The roles of conceptual change and model building in science education are very important and have a profound and wide effect on teaching science. This study examines the change in children's position concepts after instruction, based on different conceptual change theories. Three classes were chosen and divided into three groups, including a…
Using SAS PROC MCMC for Item Response Theory Models
Ames, Allison J.; Samonte, Kelli
2015-01-01
Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
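The likelihood that such a Bayesian sampler repeatedly evaluates can be illustrated with the two-parameter logistic (2PL) model; this is a generic sketch of the model, not PROC MCMC syntax, and the item parameters are invented:

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a person
    of ability theta answers an item of discrimination a and difficulty b
    correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def log_likelihood(theta, items, responses):
    """Log-likelihood of a response pattern -- the quantity an MCMC
    sampler evaluates (together with the prior) at each proposed theta."""
    ll = 0.0
    for (a, b), y in zip(items, responses):
        p = p_correct(theta, a, b)
        ll += math.log(p) if y else math.log(1.0 - p)
    return ll

items = [(1.0, -0.5), (1.5, 0.0), (0.8, 1.0)]  # (a, b) per item, illustrative
```

A Metropolis step would compare log_likelihood plus log-prior at the current and proposed theta; the software packages listed in the abstract automate exactly this loop.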
Cirafici, M.; Sinkovics, A.; Szabo, R.J.
2009-01-01
We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques
Modelling Based Approach for Reconstructing Evidence of VOIP Malicious Attacks
Directory of Open Access Journals (Sweden)
Mohammed Ibrahim
2015-05-01
Full Text Available Voice over Internet Protocol (VoIP) is a communication technology that uses the internet protocol to provide phone services. VoIP offers various benefits such as a low monthly fee and cheaper rates for long distance and international calls. However, VoIP is accompanied by novel security threats. Criminals often take advantage of such threats and commit illicit activities. These activities require digital forensic experts to acquire, analyse, reconstruct and provide digital evidence. There are various methodologies and models proposed for detecting, analysing and providing digital evidence in VoIP forensics; however, at the time of writing, no model has been formalized for the reconstruction of VoIP malicious attacks. Reconstruction of the attack scenario is an important technique in exposing unknown criminal acts, and this paper addresses that gap. We propose a model for reconstructing VoIP malicious attacks. To achieve this, a formal logic approach called Secure Temporal Logic of Action (S-TLA+) was adopted to rebuild the attack scenario. The expected result of this model is the generation of additional related evidence whose consistency with the existing evidence can be determined by means of the S-TLA+ model checker.
Theory and modelling of diamond fracture from an atomic perspective.
Brenner, Donald W; Shenderova, Olga A
2015-03-28
Discussed in this paper are several theoretical and computational approaches that have been used to better understand the fracture of both single-crystal and polycrystalline diamond at the atomic level. The studies, which include first principles calculations, analytic models and molecular simulations, have been chosen to illustrate the different ways in which this problem has been approached, the conclusions and their reliability that have been reached by these methods, and how these theory and modelling methods can be effectively used together.
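Among the analytic models this abstract refers to, the classic Griffith energy-balance criterion gives the critical stress for brittle fracture. The material numbers below are order-of-magnitude illustration values for diamond, not figures from the paper:

```python
import math

def griffith_stress(E, gamma_s, a):
    """Griffith critical stress (plane stress) for a through-crack of
    half-length a: sigma_f = sqrt(2 * E * gamma_s / (pi * a))."""
    return math.sqrt(2.0 * E * gamma_s / (math.pi * a))

# Illustration values only (roughly diamond-like, not measured data):
E = 1.1e12      # Young's modulus, Pa
gamma_s = 5.0   # surface energy, J/m^2
a = 1.0e-6      # crack half-length, m
sigma_f = griffith_stress(E, gamma_s, a)  # on the order of GPa here
```

The inverse-square-root dependence on crack length (halving sigma_f when a quadruples) is the continuum limit that atomistic simulations of fracture are compared against.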
Flipped classroom model for learning evidence-based medicine
Directory of Open Access Journals (Sweden)
Rucker SY
2017-08-01
Full Text Available Sydney Y Rucker,1 Zulfukar Ozdogan,1 Morhaf Al Achkar2 1School of Education, Indiana University, Bloomington, IN, 2Department of Family Medicine, School of Medicine, University of Washington, Seattle, WA, USA Abstract: Journal club (JC), as a pedagogical strategy, has long been used in graduate medical education (GME). As evidence-based medicine (EBM) becomes a mainstay in GME, traditional models of JC present a number of insufficiencies and call for novel models of instruction. A flipped classroom model appears to be an ideal strategy to meet the demands to connect evidence to practice while creating engaged, culturally competent, and technologically literate physicians. In this article, we describe a novel model of a flipped classroom in JC. We present the flow of learning activities during the online and face-to-face instruction, and then we highlight specific considerations for implementing a flipped classroom model. We show that implementing a flipped classroom model to teach EBM in a residency program not only is possible but also may constitute an improved learning opportunity for residents. Follow-up work is needed to evaluate the effectiveness of this model on both learning and clinical practice. Keywords: evidence-based medicine, flipped classroom, residency education
Charge transport in high mobility molecular semiconductors: classical models and new theories.
Troisi, Alessandro
2011-05-01
The theories developed since the fifties to describe charge transport in molecular crystals proved to be inadequate for the most promising classes of high mobility molecular semiconductors identified in recent years, including, for example, pentacene and rubrene. After reviewing at an elementary level the classical theories, which still provide the language for the understanding of charge transport in these systems, this tutorial review outlines the recent experimental and computational evidence that prompted the development of new theories of charge transport in molecular crystals. A critical discussion illustrates how very rarely it is possible to assume a charge hopping mechanism for high mobility organic crystals at any temperature. Recent models based on the effect of non-local electron-phonon coupling, dynamic disorder, and the coexistence of localized and delocalized states are reviewed. Additionally, a few more recent avenues of theoretical investigation, including the study of defect states, are discussed.
Directory of Open Access Journals (Sweden)
Carol A. Gordon
2009-09-01
Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user-centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information-to-knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction. Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory-building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice, is defined as the synthesis of authentic learning, or performance-based assessment practices that continuously generate evidence throughout the inquiry unit of instruction, and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin's work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action
On ADE Quiver Models and F-Theory Compactification
Belhaj, A; Sebbar, A; Sedra, M B
2006-01-01
Based on mirror symmetry, we discuss geometric engineering of N=1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibered over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.
On ADE quiver models and F-theory compactification
Energy Technology Data Exchange (ETDEWEB)
Belhaj, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Rasmussen, J [Department of Mathematics and Statistics, University of Melbourne, Parkville, Victoria 3010 (Australia); Sebbar, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Sedra, M B [Laboratoire de Physique de la Matiere et Rayonnement (LPMR), Morocco Faculte des Sciences, Universite Ibn Tofail, Kenitra, Morocco (Morocco)
2006-07-21
Based on mirror symmetry, we discuss geometric engineering of N = 1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibred over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
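The interplay of structured hypotheses and statistical inference described here can be shown with a toy "number game" of the kind used in this literature: hypotheses are sets of possible observations, and the size principle makes smaller hypotheses that still cover the data gain posterior mass quickly. The hypothesis spaces and priors below are invented for illustration:

```python
def posterior(hypotheses, priors, data):
    """Bayesian induction over structured hypotheses.  Each hypothesis is
    a set of possible observations; the size principle assigns likelihood
    P(d|h) = 1/|h| per independently sampled example."""
    weights = {}
    for name, h in hypotheses.items():
        if all(d in h for d in data):
            weights[name] = priors[name] * (1.0 / len(h)) ** len(data)
        else:
            weights[name] = 0.0  # hypothesis ruled out by the data
    z = sum(weights.values())
    return {name: w / z for name, w in weights.items()}

hyps = {
    "even":       set(range(2, 101, 2)),    # 50 members
    "mult_of_10": set(range(10, 101, 10)),  # 10 members
}
priors = {"even": 0.5, "mult_of_10": 0.5}
post = posterior(hyps, priors, [10, 30, 60])
```

Both hypotheses are consistent with the data, but the smaller one wins sharply after only three examples, which is the "powerful generalization from sparse data" the abstract describes.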
Hirvonen, Åsa; Kossak, Roman; Villaveces, Andrés
2015-01-01
In recent years, mathematical logic has developed in many directions, the initial unity of its subject matter giving way to a myriad of seemingly unrelated areas. The articles collected here, which range from historical scholarship to recent research in geometric model theory, squarely address this development. These articles also connect to the diverse work of Väänänen, whose ecumenical approach to logic reflects the unity of the discipline.
Farrant, Brad M; Fletcher, Janet; Maybery, Murray T
2006-01-01
Recent research has found that the acquisition of theory of mind (ToM) is delayed in children with specific language impairment (SLI). The present study used a battery of ToM and visual perspective taking (VPT) tasks to investigate whether the delayed acquisition of ToM in children with SLI is associated with delayed VPT development. Harris' (1992, 1996) simulation theory predicts that the development of VPT will be delayed. Participants were 20 children with SLI (M=62.9 months) and 20 typically developing children (M=61.2 months) who were matched for nonverbal ability, gender, and age. The results supported Harris' theory and a role for language in ToM and VPT development.
Models of social entrepreneurship: empirical evidence from Mexico
Wulleman, Marine; Hudon, Marek
2015-01-01
This paper seeks to improve the understanding of social entrepreneurship models based on empirical evidence from Mexico, where social entrepreneurship is currently booming. It aims to supplement existing typologies of social entrepreneurship models. To that end, building on Zahra et al. (2009) typology it begins by providing a new framework classifying the three types of social entrepreneurship. A comparative case study of ten Mexican social enterprises is then elaborated using that framework...
Cluster density functional theory for lattice models based on the theory of Möbius functions
Lafuente, Luis; Cuesta, José A.
2005-08-01
Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Möbius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Möbius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
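The Möbius function of a partially ordered set, which supplies the cluster-expansion coefficients discussed above, is defined recursively by mu(x, x) = 1 and mu(x, y) = -sum of mu(x, z) over x <= z < y. A direct sketch, using the divisor poset as a check case (unrelated to any specific lattice model in the paper):

```python
from functools import lru_cache

def mobius(leq, elements):
    """Möbius function of a finite poset given its order relation
    leq(x, y): mu(x, x) = 1, mu(x, y) = -sum_{x <= z < y} mu(x, z),
    and 0 when x is not below y."""
    @lru_cache(maxsize=None)
    def mu(x, y):
        if x == y:
            return 1
        if not leq(x, y):
            return 0
        return -sum(mu(x, z) for z in elements
                    if leq(x, z) and leq(z, y) and z != y)
    return mu

# Divisibility poset on the divisors of 12; mu(1, n) reproduces the
# number-theoretic Moebius function.
divisors = (1, 2, 3, 4, 6, 12)
mu = mobius(lambda a, b: b % a == 0, divisors)
```

The same recursion applies to the poset of lattice clusters in the functional: once the partial order is specified, the expansion coefficients follow mechanically.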
Cluster density functional theory for lattice models based on the theory of Moebius functions
Energy Technology Data Exchange (ETDEWEB)
Lafuente, Luis; Cuesta, Jose A [Grupo Interdisciplinar de Sistemas Complejos (GISC), Departamento de Matematicas, Universidad Carlos III de Madrid, 28911 Leganes, Madrid (Spain)
2005-08-26
Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Moebius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Moebius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
Models for probability and statistical inference theory and applications
Stapleton, James H
2007-01-01
This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...
Modeling size effect in the SMA response: a gradient theory
Tabesh, Majid; Boyd, James G.; Lagoudas, Dimitris C.
2014-03-01
Shape memory alloys (SMAs) show size effect in their response. The critical stresses, for instance, for the start of martensite and austenite transformations are reported to increase in some SMA wires for diameters below 100 μm. Simulation of such a behavior cannot be achieved using conventional theories that lack an intrinsic length scale in their constitutive modeling. To enable the size effect, a thermodynamically consistent constitutive model is developed, that in addition to conventional internal variables of martensitic volume fraction and transformation strain, contains the spatial gradient of martensitic volume fraction as an internal variable. The developed theory is simplified for 1D cases and analytical solutions for pure bending of SMA beams are presented. The gradient model captures the size effect in the response of the studied SMA structures.
Fracture and ductile vs. brittle behavior -- Theory, modeling and experiment
Energy Technology Data Exchange (ETDEWEB)
Beltz, G.E. [ed.] [Univ. of California, Santa Barbara, CA (United States); Selinger, R.L.B. [ed.] [Catholic Univ., Washington, DC (United States); Kim, K.S. [ed.] [Brown Univ., Providence, RI (United States); Marder, M.P. [ed.] [Univ. of Texas, Austin, TX (United States)
1999-08-01
The symposium brought together the many communities that investigate the fundamentals of fracture, with special emphasis on the ductile/brittle transition across a broad spectrum of material classes, fracture at interfaces, and modelling fracture over various length scales. Theoretical techniques discussed ranged from first-principles electronic structure theory to atomistic simulation to mesoscale and continuum theories, along with studies of fractals and scaling in fracture. Experimental and theoretical talks were interspersed throughout all sessions, rather than being segregated. The contributions to this volume generally follow the topical outline upon which the symposium was organized. The first part, dealing with ductile vs. brittle behavior in metals, concerns itself with investigations of high-strength steel, magnesium alloys, ordered intermetallics, and Fe-Cr-Al alloys. The development of analytical methods based on micromechanical models, such as dislocation mechanics and cohesive/contact zone models, are covered in a follow-up section. Nonmetals, including silicon, are considered in Parts 3 and 4. Fractals, chaos, and scaling theories are taken up in Part 5, with a special emphasis on fracture in heterogeneous solids. Modelling based on large populations of dislocations has substantially progressed during the past three years; hence, a section devoted to crystal plasticity and mesoscale dislocation modelling appears next. Finally, the technologically significant area of interfacial fracture, with applications to composites and intergranular fracture, is taken up in Part 7. Separate abstracts were prepared for most of the papers in this book.
Social learning theory and the Health Belief Model.
Rosenstock, I M; Strecher, V J; Becker, M H
1988-01-01
The Health Belief Model, social learning theory (recently relabelled social cognitive theory), self-efficacy, and locus of control have all been applied with varying success to problems of explaining, predicting, and influencing behavior. Yet, there is conceptual confusion among researchers and practitioners about the interrelationships of these theories and variables. This article attempts to show how these explanatory factors may be related, and in so doing, posits a revised explanatory model which incorporates self-efficacy into the Health Belief Model. Specifically, self-efficacy is proposed as a separate independent variable along with the traditional health belief variables of perceived susceptibility, severity, benefits, and barriers. Incentive to behave (health motivation) is also a component of the model. Locus of control is not included explicitly because it is believed to be incorporated within other elements of the model. It is predicted that the new formulation will more fully account for health-related behavior than did earlier formulations, and will suggest more effective behavioral interventions than have hitherto been available to health educators.
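The revised formulation, with self-efficacy entering as a separate independent variable alongside the traditional health belief variables, might be operationalized as a logistic model. The article proposes the structure, not an equation; the functional form and all weights below are hypothetical illustration choices:

```python
import math

def hbm_probability(susceptibility, severity, benefits, barriers,
                    self_efficacy, motivation, weights):
    """Sketch of the revised Health Belief Model: probability of a health
    behavior as a logistic function of perceived susceptibility,
    severity, benefits, barriers (entering negatively), self-efficacy,
    and health motivation.  All weights are hypothetical."""
    z = (weights["b0"]
         + weights["susceptibility"] * susceptibility
         + weights["severity"] * severity
         + weights["benefits"] * benefits
         - weights["barriers"] * barriers
         + weights["self_efficacy"] * self_efficacy
         + weights["motivation"] * motivation)
    return 1.0 / (1.0 + math.exp(-z))

w = {"b0": -2.0, "susceptibility": 0.5, "severity": 0.3, "benefits": 0.6,
     "barriers": 0.7, "self_efficacy": 0.9, "motivation": 0.4}
```

Holding the traditional variables fixed, raising self-efficacy raises the predicted probability, which is the model's claim that self-efficacy adds explanatory power beyond the original four beliefs.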
Matrix models and stochastic growth in Donaldson-Thomas theory
Energy Technology Data Exchange (ETDEWEB)
Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)
2012-10-15
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Matrix models and stochastic growth in Donaldson-Thomas theory
Szabo, Richard J.; Tierz, Miguel
2012-10-01
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models
Strand, Hugo U. R.; Eckstein, Martin; Werner, Philipp
2015-01-01
We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium "phase diagrams" that map out the different dynamical regimes.
Energy Technology Data Exchange (ETDEWEB)
Kakad, Amar [Research Institute for Sustainable Humanosphere, Kyoto University, Uji, Kyoto 611-0011 (Japan); Indian Institute of Geomagnetism, New Panvel, Navi Mumbai 410-218 (India); Omura, Yoshiharu [Research Institute for Sustainable Humanosphere, Kyoto University, Uji, Kyoto 611-0011 (Japan); Kakad, Bharati [Indian Institute of Geomagnetism, New Panvel, Navi Mumbai 410-218 (India)
2013-06-15
We perform one-dimensional fluid simulation of ion acoustic (IA) solitons propagating parallel to the magnetic field in electron-ion plasmas by assuming a large system length. To model the initial density perturbations (IDP), we employ a KdV soliton type solution. Our simulation demonstrates that the generation mechanism of IA solitons depends on the wavelength of the IDP. The short wavelength IDP evolve into two oppositely propagating identical IA solitons, whereas the long wavelength IDP develop into two indistinguishable chains of multiple IA solitons through a wave breaking process. The wave breaking occurs close to the time when electrostatic energy exceeds half of the kinetic energy of the electron fluid. The wave breaking amplitude and time of its initiation are found to be dependent on characteristics of the IDP. The strength of the IDP controls the number of IA solitons in the solitary chains. The speed, width, and amplitude of IA solitons estimated during their stable propagation in the simulation are in good agreement with the nonlinear fluid theory. This fluid simulation is the first to confirm the validity of the general nonlinear fluid theory, which is widely used in the study of solitary waves in laboratory and space plasmas.
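The KdV-soliton initial condition referred to above has the standard sech-squared form. This sketch uses the textbook normalization u_t + 6*u*u_x + u_xxx = 0, which may differ from the paper's ion-acoustic scaling:

```python
import math

def kdv_soliton(x, t, c):
    """KdV soliton u = (c/2) * sech^2(sqrt(c)/2 * (x - c*t)) for
    u_t + 6*u*u_x + u_xxx = 0: amplitude c/2, speed c, and width
    proportional to 1/sqrt(c), so taller solitons are narrower and
    faster."""
    arg = 0.5 * math.sqrt(c) * (x - c * t)
    return 0.5 * c / math.cosh(arg) ** 2
```

The locked amplitude-speed-width relation is what makes the comparison in the abstract meaningful: measuring any two of the three properties of a simulated pulse tests it against the nonlinear fluid theory.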
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
Sensor Data Fusion for Accurate Cloud Presence Prediction Using Dempster-Shafer Evidence Theory
Directory of Open Access Journals (Sweden)
Jesse S. Jin
2010-10-01
Full Text Available Sensor data fusion technology can be used to extract the most useful information from multiple sensor observations. It has been widely applied in applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach for a multiple-radiation-sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. Potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data recorded as the benchmark. Our experiments indicate that, compared with approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
Sensor data fusion for accurate cloud presence prediction using Dempster-Shafer evidence theory.
Li, Jiaming; Luo, Suhuai; Jin, Jesse S
2010-01-01
Sensor data fusion technology can be used to extract the most useful information from multiple sensor observations. It has been widely applied in applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach for a multiple-radiation-sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. Potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data recorded as the benchmark. Our experiments indicate that, compared with approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
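The core of the Dempster-Shafer fusion step is Dempster's rule of combination, which the records above apply to radiation-sensor reports. A minimal sketch over a two-element frame {cloud, clear} follows; the mass values are hypothetical sensor reports, not data from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions over the same frame of discernment
    with Dempster's rule. Focal elements are frozensets; mass assigned
    to empty intersections (the conflict K) is renormalised away."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
THETA = CLOUD | CLEAR  # the full frame: mass on THETA models ignorance

# Hypothetical reports from two radiation sensors
m1 = {CLOUD: 0.7, CLEAR: 0.1, THETA: 0.2}
m2 = {CLOUD: 0.6, CLEAR: 0.2, THETA: 0.2}
fused = dempster_combine(m1, m2)
print(round(fused[CLOUD], 3))  # -> 0.85
```

Note how agreement between the two sensors sharpens the belief in "cloud" beyond either individual report (0.85 versus 0.7 and 0.6), which is the effect the fusion approach exploits; the counterintuitive behaviour under high conflict mentioned in the surrounding records arises when the renormalising factor 1 - K becomes small.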
Application of evidence theory in information fusion of multiple sources in bayesian analysis
Institute of Scientific and Technical Information of China (English)
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information comes from different sources; the form of the prior distribution may be known while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions derived from the individual sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method to obtain the combined mass function and the representative points of the prior distribution. These points are then fitted to the given distribution form to determine the parameters of the prior distribution, after which the fused prior distribution is obtained and Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and obtain the representative points of the fused prior distribution is the central question we address in this paper. A simulation example shows that the proposed method is effective.
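The pipeline described above (priors -> mass functions -> D-S combination -> representative points) can be loosely sketched as follows. The conversion rule here is a naive discretization onto a grid of representative points, chosen only for illustration; the paper's actual conversion rule is not reproduced here, and the two source priors are hypothetical:

```python
import numpy as np

def to_mass(density, points):
    """Naive conversion of a prior density into a discrete mass
    function on representative points (normalised point masses).
    An illustrative stand-in, not the paper's conversion rule."""
    w = np.array([density(p) for p in points], dtype=float)
    return dict(zip(points, w / w.sum()))

def dempster_point_masses(m1, m2):
    """Dempster's rule when every focal element is a singleton point:
    only identical points agree; conflict mass is renormalised away."""
    agree = {p: m1[p] * m2[p] for p in m1 if p in m2}
    total = sum(agree.values())
    return {p: v / total for p, v in agree.items()}

points = tuple(np.linspace(0.0, 1.0, 11))
# Two hypothetical source priors for a probability-type parameter
src1 = to_mass(lambda p: p ** 2 * (1 - p), points)   # Beta(3,2)-shaped
src2 = to_mass(lambda p: p * (1 - p) ** 2, points)   # Beta(2,3)-shaped
fused = dempster_point_masses(src1, src2)
best = max(fused, key=fused.get)
print(round(float(best), 1))  # representative point with highest fused mass -> 0.5
```

A final step, omitted here, would fit the known distribution form to the fused representative points to recover the prior's parameters, as the abstract describes.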
The Function of Gas Vesicles in Halophilic Archaea and Bacteria: Theories and Experimental Evidence
Directory of Open Access Journals (Sweden)
Aharon Oren
2012-12-01
Full Text Available A few extremely halophilic Archaea (Halobacterium salinarum, Haloquadratum walsbyi, Haloferax mediterranei, Halorubrum vacuolatum, Halogeometricum borinquense, Haloplanus spp.) possess gas vesicles that bestow buoyancy on the cells. Gas vesicles are also produced by the anaerobic endospore-forming halophilic Bacteria Sporohalobacter lortetii and Orenia sivashensis. We have extensive information on the properties of gas vesicles in Hbt. salinarum and Hfx. mediterranei and on the regulation of their formation. Different functions have been suggested for gas vesicle synthesis: buoying cells towards oxygen-rich surface layers in hypersaline water bodies to prevent oxygen limitation, reaching higher light intensities for the light-driven proton pump bacteriorhodopsin, positioning the cells optimally for light absorption, light shielding, reducing the cytoplasmic volume leading to a higher surface-area-to-volume ratio (for the Archaea), and dispersal of endospores (for the anaerobic spore-forming Bacteria). Except for Hqr. walsbyi, which abounds in saltern crystallizer brines, gas-vacuolate halophiles are not among the dominant life forms in hypersaline environments. There has been only little research on gas vesicles in natural communities of halophilic microorganisms, and the few existing studies failed to provide clear evidence for their possible function. This paper summarizes the current status of the different theories on why gas vesicles may provide a selective advantage to some halophilic microorganisms.
DEFF Research Database (Denmark)
Plant, Peter
2012-01-01
Quality assurance and evidence in career guidance in Europe are often seen as self-evident approaches, but particular interests lie behind...
Cahill, James A; Green, Richard E; Fulton, Tara L; Stiller, Mathias; Jay, Flora; Ovsyanikov, Nikita; Salamzade, Rauf; St John, John; Stirling, Ian; Slatkin, Montgomery; Shapiro, Beth
2013-01-01
Despite extensive genetic analysis, the evolutionary relationship between polar bears (Ursus maritimus) and brown bears (U. arctos) remains unclear. The two most recent comprehensive reports indicate a recent divergence with little subsequent admixture or a much more ancient divergence followed by extensive admixture. At the center of this controversy are the Alaskan ABC Islands brown bears that show evidence of shared ancestry with polar bears. We present an analysis of genome-wide sequence data for seven polar bears, one ABC Islands brown bear, one mainland Alaskan brown bear, and a black bear (U. americanus), plus recently published datasets from other bears. Surprisingly, we find clear evidence for gene flow from polar bears into ABC Islands brown bears but no evidence of gene flow from brown bears into polar bears. Importantly, while polar bears contributed <1% of the autosomal genome of the ABC Islands brown bear, they contributed 6.5% of the X chromosome. The magnitude of sex-biased polar bear ancestry and the clear direction of gene flow suggest a model wherein the enigmatic ABC Island brown bears are the descendants of a polar bear population that was gradually converted into brown bears via male-dominated brown bear admixture. We present a model that reconciles heretofore conflicting genetic observations. We posit that the enigmatic ABC Islands brown bears derive from a population of polar bears likely stranded by the receding ice at the end of the last glacial period. Since then, male brown bear migration onto the island has gradually converted these bears into an admixed population whose phenotype and genotype are principally brown bear, except at mtDNA and X-linked loci. This process of genome erosion and conversion may be a common outcome when climate change or other forces cause a population to become isolated and then overrun by species with which it can hybridize.
Directory of Open Access Journals (Sweden)
James A Cahill
Full Text Available Despite extensive genetic analysis, the evolutionary relationship between polar bears (Ursus maritimus) and brown bears (U. arctos) remains unclear. The two most recent comprehensive reports indicate a recent divergence with little subsequent admixture or a much more ancient divergence followed by extensive admixture. At the center of this controversy are the Alaskan ABC Islands brown bears that show evidence of shared ancestry with polar bears. We present an analysis of genome-wide sequence data for seven polar bears, one ABC Islands brown bear, one mainland Alaskan brown bear, and a black bear (U. americanus), plus recently published datasets from other bears. Surprisingly, we find clear evidence for gene flow from polar bears into ABC Islands brown bears but no evidence of gene flow from brown bears into polar bears. Importantly, while polar bears contributed <1% of the autosomal genome of the ABC Islands brown bear, they contributed 6.5% of the X chromosome. The magnitude of sex-biased polar bear ancestry and the clear direction of gene flow suggest a model wherein the enigmatic ABC Island brown bears are the descendants of a polar bear population that was gradually converted into brown bears via male-dominated brown bear admixture. We present a model that reconciles heretofore conflicting genetic observations. We posit that the enigmatic ABC Islands brown bears derive from a population of polar bears likely stranded by the receding ice at the end of the last glacial period. Since then, male brown bear migration onto the island has gradually converted these bears into an admixed population whose phenotype and genotype are principally brown bear, except at mtDNA and X-linked loci. This process of genome erosion and conversion may be a common outcome when climate change or other forces cause a population to become isolated and then overrun by species with which it can hybridize.
Farrant, Brad M.; Fletcher, Janet; Maybery, Murray T.
2006-01-01
Recent research has found that the acquisition of theory of mind (ToM) is delayed in children with specific language impairment (SLI). The present study used a battery of ToM and visual perspective taking (VPT) tasks to investigate whether the delayed acquisition of ToM in children with SLI is associated with delayed VPT development. Harris'…
Kitaev Lattice Models as a Hopf Algebra Gauge Theory
Meusburger, Catherine
2017-07-01
We prove that Kitaev's lattice model for a finite-dimensional semisimple Hopf algebra H is equivalent to the combinatorial quantisation of Chern-Simons theory for the Drinfeld double D( H). This shows that Kitaev models are a special case of the older and more general combinatorial models. This equivalence is an analogue of the relation between Turaev-Viro and Reshetikhin-Turaev TQFTs and relates them to the quantisation of moduli spaces of flat connections. We show that the topological invariants of the two models, the algebra of operators acting on the protected space of the Kitaev model and the quantum moduli algebra from the combinatorial quantisation formalism, are isomorphic. This is established in a gauge theoretical picture, in which both models appear as Hopf algebra valued lattice gauge theories. We first prove that the triangle operators of a Kitaev model form a module algebra over a Hopf algebra of gauge transformations and that this module algebra is isomorphic to the lattice algebra in the combinatorial formalism. Both algebras can be viewed as the algebra of functions on gauge fields in a Hopf algebra gauge theory. The isomorphism between them induces an algebra isomorphism between their subalgebras of invariants, which are interpreted as gauge invariant functions or observables. It also relates the curvatures in the two models, which are given as holonomies around the faces of the lattice. This yields an isomorphism between the subalgebras obtained by projecting out curvatures, which can be viewed as the algebras of functions on flat gauge fields and are the topological invariants of the two models.
Nonrelativistic factorizable scattering theory of multicomponent Calogero-Sutherland model
Ahn, Changrim; Lee, Kong Ju Bock; Nam, Soonkeon
1995-01-01
We relate two integrable models in (1+1) dimensions: the multicomponent Calogero-Sutherland model with particles and antiparticles interacting via the hyperbolic potential, and the nonrelativistic factorizable S-matrix theory with SU(N) invariance. We find complete solutions of the Yang-Baxter equations without implementing the crossing symmetry, and one of them is identified with the scattering amplitudes derived from the Schrödinger equation of the Calogero-Sutherland model. This particular solution is of interest in that it cannot be obtained as a nonrelativistic limit of any known relativistic solution of the SU(N)-invariant Yang-Baxter equations.
Massive mu pair production in a vector field theory model
Halliday, I G
1976-01-01
Massive electrodynamics is treated as a model for the production of massive mu pairs in high-energy hadronic collisions. The dominant diagrams in perturbation theory are identified and analyzed. These graphs have an eikonal structure which leads to enormous cancellations in the two-particle inclusive cross section but not in the n-particle production cross sections. Under the assumption that these cancellations are complete, a Drell-Yan structure appears in the inclusive cross section but the particles accompanying the mu pairs have a very different structure compared to the parton model. The pionization region is no longer empty of particles as in single parton models. (10 refs).
Entanglement of Conceptual Entities in Quantum Model Theory (QMod)
Aerts, Diederik
2012-01-01
We have recently elaborated 'Quantum Model Theory' (QMod) to model situations where the quantum effects of contextuality, interference, superposition, entanglement and emergence appear without the entities giving rise to these situations necessarily being of microscopic nature. We have shown that QMod models such situations without introducing linearity for the set of states. In this paper we prove that QMod, although not using linearity for the state space, provides a method of identification for entangled states and an intuitive explanation for their occurrence. We illustrate this method of entanglement identification with concrete examples.
Teacher Education under Audit: Value-Added Measures,TVAAS, EdTPA and Evidence-Based Theory
Price, Todd Alan
2014-01-01
This article describes how evidence-based theory fuels an audit culture for teacher education in the USA, placing faculty under monitoring and surveillance, and severely constraining judgment, discretion, and professional decision-making. The national education reform efforts, Race to the Top and Common Core State Standards, demand fealty to…
Dentoni, D.; Ross, R.
2013-01-01
Part Two of our Special Issue on wicked problems in agribusiness, “Towards a Theory of Managing Wicked Problems through Multi-Stakeholder Engagements: Evidence from the Agribusiness Sector,” will contribute to four open questions in the broader fields of management and policy: why, when, which and h