Item Response Theory Models for Performance Decline during Testing
Jin, Kuan-Yu; Wang, Wen-Chung
2014-01-01
Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…
Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items
Aybek, Eren Can; Demirtasli, R. Nukhet
2017-01-01
This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…
International Nuclear Information System (INIS)
Casten, R F
2015-01-01
This paper discusses some simple issues that arise in testing models, with a focus on models for low energy nuclear structure. By way of simplified examples, we illustrate some dangers in blind statistical assessments, pointing out especially the need to include theoretical uncertainties, the danger of over-weighting precise or physically redundant experimental results, the need to assess competing theories with independent and physically sensitive observables, and the value of statistical tests properly evaluated. (paper)
Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory.
Budescu, David V; Bo, Yuanchao
2015-12-01
We investigate the implications of penalizing incorrect answers to multiple-choice tests, from the perspective of both test-takers and test-makers. To do so, we use a model that combines a well-known item response theory model with prospect theory (Kahneman and Tversky, Prospect theory: An analysis of decision under risk, Econometrica 47:263-91, 1979). Our results reveal that when test-takers are fully informed of the scoring rule, the use of any penalty has detrimental effects for both test-takers (they are always penalized in excess, particularly those who are risk averse and loss averse) and test-makers (the bias of the estimated scores, as well as the variance and skewness of their distribution, increase as a function of the severity of the penalty).
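The prospect-theory value function this model builds on can be sketched in a few lines. The functional form is from Kahneman and Tversky (1979); the parameter values (0.88, 0.88, 2.25) are the later Tversky-Kahneman estimates, used here purely for illustration and not taken from this paper.

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of an outcome x relative to a reference
    point of 0: concave for gains, convex and steeper (by the
    loss-aversion factor lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta
```

Under these parameters a 1-point penalty looms 2.25 times larger than a 1-point gain, which is the mechanism behind the excess penalization of risk-averse and loss-averse test-takers described in the abstract.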
Item Response Theory Modeling of the Philadelphia Naming Test
Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.
2015-01-01
Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…
Statistical test theory for the behavioral sciences
de Gruijter, Dato N M
2007-01-01
Since the development of the first intelligence test in the early 20th century, educational and psychological tests have become important measurement techniques to quantify human behavior. Focusing on this ubiquitous yet fruitful area of research, Statistical Test Theory for the Behavioral Sciences provides both a broad overview and a critical survey of assorted testing theories and models used in psychology, education, and other behavioral science fields. Following a logical progression from basic concepts to more advanced topics, the book first explains classical test theory, covering true score, measurement error, and reliability. It then presents generalizability theory, which provides a framework to deal with various aspects of test scores. In addition, the authors discuss the concept of validity in testing, offering a strategy for evidence-based validity. In the two chapters devoted to item response theory (IRT), the book explores item response models, such as the Rasch model, and applications, incl...
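The classical-test-theory quantities the book opens with (true score, measurement error, reliability) can be made concrete with Cronbach's alpha, the most common internal-consistency reliability estimate. A minimal sketch, not tied to the book's own notation:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for an examinee-by-item score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / var(total score)).
    Population variances (dividing by n) are used throughout."""
    k = len(scores[0])

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    totals = [sum(row) for row in scores]
    return k / (k - 1) * (1 - sum(item_vars) / pvar(totals))
```

For perfectly parallel items the item covariances equal the item variances and alpha reaches 1; for uncorrelated items it drops toward 0.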
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2018-03-01
Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory) has been used sparingly in the reporting of results for measures of individual differences, despite its potential benefits. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
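For a flavor of the G-theory analyses described in this article, the following sketch estimates variance components and a generalizability coefficient for the simplest fully crossed persons-by-items design, using the textbook expected-mean-square solutions. This is a hypothetical minimal example, not one of the packages the article reviews.

```python
def g_study_pxi(x):
    """Variance components for a fully crossed p x i design (random
    two-way model without replication; the residual absorbs the
    person-by-item interaction), plus the generalizability
    coefficient for relative decisions over ni items."""
    n_p, n_i = len(x), len(x[0])
    grand = sum(map(sum, x)) / (n_p * n_i)
    p_means = [sum(row) / n_i for row in x]
    i_means = [sum(row[i] for row in x) / n_p for i in range(n_i)]
    ss_p = n_i * sum((m - grand) ** 2 for m in p_means)
    ss_i = n_p * sum((m - grand) ** 2 for m in i_means)
    ss_tot = sum((x[p][i] - grand) ** 2
                 for p in range(n_p) for i in range(n_i))
    ss_pi = ss_tot - ss_p - ss_i
    ms_p = ss_p / (n_p - 1)
    ms_i = ss_i / (n_i - 1)
    ms_pi = ss_pi / ((n_p - 1) * (n_i - 1))
    var_pi = ms_pi
    var_p = max((ms_p - ms_pi) / n_i, 0.0)
    var_i = max((ms_i - ms_pi) / n_p, 0.0)
    denom = var_p + var_pi / n_i
    e_rho2 = var_p / denom if denom > 0 else 0.0
    return var_p, var_i, var_pi, e_rho2
```

The returned percentages of score variation by source, and the dependence of the coefficient on the number of items, are exactly the kinds of quantities the article recommends reporting.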
Comparison of model propeller tests with airfoil theory
Durand, William F; Lesley, E P
1925-01-01
The purpose of the investigation covered by this report was the examination of the degree of approach which may be anticipated between laboratory tests on model airplane propellers and results computed by the airfoil theory, based on tests of airfoils representative of successive blade sections. It is known that the corrections of angles of attack and for aspect ratio, speed, and interference rest either on experimental data or on somewhat uncertain theoretical assumptions. The general situation as regards these four sets of corrections is far from satisfactory, and while it is recognized that occasion exists for the consideration of such corrections, their determination in any given case is a matter of considerable uncertainty. There exists at the present time no theory generally accepted and sufficiently comprehensive to indicate the amount of such corrections, and the application to individual cases of the experimental data available is, at best, uncertain. While the results of this first phase of the investigation are less positive than had been hoped might be the case, the establishment of the general degree of approach between the two sets of results which might be anticipated on the basis of this simpler mode of application seems to have been desirable.
Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L
2013-09-26
Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
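The IRT side of this comparison can be illustrated with the two-parameter logistic model; in IRT the item parameters a and b are, in principle, invariant across examinee samples, which is the basis of the claimed theoretical superiority. A minimal sketch (notation assumed, not taken from the article):

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of a correct answer
    given ability theta, item discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

At theta = b the probability is exactly 0.5, which is how the difficulty parameter is located on the ability scale rather than on a sample-dependent proportion-correct scale.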
Xie, Qin; Andrews, Stephen
2013-01-01
This study introduces Expectancy-value motivation theory to explain the paths of influences from perceptions of test design and uses to test preparation as a special case of washback on learning. Based on this theory, two conceptual models were proposed and tested via Structural Equation Modeling. Data collection involved over 870 test takers of…
Rigorously testing multialternative decision field theory against random utility models.
Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg
2014-06-01
Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
Raykov, Tenko; Marcoulides, George A.
2016-01-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…
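One way to see the CTT-IRT link discussed here is through the test characteristic curve: under a Rasch model, the classical true score is the sum of the item response probabilities. The sketch below is illustrative, not the authors' own derivation:

```python
import math

def rasch_p(theta, b):
    """Rasch (1PL) probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def true_score(theta, difficulties):
    """Classical true score implied by a Rasch model: the expected
    number-correct score, i.e. the test characteristic curve."""
    return sum(rasch_p(theta, b) for b in difficulties)
```

Because each item probability is monotone in theta, the mapping from latent ability to CTT true score is monotone as well, which is what makes the two frameworks directly relatable in the unidimensional, no-guessing case.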
Pham, Tien Hung; Rühaak, Wolfram; Sass, Ingo
2017-04-01
Extensive groundwater extraction lowers the groundwater table. Consequently, soil effective stress increases, which can cause land subsidence. Analysis of land subsidence generally requires a numerical model based on poroelasticity theory, first proposed by Biot (1941). In their review of regional land subsidence accompanying groundwater extraction, Galloway and Burbey (2011) stated that more research and application are needed in the coupling of stress-dependent land subsidence processes. In the geotechnical field, the constant rate of strain (CRS) test was first introduced in 1969 (Smith and Wahls 1969) and was standardized in 1982 through designation D4186-82 of the American Society for Testing and Materials. From the readings of CRS tests, the stress-dependent parameters of a poroelasticity model can be calculated. So far, no research has linked poroelasticity theory with CRS tests in modelling land subsidence due to groundwater extraction. One-dimensional CRS tests using a conventional compression cell and three-dimensional CRS tests using a Rowe cell were performed. The tests were also modelled using the finite element method with mixed elements. A back-analysis technique is used to find suitable values of hydraulic conductivity and bulk modulus, which depend on the stress or void ratio. Finally, the obtained results are used in land subsidence models. Biot, M. A. (1941). "General theory of three-dimensional consolidation." Journal of Applied Physics 12(2): 155-164. Galloway, D. L. and T. J. Burbey (2011). "Review: Regional land subsidence accompanying groundwater extraction." Hydrogeology Journal 19(8): 1459-1486. Smith, R. E. and H. E. Wahls (1969). "Consolidation under constant rates of strain." Journal of Soil Mechanics & Foundations Div.
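The stress chain described in this abstract (drawdown raises effective stress, which compresses the layer) can be sketched with a one-dimensional linear-elastic estimate. The constrained modulus here is a stand-in for the stress-dependent stiffness the CRS tests are meant to supply, and all numbers are purely hypothetical:

```python
GAMMA_W = 9.81  # unit weight of water, kN/m^3

def subsidence_from_drawdown(drawdown_m, thickness_m, modulus_kpa):
    """1-D elastic subsidence estimate: a water-table drawdown dh
    raises effective stress by gamma_w * dh (Terzaghi's principle);
    dividing by the constrained (oedometer) modulus gives the
    vertical strain of the compressing layer."""
    d_sigma_eff = GAMMA_W * drawdown_m        # kPa
    strain = d_sigma_eff / modulus_kpa        # dimensionless
    return strain * thickness_m               # metres of subsidence
```

Replacing the constant modulus with a stress- or void-ratio-dependent function is precisely the coupling the abstract argues the CRS data should provide.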
Tests of Theories of Crime in Female Prisoners.
Lindberg, Marc A; Fugett, April; Adkins, Ashtin; Cook, Kelsey
2017-02-01
Several general theories of crime were tested with path models on 293 female prisoners in a U.S. state prison. The theories tested included Social Bond and Control, Thrill/Risk Seeking, and a new attachment-based Developmental Dynamic Systems model. A large battery of instruments was used, ranging from measures of risk taking, to a crime addiction scale, to Childhood Adverse Events, to attachments and clinical issues. The older general theories of crime did not hold up well under the rigor of path modeling. Support was found for the new dynamic systems model, which incorporated adverse childhood events leading to (a) peer crime, (b) crime addiction, and (c) a measure derived from the Attachment and Clinical Issues Questionnaire (ACIQ) that takes individual differences in attachments and clinical issues into account. The results were discussed in terms of new approaches to the Research Domain Criteria (RDoC) and new approaches to intervention.
Fu, Jianbin
2016-01-01
The multidimensional item response theory (MIRT) models with covariates proposed by Haberman and implemented in the "mirt" program provide a flexible way to analyze data based on item response theory. In this report, we discuss applications of the MIRT models with covariates to longitudinal test data to measure skill differences at the…
Testicular Self-Examination: A Test of the Health Belief Model and the Theory of Planned Behaviour
McClenahan, Carol; Shevlin, Mark; Adamson, Gary; Bennett, Cara; O'Neill, Brenda
2007-01-01
The aim of this study was to test the utility and efficiency of the theory of planned behaviour (TPB) and the health belief model (HBM) in predicting testicular self-examination (TSE) behaviour. A questionnaire was administered to an opportunistic sample of 195 undergraduates aged 18-39 years. Structural equation modelling indicated that, on the…
International Nuclear Information System (INIS)
Lee, Gyeong Geun; Lee, Yong Bok; Kim, Min Chul; Kwon, Junh Yun
2012-01-01
Neutron irradiation of reactor pressure vessel (RPV) steels causes a decrease in fracture toughness and an increase in yield strength while in service. It is generally accepted that the growth of point defect clusters (PDC) and copper-rich precipitates (CRP) drives the radiation hardening of RPV steels. A number of models have been proposed to account for the embrittlement of RPV steels. Rate-theory-based modeling mathematically describes the evolution of radiation-induced microstructures of ferritic steels under neutron irradiation. In this work, we compared rate-theory-based model calculations with the surveillance test results of Korean light water reactors (LWRs).
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-07
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
Testing Ecological Theories of Offender Spatial Decision Making Using a Discrete Choice Model
Johnson, Shane D; Summers, Lucia
2015-01-01
Research demonstrates that crime is spatially concentrated. However, most research relies on information about where crimes occur, without reference to where offenders reside. This study examines how the characteristics of neighborhoods and their proximity to offender home locations affect offender spatial decision making. Using a discrete choice model and data for detected incidents of theft from vehicles (TFV), we test predictions from two theoretical perspectives—crime pattern and social disorganization theories. We demonstrate that offenders favor areas that are low in social cohesion and closer to their home, or other age-related activity nodes. For adult offenders, choices also appear to be influenced by how accessible a neighborhood is via the street network. The implications for criminological theory and crime prevention are discussed. PMID:25866412
Theory testing using case studies
DEFF Research Database (Denmark)
Dissing Sørensen, Pernille; Løkke Nielsen, Ann-Kristina
2006-01-01
Case studies may have different research goals. One such goal is the testing of small-scale and middle-range theories. Theory testing refers to the critical examination, observation, and evaluation of the 'why' and 'how' of a specified phenomenon in a particular setting. In this paper, we focus on the strengths of theory-testing case studies. We specify research paths associated with theory testing in case studies and present a coherent argument for the logic of theoretical development and refinement using case studies. We emphasize different uses of rival explanations and their implications for research design. Finally, we discuss the epistemological logic, i.e., the value to larger research programmes, of such studies and, following Lakatos, conclude that the value of theory-testing case studies lies beyond naïve falsification and in their contribution to developing research programmes in a progressive manner.
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
Anil, Duygu
2008-01-01
In this study, the predictive power of experts' judgments of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed under classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
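The classical-test-theory item characteristics being predicted here are typically the difficulty (proportion correct) and discrimination (item-total point-biserial correlation). Minimal sketches of both, with notation assumed rather than taken from the study:

```python
def item_difficulty(responses):
    """CTT difficulty: proportion correct (1 = easy, 0 = hard)."""
    return sum(responses) / len(responses)

def point_biserial(responses, totals):
    """CTT discrimination: Pearson correlation between the 0/1 item
    score and each examinee's total test score."""
    n = len(responses)
    mr, mt = sum(responses) / n, sum(totals) / n
    cov = sum((r - mr) * (t - mt) for r, t in zip(responses, totals)) / n
    sr = (sum((r - mr) ** 2 for r in responses) / n) ** 0.5
    st = (sum((t - mt) ** 2 for t in totals) / n) ** 0.5
    return cov / (sr * st)
```

Both statistics are sample-dependent, which is why expert judgment is needed when try-out data cannot be collected.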
An empirical test of reference price theories using a semiparametric approach
DEFF Research Database (Denmark)
Boztug, Yasemin; Hildebrandt, Lutz
In this paper we estimate and empirically test different behavioral theories of consumer reference price formation. Two major theories are proposed to model the reference price reaction: assimilation contrast theory and prospect theory. We assume that different consumer segments will use...
Test theories of special relativity: a general critique
International Nuclear Information System (INIS)
Maciel, A.K.A.; Tiomno, J.
1988-01-01
Absolute Spacetime Theories conceived for the purpose of testing Special Relativity (SR) are reviewed. It is found that most theories proposed were in fact SR in different coordinate systems, since in general no specific SR violations were introduced. Models based on possible SR violating mechanisms are considered. Misconceptions in recently published papers are examined. (author)
Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie
2013-01-01
Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…
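The partial credit model reported to fit well here assigns category probabilities from cumulative step logits. A minimal sketch of its category response function (notation assumed, not taken from the article):

```python
import math

def pcm_probs(theta, thresholds):
    """Partial credit model category probabilities for one item.
    thresholds[j] is the step difficulty for moving from category j
    to j+1; category 0 corresponds to the empty sum (logit 0)."""
    logits = [0.0]
    for delta in thresholds:
        logits.append(logits[-1] + (theta - delta))
    z = [math.exp(l) for l in logits]
    s = sum(z)
    return [v / s for v in z]
```

When ability sits exactly at a step difficulty, the two adjacent categories are equally likely, which is how the step parameters are interpreted on the latent scale.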
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
Theory of Test Translation Error
Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel
2009-01-01
In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…
International Nuclear Information System (INIS)
Runchal, A.K.; Sagar, B.; Baca, R.G.; Kline, N.W.
1985-09-01
Postclosure performance assessment of the proposed high-level nuclear waste repository in flood basalts at Hanford requires that the processes of fluid flow, heat transfer, and mass transport be numerically modeled at appropriate space and time scales. A suite of computer models has been developed to meet this objective. The theory of one of these models, named PORFLO, is described in this report. Also presented are a discussion of the numerical techniques in the PORFLO computer code and a few computational test cases. Three two-dimensional equations, one each for fluid flow, heat transfer, and mass transport, are numerically solved in PORFLO. The governing equations are derived from the principle of conservation of mass, momentum, and energy in a stationary control volume that is assumed to contain a heterogeneous, anisotropic porous medium. Broad discrete features can be accommodated by specifying zones with distinct properties, or these can be included by defining an equivalent porous medium. The governing equations are parabolic differential equations that are coupled through time-varying parameters. Computational tests of the model are done by comparisons of simulation results with analytic solutions, with results from other independently developed numerical models, and with available laboratory and/or field data. In this report, in addition to the theory of the model, results from three test cases are discussed. A users' manual for the computer code resulting from this model has been prepared and is available as a separate document. 37 refs., 20 figs., 15 tabs
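PORFLO solves coupled parabolic equations for flow, heat, and transport; the time-marching idea behind such codes can be illustrated with an explicit finite-difference step for the one-dimensional diffusion equation. This is a toy sketch, far simpler than the actual PORFLO scheme:

```python
def diffuse_1d(u, alpha, dx, dt, steps):
    """Explicit finite-difference march for u_t = alpha * u_xx with
    fixed (Dirichlet) boundary values; stable for
    alpha * dt / dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    u = list(u)
    for _ in range(steps):
        u = ([u[0]]
             + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                for i in range(1, len(u) - 1)]
             + [u[-1]])
    return u
```

A linear profile between the boundary values is a steady state (zero second difference), so the march leaves it unchanged, while an interior bump relaxes toward it.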
Theory Testing Using Case Studies
DEFF Research Database (Denmark)
Møller, Ann-Kristina Løkke; Dissing Sørensen, Pernille
2014-01-01
The appropriateness of case studies as a tool for theory testing is still a controversial issue, and discussions about the weaknesses of such research designs have previously taken precedence over those about their strengths. The purpose of the paper is to examine and revive the approach of theory testing using case studies, including the associated research goal, analysis, and generalisability. We argue that research designs for theory testing using case studies differ from theory-building case study research designs because different research projects serve different purposes and follow different research paths.
Kinematical Test Theories for Special Relativity
Lämmerzahl, Claus; Braxmaier, Claus; Dittus, Hansjörg; Müller, Holger; Peters, Achim; Schiller, Stephan
A comparison of certain kinematical test theories for Special Relativity, including the Robertson and Mansouri-Sexl test theories, is presented, and the accuracy of the experimental results testing Special Relativity is expressed in terms of the parameters appearing in these test theories. The theoretical results are applied to the most precise experimental results obtained recently for the isotropy of light propagation and the constancy of the speed of light.
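In the Robertson-Mansouri-Sexl parametrization (α for time dilation, β for longitudinal length contraction, δ for transverse contraction), the orientation and velocity dependence of the measured two-way speed of light is commonly written in the form below. This is the standard form from the RMS test-theory literature, reproduced here as background rather than taken from this paper:

```latex
\frac{c(\theta, v)}{c}
  = 1 - \left(\alpha - \beta + 1\right)\frac{v^{2}}{c^{2}}
      - \left(\tfrac{1}{2} - \beta + \delta\right)\frac{v^{2}}{c^{2}}\sin^{2}\theta
```

Special Relativity corresponds to α = -1/2, β = 1/2, δ = 0, for which both coefficients vanish: Michelson-Morley-type experiments bound (1/2 - β + δ), Kennedy-Thorndike-type experiments bound (α - β + 1), and Ives-Stilwell-type experiments bound α + 1/2.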
International Nuclear Information System (INIS)
Johnson, C.R.
1985-01-01
We develop a method for finding the exact equations of structure and motion of multipole test particles in Einstein's unified field theory: the theory of the nonsymmetric field. The method is also applicable to Einstein's gravitational theory. Particles are represented by singularities in the field. The method is covariant at each step of the analysis. We also apply the method and find both in Einstein's unified field theory and in Einstein's gravitational theory the equations of structure and motion of neutral pole-dipole test particles possessing no electromagnetic multipole moments. In the case of Einstein's gravitational theory the results are the well-known equations of structure and motion of a neutral pole-dipole test particle in a given background gravitational field. In the case of Einstein's unified field theory the results are the same, providing we identify a certain symmetric second-rank tensor field appearing in Einstein's theory with the metric and gravitational field. We therefore discover not only the equations of structure and motion of a neutral test particle in Einstein's unified field theory, but we also discover what field in Einstein's theory plays the role of metric and gravitational field
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
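A greatly simplified version of the distinction drawn in the last sentence (fixed preference plus response error versus wavering preference) is the modal-choice test: a decision maker with a fixed preference and an error rate below one half should choose the preferred option in a majority of repetitions. The sketch below uses a plain exact binomial test, which is far cruder than QTest's order-constrained likelihood inference:

```python
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial p-value: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    pmf = [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)]
    return sum(q for q in pmf if q <= pmf[k] + 1e-12)
```

Choosing option A 17 times out of 20 is already inconsistent with indifference (p = 1/2), whereas a 10-of-20 split is perfectly consistent with it.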
Cao, Yi; Lu, Ru; Tao, Wei
2014-01-01
The local item independence assumption underlying traditional item response theory (IRT) models is often not met for tests composed of testlets. There are 3 major approaches to addressing this issue: (a) ignore the violation and use a dichotomous IRT model (e.g., the 2-parameter logistic [2PL] model), (b) combine the interdependent items to form a…
A signal detection-item response theory model for evaluating neuropsychological measures.
Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G
2018-02-05
Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
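The discrimination and response-bias constructs mentioned above can be sketched with the equal-variance Gaussian signal detection model; this is the generic textbook formulation, not the authors' specific SD-IRT parameterization:

```python
from statistics import NormalDist

def sdt_probabilities(d_prime, criterion):
    """Equal-variance Gaussian model: returns (hit rate, false-alarm
    rate) for memory discrimination d' and response criterion c."""
    phi = NormalDist().cdf
    hit = 1.0 - phi(criterion - d_prime / 2.0)
    fa = 1.0 - phi(criterion + d_prime / 2.0)
    return hit, fa

def estimate_d_prime(hit, fa):
    """Invert the model: d' = z(hit) - z(fa)."""
    z = NormalDist().inv_cdf
    return z(hit) - z(fa)

# Simulate an observer, then recover the discrimination parameter
hit, fa = sdt_probabilities(d_prime=1.5, criterion=0.0)
print(round(estimate_d_prime(hit, fa), 6))  # recovers 1.5
```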
Nursing intellectual capital theory: testing selected propositions.
Covell, Christine L; Sidani, Souraya
2013-11-01
To test the selected propositions of the middle-range theory of nursing intellectual capital. The nursing intellectual capital theory conceptualizes nursing knowledge's influence on patient and organizational outcomes. The theory proposes nursing human capital, nurses' knowledge, skills and experience, is related to the quality of patient care and nurse recruitment and retention of an inpatient care unit. Two factors in the work environment, nurse staffing and employer support for nurse continuing professional development, are proposed to influence nursing human capital's association with patient and organizational outcomes. A cross-sectional survey design. The study took place in 2008 in six Canadian acute care hospitals. Financial, human resource and risk data were collected from hospital departments and unit managers. Clearly specified empirical indicators quantified the study variables. The propositions of the theory were tested with data from 91 inpatient care units using structural equation modelling. The propositions associated with the nursing human capital concept were supported. The propositions associated with the employer support for nurse continuing professional development concept were not. The proposition that nurse staffing's influences on patient outcomes was mediated by the nursing human capital of an inpatient unit, was partially supported. Some of the theory's propositions were empirically validated. Additional theoretical work is needed to refine the operationalization and measurement of some of the theory's concepts. Further research with larger samples of data from different geographical settings and types of hospitals is required to determine if the theory can withstand empirical scrutiny. © 2013 Blackwell Publishing Ltd.
Item response theory analysis of the mechanics baseline test
Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.
2012-02-01
Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly but also the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
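The item parameters named in the abstract (difficulty, discrimination) correspond to the 2-parameter logistic (2PL) model; a minimal sketch of its response function and the Fisher information it implies, with illustrative parameter values rather than fitted values from the Mechanics Baseline Test:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with
    ability theta answers correctly an item with discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information the item contributes at ability theta;
    weakly discriminating items contribute little anywhere."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

# A discriminating item (a = 2) vs. a weak one (a = 0.4), both of
# average difficulty, evaluated at average ability (theta = 0):
print(round(item_information(0.0, 2.0, 0.0), 3))  # 1.0
print(round(item_information(0.0, 0.4, 0.0), 3))  # 0.04
```

Items with low information across the ability range are exactly the ones such an analysis would flag as ineffective discriminators.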
Hyland, Philip; Shevlin, Mark; Adamson, Gary; Boduszek, Daniel
2014-01-01
This study directly tests a central prediction of rational emotive behaviour therapy (REBT) that has received little empirical attention regarding the core and intermediate beliefs in the development of posttraumatic stress symptoms. A theoretically consistent REBT model of posttraumatic stress disorder (PTSD) was examined using structural equation modelling techniques among a sample of 313 trauma-exposed military and law enforcement personnel. The REBT model of PTSD provided a good fit to the data, χ²(356) = 599.173, p depreciation beliefs. Results were consistent with the predictions of REBT theory and provide strong empirical support that the cognitive variables described by REBT theory are critical cognitive constructs in the prediction of PTSD symptomology. © 2013 Wiley Periodicals, Inc.
Vo, Phuong T; Bogg, Tim
2015-01-01
Prior research identified assorted relations between trait and social cognition models of personality and engagement in physical activity. Using a representative U.S. sample (N = 957), the goal of the present study was to test two alternative structural models of the relationships among the extraversion-related facet of activity, the conscientiousness-related facet of industriousness, social cognitions from the Theory of Planned Behavior (perceived behavioral control, affective attitudes, subjective norms, intentions), Social Cognitive Theory (self-efficacy, outcome expectancies), and the Transtheoretical Model (behavioral processes of change), and engagement in physical activity. Path analyses with bootstrapping procedures were used to model direct and indirect effects of trait and social cognition constructs on physical activity through two distinct frameworks, the Theory of Planned Behavior and Neo-Socioanalytic Theory. While both models showed good internal fit, comparative model information criteria showed the Theory-of-Planned-Behavior-informed model provided a better fit. In the model, social cognitions fully mediated the relationships from the activity facet and industriousness to intentions for and engagement in physical activity, such that the relationships were primarily maintained by positive affective evaluations, positive expected outcomes, and confidence in overcoming barriers related to physical activity engagement. The resultant model, termed the Disposition-Belief-Motivation model, is proposed as a useful framework for organizing and integrating personality trait facets and social cognitions from various theoretical perspectives to investigate the expression of health-related behaviors, such as physical activity. Moreover, the results are discussed in terms of extending the application of the Disposition-Belief-Motivation model to longitudinal and intervention designs for physical activity engagement.
Towards a theory of tiered testing.
Hansson, Sven Ove; Rudén, Christina
2007-06-01
Tiered testing is an essential part of any resource-efficient strategy for the toxicity testing of a large number of chemicals, which is required for instance in the risk management of general (industrial) chemicals. In spite of this, no general theory seems to be available for the combination of single tests into efficient tiered testing systems. A first outline of such a theory is developed. It is argued that chemical, toxicological, and decision-theoretical knowledge should be combined in the construction of such a theory. A decision-theoretical approach for the optimization of test systems is introduced. It is based on expected utility maximization with simplified assumptions covering factual and value-related information that is usually missing in the development of test systems.
Item level diagnostics and model-data fit in item response theory ...
African Journals Online (AJOL)
Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed before further use of the items and the models that best fit ...
Neutron Star Models in Alternative Theories of Gravity
Manolidis, Dimitrios
We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado, we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.
Two Tests of Maslow's Theory of Need Fulfillment.
Betz, Ellen L.
1984-01-01
Conducted a two-part test of Maslow's theory of human motivation and explored the relationships between need deficiencies and (1) need importance and (2) life satisfaction in female college graduates (N=474). Results support Maslow's model regarding need deficiencies and their relationship to life satisfaction. (LLL)
Childhood abuse and criminal behavior: testing a general strain theory model.
Watts, Stephen J; McNulty, Thomas L
2013-10-01
This article draws on general strain theory (GST) to develop and test a model of the childhood abuse-crime relationship. Using data from the National Longitudinal Study of Adolescent Health (Add Health), we find that early childhood physical and sexual abuse are robust predictors of offending in adolescence, for the full sample and in equations disaggregated by gender. GST is partially supported in that the effects of childhood physical abuse on offending for both females and males are mediated by an index of depression symptoms, whereas the effect of sexual abuse among females appears to be mediated largely by closeness to mother. The effect of childhood sexual abuse among males, however, is more robust than among females and it persists despite controls for low self-control, ties to delinquent peers, school attachment, and closeness to mother. Theoretical implications of the findings are discussed.
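The mediation claim above (abuse predicts depression, which in turn predicts offending) can be illustrated with a product-of-slopes sketch on synthetic data; this simplified version uses assumed coefficients and omits the covariate controls in the authors' Add Health analysis:

```python
import random

random.seed(1)

def slope(xs, ys):
    """OLS slope of y regressed on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic data with a built-in mediation chain:
# abuse -> depression (a = 0.5) and depression -> offending (b = 0.8)
n = 500
abuse = [random.gauss(0, 1) for _ in range(n)]
depression = [0.5 * x + random.gauss(0, 1) for x in abuse]
offending = [0.8 * m + random.gauss(0, 1) for m in depression]

a_hat = slope(abuse, depression)
b_hat = slope(depression, offending)
indirect = a_hat * b_hat  # should land near 0.5 * 0.8 = 0.4
print(round(indirect, 2))
```

In practice the indirect effect a·b is tested with bootstrap confidence intervals, which is the same logic behind the bootstrapped path analyses used in several of the studies collected here.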
Testing special relativity theory using Compton scattering
International Nuclear Information System (INIS)
Contreras S, H.; Hernandez A, L.; Baltazar R, A.; Escareno J, E.; Mares E, C. A.; Hernandez V, C.; Vega C, H. R.
2010-10-01
The validity of the special relativity theory has been tested using Compton scattering. Since 1905 several experiments have been carried out to show that time, mass, and length change with velocity; in this work Compton scattering is used as a simple way to demonstrate the validity of relativity. The work was carried out through Monte Carlo calculations and experiments with different gamma-ray sources and a gamma-ray spectrometer with a 3 x 3 NaI(Tl) detector. The pulse-height spectra were collected and the Compton edge was observed. This information was used to determine the relationship between the electron's mass and energy from the position of the Compton "knee". The results were contrasted with two models of the photon-electron collision, one built using classical physics and the other using the special relativity theory. It was found that the calculated and experimental results fit the collision model based on special relativity. (Author)
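The Compton edge ("knee") position follows from relativistic collision kinematics; a minimal sketch for a Cs-137 source, which is an assumed example since the abstract does not name the sources used:

```python
def compton_edge_kev(e_gamma_kev):
    """Maximum kinetic energy transferred to the electron in Compton
    scattering (the spectrum's edge), from relativistic kinematics:
    T_max = 2E^2 / (m_e c^2 + 2E), with m_e c^2 = 511 keV."""
    me_c2 = 511.0
    return 2.0 * e_gamma_kev**2 / (me_c2 + 2.0 * e_gamma_kev)

# Cs-137 gamma line at 662 keV: edge expected near 478 keV
print(round(compton_edge_kev(662.0), 1))  # 477.7
```

Because the predicted edge position depends on the electron rest energy m_e c², comparing measured edges across several gamma-ray energies discriminates between the classical and relativistic collision models.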
Frameworks for analyzing and testing theories of gravity
International Nuclear Information System (INIS)
Lee, D.L.
1974-01-01
Theoretical frameworks are presented for the analysis and testing of gravitation theories--both metric and nonmetric. For nonmetric theories, the high precision Eotvos-Dicke-Braginskii (EDB) experiments are demonstrated to be powerful tests of their gravitational coupling to electromagnetic interactions. All known nonmetric theories are ruled out to within the precision of the EDB experiments. A new metric theory of gravity is presented that cannot be distinguished from general relativity in all current and planned solar system experiments. However, this theory has very different gravitational-wave properties. Hence, the need for further tests of metric theories beyond the Parametrized Post-Newtonian formalism is pointed out and the importance of the observation of gravitational waves as a tool for testing relativistic gravity in the future is emphasized. A theory-independent formalism delineating the properties of weak, plane gravitational waves in metric theories is set up. General conservation laws that follow from variational principles in metric theories of gravity are investigated. (U.S.)
Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.
Johnson, Shane D; Groff, Elizabeth R
2014-07-01
The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, though not without its own issues, may help to overcome. ABMs should become another method in the criminologists' toolbox to aid theory testing and falsification.
Statistical testing of the full-range leadership theory in nursing.
Kanste, Outi; Kääriäinen, Maria; Kyngäs, Helvi
2009-12-01
The aim of this study is to test statistically the structure of the full-range leadership theory in nursing. The data were gathered by postal questionnaires from nurses and nurse leaders working in healthcare organizations in Finland. A follow-up study was performed 1 year later. The sample consisted of 601 nurses and nurse leaders, and the follow-up study had 78 respondents. Theory was tested through structural equation modelling, standard regression analysis and two-way ANOVA. Rewarding transformational leadership seems to promote and passive laissez-faire leadership to reduce willingness to exert extra effort, perceptions of leader effectiveness and satisfaction with the leader. Active management-by-exception seems to reduce willingness to exert extra effort and perception of leader effectiveness. Rewarding transformational leadership remained as a strong explanatory factor of all outcome variables measured 1 year later. The data supported the main structure of the full-range leadership theory, lending support to the universal nature of the theory.
Dickhaus, Thorsten
2018-01-01
This textbook provides a self-contained presentation of the main concepts and methods of nonparametric statistical testing, with a particular focus on the theoretical foundations of goodness-of-fit tests, rank tests, resampling tests, and projection tests. The substitution principle is employed as a unified approach to the nonparametric test problems discussed. In addition to mathematical theory, it also includes numerous examples and computer implementations. The book is intended for advanced undergraduate, graduate, and postdoc students as well as young researchers. Readers should be familiar with the basic concepts of mathematical statistics typically covered in introductory statistics courses.
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module
Frielink, Noud; Schuengel, Carlo; Embregts, Petri J. C. M.
2018-01-01
The tenets of self-determination theory as applied to support were tested with structural equation modelling for 186 people with ID with a mild to borderline level of functioning. The results showed that (a) perceived autonomy support was positively associated with autonomous motivation and with satisfaction of need for autonomy, relatedness, and…
Ginis, Kathleen A Martin; Latimer, Amy E; Arbour-Nicitopoulos, Kelly P; Bassett, Rebecca L; Wolfe, Dalton L; Hanna, Steven E
2011-08-01
Little theory-based research has focused on understanding and increasing physical activity among people with physical disabilities. Testing a social cognitive theory-based model of determinants is important for identifying variables to target in physical activity-enhancing interventions. The aim of this study is to examine Social Cognitive Theory variables as predictors of physical activity among people living with spinal cord injury. Structural equation modeling was used to test a model of Social Cognitive Theory predictors of physical activity (n=160). The model explained 39% of the variance in physical activity. Self-regulation was the only significant, direct predictor. Self-regulatory efficacy and outcome expectations had indirect effects, mediated by self-regulation. Social Cognitive Theory is useful for predicting physical activity in people with spinal cord injury. Self-regulation is the most potent Social Cognitive Theory predictor of physical activity in people with spinal cord injury. Self-regulation and its determinants should be targeted in physical activity-enhancing interventions.
Maccia, Elizabeth S.; and others
An annotated bibliography of 20 items and a discussion of its significance was presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…
Using Classical Test Theory and Item Response Theory to Evaluate the LSCI
Schlingman, Wayne M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS
2011-01-01
Analyzing the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI), this project uses both Classical Test Theory (CTT) and Item Response Theory (IRT) to investigate the LSCI itself in order to better understand what it is actually measuring. We use Classical Test Theory to form a framework of results that can be used to evaluate the effectiveness of individual questions at measuring differences in student understanding and provide further insight into the prior results presented from this data set. In the second phase of this research, we use Item Response Theory to form a theoretical model that generates parameters accounting for a student's ability, a question's difficulty, and an estimate of the level of guessing. The combined results from our investigations using both CTT and IRT are used to better understand the learning that is taking place in classrooms across the country. The analysis will also allow us to evaluate the effectiveness of individual questions and determine whether the item difficulties are appropriately matched to the abilities of the students in our data set. These results may require that some questions be revised, motivating the need for further development of the LSCI. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
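An IRT model with ability, difficulty, and guessing parameters is the 3-parameter logistic (3PL) model; a minimal sketch with illustrative parameter values, not fitted LSCI values:

```python
import math

def p_3pl(theta, a, b, c):
    """3PL item response function: discrimination a, difficulty b,
    and pseudo-guessing floor c (the chance of a correct answer by
    guessing alone)."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Even a very low-ability student retains the guessing floor c,
# while a very high-ability student approaches certainty:
print(round(p_3pl(-10.0, 1.2, 0.0, 0.25), 3))  # 0.25
print(round(p_3pl(10.0, 1.2, 0.0, 0.25), 3))   # 1.0
```

A floor of c = 0.25 corresponds to blind guessing on a four-option multiple-choice item, which is why the guessing parameter matters when judging whether an item truly discriminates between ability levels.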
Testing strong interaction theories
International Nuclear Information System (INIS)
Ellis, J.
1979-01-01
The author discusses possible tests of the current theories of the strong interaction, in particular, quantum chromodynamics. High energy e⁺e⁻ interactions should provide an excellent means of studying the strong force. (W.D.L.)
A dual memory theory of the testing effect.
Rickard, Timothy C; Pan, Steven C
2017-06-05
A new theoretical framework for the testing effect-the finding that retrieval practice is usually more effective for learning than are other strategies-is proposed, the empirically supported tenet of which is that separate memories form as a consequence of study and test events. A simplest case quantitative model is derived from that framework for the case of cued recall. With no free parameters, that model predicts both proportion correct in the test condition and the magnitude of the testing effect across 10 experiments conducted in our laboratory, experiments that varied with respect to material type, retention interval, and performance in the restudy condition. The model also provides the first quantitative accounts of (a) the testing effect as a function of performance in the restudy condition, (b) the upper bound magnitude of the testing effect, (c) the effect of correct answer feedback, (d) the testing effect as a function of retention interval for the cases of feedback and no feedback, and (e) the effect of prior learning method on subsequent learning through testing. Candidate accounts of several other core phenomena in the literature, including test-potentiated learning, recognition versus cued recall training effects, cued versus free recall final test effects, and other select transfer effects, are also proposed. Future prospects and relations to other theories are discussed.
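The framework's tenet that study and test events leave separate memories suggests a simplest-case sketch: recall succeeds when either trace is retrieved. The retrieval probabilities below are assumed for illustration, not the paper's fitted values, and the independence assumption is ours:

```python
def recall_after_test(p_study, p_test_memory):
    """If study and test events leave separate, independently
    retrievable traces, final recall succeeds when either does."""
    return 1.0 - (1.0 - p_study) * (1.0 - p_test_memory)

# Restudy only strengthens the single study trace; testing adds a
# second trace. Illustrative (assumed) probabilities:
p_restudy = 0.55
p_tested = recall_after_test(0.40, 0.45)
print(round(p_tested - p_restudy, 2))  # 0.12, a positive testing effect
```

The sketch also shows why the testing effect should shrink when restudy performance is already high: the second trace has less room to add.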
DEFF Research Database (Denmark)
Pedersen, Rasmus Søndergaard; Rahbek, Anders
2017-01-01
We present novel theory for testing for reduction of GARCH-X type models with an exogenous (X) covariate to standard GARCH type models. To deal with the problems of potential nuisance parameters on the boundary of the parameter space as well as lack of identification under the null, we exploit a noticeable property of specific zero-entries in the inverse information of the GARCH-X type models. Specifically, we consider sequential testing based on two likelihood ratio tests; as demonstrated, the structure of the inverse information implies that the proposed test neither depends on whether the nuisance parameters lie on the boundary of the parameter space, nor on lack of identification. Our general results on GARCH-X type models are applied to Gaussian based GARCH-X models, GARCH-X models with Student's t-distributed innovations, as well as the integer-valued GARCH-X (PAR-X) models.
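The boundary problem the abstract addresses can be illustrated generically: when the parameter restricted under the null lies on the boundary of the parameter space, the usual chi-square critical value for a likelihood ratio test is replaced by a mixture quantile. This sketch shows the standard asymptotic adjustment, not the paper's specific sequential procedure:

```python
# On the boundary, -2 log LR is asymptotically a 50:50 mixture of
# chi2(0) and chi2(1) rather than a plain chi2(1), which lowers the
# 5% critical value.
from statistics import NormalDist

def chi2_1_quantile(p):
    """p-quantile of chi-square with 1 df, using chi2(1) = Z^2,
    so q(p) = z((1 + p) / 2)^2."""
    z = NormalDist().inv_cdf((1.0 + p) / 2.0)
    return z * z

standard_crit = chi2_1_quantile(0.95)  # interior-point test, ~3.84
# Mixture: P(stat > c) = 0.5 * P(chi2(1) > c) = 0.05  =>  c = q(0.90)
boundary_crit = chi2_1_quantile(0.90)  # boundary test, ~2.71
print(round(standard_crit, 2), round(boundary_crit, 2))
```

Using the interior-point critical value on a boundary problem makes the test conservative, which is one reason tests that are invariant to the boundary issue, like the one the abstract proposes, are attractive.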
New Pathways between Group Theory and Model Theory
Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz
2017-01-01
This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...
Parametrized tests of post-Newtonian theory using Advanced LIGO and Einstein Telescope
International Nuclear Information System (INIS)
Mishra, Chandra Kant; Arun, K. G.; Iyer, Bala R.; Sathyaprakash, B. S.
2010-01-01
General relativity has very specific predictions for the gravitational waveforms from inspiralling compact binaries obtained using the post-Newtonian (PN) approximation. We investigate the extent to which the measurement of the PN coefficients, possible with the second generation gravitational-wave detectors such as the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) and the third generation gravitational-wave detectors such as the Einstein Telescope (ET), could be used to test post-Newtonian theory and to put bounds on a subclass of parametrized-post-Einstein theories which differ from general relativity in a parametrized sense. We demonstrate this possibility by employing the best inspiralling waveform model for nonspinning compact binaries which is 3.5PN accurate in phase and 3PN in amplitude. Within the class of theories considered, Advanced LIGO can test the theory at 1.5PN and thus the leading tail term. Future observations of stellar mass black hole binaries by ET can test the consistency between the various PN coefficients in the gravitational-wave phasing over the mass range of 11-44 solar masses. The choice of the lower frequency cutoff is important for testing post-Newtonian theory using the ET. The bias in the test arising from the assumption of nonspinning binaries is indicated.
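Schematically, such parametrized tests act on the PN expansion of the frequency-domain phase; the notation below is an illustrative sketch of the approach, not the paper's exact conventions:

```latex
% Stationary-phase PN phasing (schematic), with v = (\pi M f)^{1/3}:
\Psi(f) \;=\; 2\pi f t_c - \phi_c - \frac{\pi}{4}
         \;+\; \frac{3}{128\,\eta\,v^{5}} \sum_{k} \psi_k\, v^{k}
% General relativity fixes every coefficient \psi_k as a known function
% of the component masses; a parametrized test frees one coefficient,
%   \psi_k \;\to\; \psi_k \,(1 + \delta\psi_k),
% measures \delta\psi_k from the data, and checks consistency with the
% general-relativity value \delta\psi_k = 0.
```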
Tests of two convection theories for red giant and red supergiant envelopes
Stothers, Richard B.; Chin, Chao-Wen
1995-01-01
Two theories of stellar envelope convection are considered here in the context of red giants and red supergiants of intermediate to high mass: Boehm-Vitense's standard mixing-length theory (MLT) and Canuto & Mazzitelli's new theory incorporating the full spectrum of turbulence (FST). Both theories assume incompressible convection. Two formulations of the convective mixing length are also evaluated: l proportional to the local pressure scale height (H_P) and l proportional to the distance from the upper boundary of the convection zone (z). Applications to test both theories are made by calculating stellar evolutionary sequences into the red phase of core helium burning. Since the theoretically predicted effective temperatures for cool stars are known to be sensitive to the assigned value of the mixing length, this quantity has been individually calibrated for each evolutionary sequence. The calibration is done in a composite Hertzsprung-Russell diagram for the red giant and red supergiant members of well-observed Galactic open clusters. The MLT model requires the constant of proportionality for the convective mixing length to vary by a small but statistically significant amount with stellar mass, whereas the FST model succeeds in all cases with the mixing length simply set equal to z. The structure of the deep stellar interior, however, remains very nearly unaffected by the choices of convection theory and mixing length. Inside the convective envelope itself, a density inversion always occurs, but is somewhat smaller for the convectively more efficient MLT model. On physical grounds the FST model is preferable, and seems to alleviate the problem of finding the proper mixing length.
Lin, Bih-Jiau; Chiou, Wen-Bin
2010-06-01
English competency has become essential for obtaining a better job or succeeding in higher education in Taiwan. Thus, passing the General English Proficiency Test is important for college students in Taiwan. The current study applied Ajzen's theory of planned behavior and the notions of outcome expectancy and self-efficacy from Bandura's social cognitive theory to investigate college students' intentions to take the General English Proficiency Test. The formal sample consisted of 425 undergraduates (217 women, 208 men; M age = 19.5 yr., SD = 1.3). The theory of planned behavior showed greater predictive ability (R2 = 33%) of intention than the social cognitive theory (R2 = 7%) in regression analysis and made a unique contribution to prediction of actual test-taking behavior one year later in logistic regression. Within-model analyses indicated that subjective norm in theory of planned behavior and outcome expectancy in social cognitive theory are crucial factors in predicting intention. Implications for enhancing undergraduates' intentions to take the English proficiency test are discussed.
Riles, K
1998-01-01
The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.
Tests of the electroweak theory at LEP
International Nuclear Information System (INIS)
Schaile, D.
1994-01-01
LEP offers a rich choice of tests of the electroweak theory such as the measurement of hadronic and leptonic cross sections, leptonic forward-backward asymmetries, τ polarization asymmetries, partial widths and forward-backward asymmetries of heavy quark flavours, of the inclusive qq̄ charge asymmetry and of final state radiation in hadronic events. We discuss experimental aspects of these measurements and their theoretical parametrization and summarize the results available so far. We present several analyses which reveal specific aspects of the results, such as their constraints on Standard Model parameters and on new particles, the sensitivity to deviations from the Standard Model multiplet structure and an analysis in a framework which provides a model independent search for new physics. (orig.)
Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene
2015-05-01
In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.
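The contrast the abstract draws between hypothesis testing and information-criterion model selection can be made concrete. The following is a minimal sketch, not the authors' simulation design: the group means, sample sizes, and the Gaussian AIC form below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three groups with ordered true means and equal variances (invented values)
groups = [rng.normal(mu, 1.0, 100) for mu in (0.0, 0.5, 1.0)]
y = np.concatenate(groups)
n = y.size

def aic(rss, k_means):
    # Gaussian AIC up to an additive constant: n*log(RSS/n) + 2*(number of parameters),
    # counting k_means mean parameters plus one common variance
    return n * np.log(rss / n) + 2 * (k_means + 1)

rss_equal = ((y - y.mean()) ** 2).sum()                      # H0: mu1 = mu2 = mu3
rss_free = sum(((g - g.mean()) ** 2).sum() for g in groups)  # unconstrained means
prefer_free = aic(rss_free, 3) < aic(rss_equal, 1)           # model-selection verdict
```

An order-constrained model (mu1 <= mu2 <= mu3, the "theory-based" hypothesis of the abstract) would be fitted with isotonic regression and scored with order-aware criteria; that refinement is omitted here for brevity.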
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
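Bayesian model comparison of the kind the authors apply can be illustrated in miniature. The sketch below is not their hierarchical toolbox model; it compares a fixed "guessing" account (p = 0.5 on every trial) against a hypothetical strategy with an unknown adherence rate, using the closed-form Beta marginal likelihood. The data counts are invented.

```python
import math

def log_marglik_guess(k, n):
    # Marginal likelihood of the fixed "random guessing" account: p = 0.5 on every trial
    return n * math.log(0.5)

def log_marglik_flex(k, n):
    # Strategy with unknown adherence rate p ~ Uniform(0, 1):
    # integral of p^k (1-p)^(n-k) dp = Beta(k+1, n-k+1), computed via log-gamma
    return math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)

# Invented data: 29 of 40 binary choices match the strategy's predictions
k, n = 29, 40
log_bf = log_marglik_flex(k, n) - log_marglik_guess(k, n)  # log Bayes factor vs guessing
```

A positive log Bayes factor means the data favor the strategy account over guessing; averaging the likelihood over the prior on p is what penalizes an overly flexible strategy repertoire, which is the mechanism the authors use to contain "strategy sprawl."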
Test of atomic theory by photoelectron spectrometry with synchrotron radiation
International Nuclear Information System (INIS)
Krause, M.O.
1984-01-01
The successful combination of synchrotron radiation with electron spectrometry, accomplished at Daresbury, England and Orsay, France, made it possible to investigate σ_x and β_x continuously over the very soft x-ray or the uv range of photon energies. The detailed and highly differentiated data resulting from this advanced experimentation put theory to a stringent test. In the interplay between theory and experiment, sophisticated Hartree-Fock (HF) based models were developed which included both relativistic and many-electron effects. These theoretical models have provided us with a better insight than previously possible into the physics of the photon-atom interaction and the electronic structure and dynamics of atoms. However, critical experiments continue to be important for further improvements of theory. A number of such experiments are discussed in this presentation. The dynamic properties determined in these studies include in addition to σ_x and β_x the spin polarization parameters. As a result the comparison between theory and experiment becomes rigorous, detailed and comprehensive. 46 references, 6 figures
SLAC physicists develop test for string theory
Yajnik, Juhi
2006-01-01
"Under certain conditions, string theory solves many of the questions wracking the minds of physicists, but until recently it had one major flaw - it could not be tested. SLAC (Stanford Linear Accelerator Center) scientists have found a way to test this revolutionary theory, which posits that there are 10 or 11 dimensions in our universe" (1 page)
A person fit test for IRT models for polytomous items
Glas, Cornelis A.W.; Dagohoy, A.V.
2007-01-01
A person fit test based on the Lagrange multiplier test is presented for three item response theory models for polytomous items: the generalized partial credit model, the sequential model, and the graded response model. The test can also be used in the framework of multidimensional ability
Vaughn, Leigh Ann
2017-03-01
This article introduces the need-support model, which proposes that regulatory focus can affect subjective support for the needs proposed by self-determination theory (autonomy, competence, and relatedness), and support of these needs can affect subjective labeling of experiences as promotion-focused and prevention-focused. Three studies tested these hypotheses ( N = 2,114). Study 1 found that people recall more need support in promotion-focused experiences than in prevention-focused experiences, and need support in their day yesterday (with no particular regulatory focus) fell in between. Study 2 found that experiences of higher need support were more likely to be labeled as promotion-focused rather than prevention-focused, and that each need accounted for distinct variance in the labeling of experiences. Study 3 varied regulatory focus within a performance task and found that participants in the promotion condition engaged in need-support inflation, whereas participants in the prevention condition engaged in need-support deflation. Directions for future research are discussed.
Explaining the black-white gap in cognitive test scores: Toward a theory of adverse impact.
Cottrell, Jonathan M; Newman, Daniel A; Roisman, Glenn I
2015-11-01
In understanding the causes of adverse impact, a key parameter is the Black-White difference in cognitive test scores. To advance theory on why Black-White cognitive ability/knowledge test score gaps exist, and on how these gaps develop over time, the current article proposes an inductive explanatory model derived from past empirical findings. According to this theoretical model, Black-White group mean differences in cognitive test scores arise from the following racially disparate conditions: family income, maternal education, maternal verbal ability/knowledge, learning materials in the home, parenting factors (maternal sensitivity, maternal warmth and acceptance, and safe physical environment), child birth order, and child birth weight. Results from a 5-wave longitudinal growth model estimated on children in the NICHD Study of Early Child Care and Youth Development from ages 4 through 15 years show significant Black-White cognitive test score gaps throughout early development that did not grow significantly over time (i.e., significant intercept differences, but not slope differences). Importantly, the racially disparate conditions listed above can account for the relation between race and cognitive test scores. We propose a parsimonious 3-Step Model that explains how cognitive test score gaps arise, in which race relates to maternal disadvantage, which in turn relates to parenting factors, which in turn relate to cognitive test scores. This model and results offer to fill a need for theory on the etiology of the Black-White ethnic group gap in cognitive test scores, and attempt to address a missing link in the theory of adverse impact. (c) 2015 APA, all rights reserved.
An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test
Kahraman, Nilüfer
2014-01-01
Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…
An empirical comparison of Item Response Theory and Classical Test Theory
Directory of Open Access Journals (Sweden)
Špela Progar
2008-11-01
Based on nonlinear models between the measured latent variable and the item response, item response theory (IRT) enables independent estimation of item and person parameters and local estimation of measurement error. These properties of IRT are also the main theoretical advantages of IRT over classical test theory (CTT). Empirical evidence, however, often failed to discover consistent differences between IRT and CTT parameters and between invariance measures of CTT and IRT parameter estimates. In this empirical study a real data set from the Third International Mathematics and Science Study (TIMSS 1995) was used to address the following questions: (1) How comparable are CTT and IRT based item and person parameters? (2) How invariant are CTT and IRT based item parameters across different participant groups? (3) How invariant are CTT and IRT based item and person parameters across different item sets? The findings indicate that the CTT and the IRT item/person parameters are very comparable, that the CTT and the IRT item parameters show similar invariance property when estimated across different groups of participants, that the IRT person parameters are more invariant across different item sets, and that the CTT item parameters are at least as much invariant in different item sets as the IRT item parameters. The results furthermore demonstrate that, with regards to the invariance property, IRT item/person parameters are in general empirically superior to CTT parameters, but only if the appropriate IRT model is used for modelling the data.
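The often-reported comparability of CTT and IRT item difficulties can be seen in a small simulation. This is an illustrative Rasch sketch with invented parameters, not the TIMSS 1995 data analyzed in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 2000, 20
theta = rng.normal(0.0, 1.0, n_persons)   # person abilities
b = np.linspace(-2.0, 2.0, n_items)       # true Rasch item difficulties

# Simulate dichotomous responses under the Rasch model
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(int)

# CTT item difficulty is the proportion correct; under the generating IRT model
# it is a smooth decreasing function of b, so the two parameter sets rank items alike
ctt_p = x.mean(axis=0)
r = np.corrcoef(ctt_p, b)[0, 1]           # strongly negative correlation
```

With a wide ability distribution, the proportion-correct statistic is nearly linear in the Rasch difficulty over this range, which is one intuition for why empirical comparisons of CTT and IRT item parameters find them so similar.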
Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan
2016-01-01
This study investigated the multiple-choice test of understanding of vectors (TUV), by applying item response theory (IRT). The difficulty, discriminatory, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the parscale program. The TUV ability is an ability parameter, here estimated assuming…
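The three-parameter logistic model fitted to the TUV items has a standard closed form. A minimal sketch follows; the item parameter values are invented for illustration, not the TUV estimates:

```python
import numpy as np

def p_correct(theta, a, b, c):
    # 3PL item response function: discrimination a, difficulty b, guessing floor c
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item: moderate discrimination, average difficulty,
# and a guessing floor of 0.2, as for a five-option multiple-choice item
theta = np.linspace(-3.0, 3.0, 7)
probs = p_correct(theta, a=1.2, b=0.0, c=0.2)
```

The guessing parameter c sets the lower asymptote that distinguishes the 3PL from the two-parameter model: even the lowest-ability examinees answer correctly with probability approaching c.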
Help-Seeking Decisions of Battered Women: A Test of Learned Helplessness and Two Stress Theories.
Wauchope, Barbara A.
This study tested the learned helplessness theory, stress theory, and a modified stress theory to determine the best model for predicting the probability that a woman would seek help when she experienced severe violence from a male partner. The probability was hypothesized to increase as the stress of the violence experienced increased. Data were…
2 + 1 quantum gravity as a toy model for the 3 + 1 theory
International Nuclear Information System (INIS)
Ashtekar, A.; Husain, V.; Smolin, L.; Samuel, J.; Utah Univ., Salt Lake City, UT
1989-01-01
2 + 1 Einstein gravity is used as a toy model for testing a program for non-perturbative canonical quantisation of the 3 + 1 theory. The program can be successfully implemented in the model and leads to a surprisingly rich quantum theory. (author)
Assessing difference between classical test theory and item ...
African Journals Online (AJOL)
Assessing difference between classical test theory and item response theory methods in scoring primary four multiple choice objective test items. ... All research participants were ranked on the CTT number correct scores and the corresponding IRT item pattern scores from their performance on the PRISMADAT. Wilcoxon ...
Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence
Energy Technology Data Exchange (ETDEWEB)
Heusen, M.; Shalchi, A., E-mail: husseinm@myumanitoba.ca, E-mail: andreasm4@yahoo.com [Department of Physics and Astronomy, University of Manitoba, Winnipeg, MB R3T 2N2 (Canada)
2017-04-20
In the literature, one can find various analytical theories for perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path approaches asymptotically the quasi-linear limit as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling as predicted by UNLT theory as well. The original NLGC theory disagrees with all performed simulations regardless what the parallel mean free path is. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but compared to UNLT theory the agreement is inferior. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.
New tests of cumulative prospect theory and the priority heuristic
Directory of Open Access Journals (Sweden)
Michael H. Birnbaum
2008-04-01
Previous tests of cumulative prospect theory (CPT) and of the priority heuristic (PH) found evidence contradicting these two models of risky decision making. However, those tests were criticized because they had characteristics that might "trigger" use of other heuristics. This paper presents new tests that avoid those characteristics. Expected values of the gambles are nearly equal in each choice. In addition, if a person followed expected value (EV), expected utility (EU), CPT, or PH in these tests, she would shift her preferences in the same direction as shifts in EV or EU. In contrast, the transfer of attention exchange model (TAX) and a similarity model predict that people will reverse preferences in the opposite direction. Results contradict the PH, even when PH is modified to include a preliminary similarity evaluation using the PH parameters. New tests of probability-consequence interaction were also conducted. Strong interactions were observed, contrary to PH. These results add to the growing bodies of evidence showing that neither CPT nor PH is an accurate description of risky decision making.
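The two competing models can be made concrete for two-outcome gains gambles. Below is a sketch assuming the Tversky-Kahneman (1992) median CPT parameters for gains and the published PH decision steps with a 10% aspiration level; the example gambles are invented, not Birnbaum's test stimuli:

```python
def v(x, alpha=0.88):
    # CPT value function for gains (Tversky & Kahneman, 1992, median estimate)
    return x ** alpha

def w(p, gamma=0.61):
    # CPT probability weighting function for gains
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def cpt_value(gamble):
    # Two-outcome gains gamble ((x_hi, p_hi), (x_lo, p_lo)) with x_hi > x_lo >= 0
    (x_hi, p_hi), (x_lo, _) = gamble
    return v(x_lo) + w(p_hi) * (v(x_hi) - v(x_lo))

def priority_heuristic(g1, g2):
    # PH for gains: (1) minimum gains, (2) probability of minimum gains, (3) maximum gains;
    # aspiration level = 1/10 of the maximum gain, probability threshold = 0.1
    min1, min2 = g1[1][0], g2[1][0]
    if abs(min1 - min2) >= 0.1 * max(g1[0][0], g2[0][0]):
        return 1 if min1 > min2 else 2
    pmin1, pmin2 = g1[1][1], g2[1][1]
    if abs(pmin1 - pmin2) >= 0.1:
        return 1 if pmin1 < pmin2 else 2   # lower chance of the worst outcome wins
    return 1 if g1[0][0] > g2[0][0] else 2

# Invented example pair (not the paper's stimuli)
g1 = ((100.0, 0.5), (0.0, 0.5))
g2 = ((50.0, 0.99), (0.0, 0.01))
```

For this pair both models happen to favor the near-sure gamble g2; the point of the paper's new tests is to construct choices where CPT and PH predict preference shifts in one direction while TAX and the similarity model predict the opposite.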
SIRU utilization. Volume 1: Theory, development and test evaluation
Musoff, H.
1974-01-01
The theory, development, and test evaluations of the Strapdown Inertial Reference Unit (SIRU) are discussed. The statistical failure detection and isolation, single position calibration, and self alignment techniques are emphasized. Circuit diagrams of the system components are provided. Mathematical models are developed to show the performance characteristics of the subsystems. Specific areas of the utilization program are identified as: (1) error source propagation characteristics and (2) local level navigation performance demonstrations.
Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.
Osterlind, Steven J.; Martois, John S.
This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…
How Often Is the Misfit of Item Response Theory Models Practically Significant?
Sinharay, Sandip; Haberman, Shelby J.
2014-01-01
Standard 3.9 of the Standards for Educational and Psychological Testing (1999) demands evidence of model fit when item response theory (IRT) models are employed to data from tests. Hambleton and Han (2005) and Sinharay (2005) recommended the assessment of practical significance of misfit of IRT models, but…
Hong, Quan Nha; Coutu, Marie-France; Berbiche, Djamal
2017-01-01
The Work Role Functioning Questionnaire (WRFQ) was developed to assess workers' perceived ability to perform job demands and is used to monitor presenteeism. Still, few studies on its validity can be found in the literature. The purpose of this study was to assess the items and factorial composition of the Canadian French version of the WRFQ (WRFQ-CF). Two measurement approaches were used to test the WRFQ-CF: Classical Test Theory (CTT) and non-parametric Item Response Theory (IRT). A total of 352 completed questionnaires were analyzed. Four-factor and three-factor models were tested and showed good fit with 14 items (Root Mean Square Error of Approximation (RMSEA) = 0.06, Standardized Root Mean Square Residual (SRMR) = 0.04, Bentler Comparative Fit Index (CFI) = 0.98) and with 17 items (RMSEA = 0.059, SRMR = 0.048, CFI = 0.98), respectively. Using IRT, 13 problematic items were identified, of which 9 were common with CTT. This study tested different models, with fewer problematic items found in a three-factor model. Using non-parametric IRT and CTT for item purification gave complementary results. IRT is still scarcely used and can be an interesting alternative method for enhancing the quality of a measurement instrument. More studies are needed on the WRFQ-CF to refine its items and factorial composition.
Gender, general theory of crime and computer crime: an empirical test.
Moon, Byongook; McCluskey, John D; McCluskey, Cynthia P; Lee, Sangwon
2013-04-01
Regarding the gender gap in computer crime, studies consistently indicate that boys are more likely than girls to engage in various types of computer crime; however, few studies have examined the extent to which traditional criminology theories account for gender differences in computer crime and the applicability of these theories in explaining computer crime across gender. Using a panel of 2,751 Korean youths, the current study tests the applicability of the general theory of crime in explaining the gender gap in computer crime and assesses the theory's utility in explaining computer crime across gender. Analyses show that self-control theory performs well in predicting illegal use of others' resident registration number (RRN) online for both boys and girls, as predicted by the theory. However, low self-control, a dominant criminogenic factor in the theory, fails to mediate the relationship between gender and computer crime and is inadequate in explaining illegal downloading of software in both boy and girl models. Theoretical implication of the findings and the directions for future research are discussed.
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Methods for testing transport models
International Nuclear Information System (INIS)
Singer, C.; Cox, D.
1991-01-01
Substantial progress has been made over the past year on six aspects of the work supported by this grant. As a result, we have in hand for the first time a fairly complete set of transport models and improved statistical methods for testing them against large databases. We also have initial results of such tests. These results indicate that careful application of presently available transport theories can reproduce a remarkably wide variety of tokamak data reasonably well.
Implementation of an Improved Adaptive Testing Theory
Al-A'ali, Mansoor
2007-01-01
Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and the examinees' responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…
Cosmological tests of a scale covariant theory of gravitation
International Nuclear Information System (INIS)
Owen, J.R.
1979-01-01
The Friedmann models with β = 0 are subjected to several optical and radio tests within the standard and scale covariant theories of gravitation. Within standard cosmology, both interferometric and scintillation data are interpreted in terms of selection effects and evolution. Within the context of scale covariant cosmology are derived: (1) the full solution to Einstein's gravitational equations in atomic units for a matter dominated universe, (2) the study of the magnitude vs. redshift relation for elliptical galaxies, (3) the derivation of the evolutionary parameter used in (2), (4) the isophotal angular diameter vs. redshift relation, (5) the metric angular diameter vs. redshift relation, (6) the N(m) vs. magnitude relation for QSO's and their m vs z relation, and finally (7) the integrated and differential expressions for the number count vs. radio flux test. The results, both in graphical and tabular form, are presented for four gauges (i.e. parametrized relations between atomic and gravitational units). No contradiction between the new theory and the data is found with any of the tests studied. For some gauges, which are suggested by a recent analysis of the time variation of the Moon's period which is discussed in the text in terms of the new theory, the effect of the deceleration parameter on cosmological predictions is enhanced over standard cosmology and it is possible to say that the data are more easily reconciled with an open universe. Within the same gauge, the main features of both the N(m) vs. m and m-z test are accounted for by the same simple evolutionary parametrization whereas different evolutionary rates were indicated by interpretation within standard cosmology. The same consistency, lacking in standard cosmology on this level of analysis, is achieved for the integrated and differential number count - radio flux tests within the same gauge
Testing static tradeoff theory against pecking order models of capital ...
African Journals Online (AJOL)
We test two models with the purpose of finding the best empirical explanation for corporate financing choice of a cross section of 27 Nigerian quoted companies. The models were developed to represent the Static tradeoff Theory and the Pecking order Theory of capital structure with a view to make comparison between ...
A Model of Statistics Performance Based on Achievement Goal Theory.
Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.
2003-01-01
Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…
Field theory and the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Dudas, E [Orsay, LPT (France)
2014-07-01
This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.
Xu, Jian
2017-01-01
The present study investigated test-taking motivation in L2 listening testing context by applying Expectancy-Value Theory as the framework. Specifically, this study was intended to examine the complex relationships among expectancy, importance, interest, listening anxiety, listening metacognitive awareness, and listening test score using data from a large-scale and high-stakes language test among Chinese first-year undergraduates. Structural equation modeling was used to examine the mediating...
Testing alternative theories of dark matter with the CMB
International Nuclear Information System (INIS)
Li Baojiu; Barrow, John D.; Mota, David F.; Zhao, HongSheng
2008-01-01
We propose a method to study and constrain modified gravity theories for dark matter using CMB temperature anisotropies and polarization. We assume that the theories considered here have already passed the matter power-spectrum test of large-scale structure. With this requirement met, we show that a modified gravity theory can be specified by parametrizing the time evolution of its dark-matter density contrast, which is completely controlled by the dark-matter stress history. We calculate how the stress history with a given parametrization affects the CMB observables, and a qualitative discussion of the physical effects involved is supplemented with numerical examples. It is found that, in general, alternative gravity theories can be efficiently constrained by the CMB temperature and polarization spectra. There exist, however, special cases where modified gravity cannot be distinguished from the CDM model even by using both CMB and matter power spectrum observations, nor can it be efficiently constrained by other observables in perturbed cosmologies. Our results show how the stress properties of dark matter, which determine the evolution of both the density perturbations and the gravitational potential, can be effectively investigated using just the general conservation equations and without assuming any specific gravitational theory within a wide class.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Blake, John; Yaghmaian, Rana; Brooks, Jessica; Fais, Connor; Chan, Fong
2018-05-01
The aim of the study was to test an expanded model of Snyder's hope theory for prediction of participation for individuals with spinal cord injury (SCI). Statistical model testing focused on evaluation of hope theory constructs (i.e., agency thoughts and pathways thoughts) as serial mediators of relationships between attachment and community participation. Quantitative, cross-sectional, descriptive design using multiple regression and correlational techniques. The sample comprised 108 persons with SCI recruited from spinal cord injury advocacy organizations in the United States, the United Kingdom, and Canada. Secure attachment, avoidant attachment, anxious attachment, and the hope constructs were significantly related to participation. Significant mediational effects were observed when agency thoughts and pathways thoughts were specified as mediators in series between attachment and community participation for people with SCI (i.e., agency specified as M1 and pathways specified as M2). Results provide support for Snyder's theoretical conceptualization and the use of hope-based interventions by rehabilitation practitioners for improving global participation outcomes for people with SCI who experience attachment-related difficulties. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
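The serial-mediation structure tested in this study (attachment → agency thoughts → pathways thoughts → participation) can be estimated with ordinary least squares regressions, the indirect effect being the product of the three path coefficients. The sketch below uses synthetic data (the path values 0.5, 0.6, 0.4 and the large simulated sample are illustrative assumptions, not the study's estimates; the study's own n was 108):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000                                  # large synthetic sample (the study's n was 108)

# Synthetic data following the hypothesized chain X -> M1 -> M2 -> Y.
X = rng.normal(size=n)                    # e.g., secure attachment
M1 = 0.5 * X + rng.normal(size=n)         # agency thoughts
M2 = 0.6 * M1 + rng.normal(size=n)        # pathways thoughts
Y = 0.4 * M2 + rng.normal(size=n)         # community participation

def coef(y, *preds):
    """OLS coefficient of the FIRST predictor, controlling for the rest."""
    A = np.column_stack([np.ones(len(y))] + list(preds))
    return np.linalg.lstsq(A, y, rcond=None)[0][1]

a1 = coef(M1, X)                          # X -> M1
d21 = coef(M2, M1, X)                     # M1 -> M2, controlling X
b2 = coef(Y, M2, M1, X)                   # M2 -> Y, controlling M1 and X
indirect = a1 * d21 * b2                  # serial indirect effect a1*d21*b2
print(f"a1={a1:.2f}, d21={d21:.2f}, b2={b2:.2f}, indirect={indirect:.3f}")
```

In practice the indirect effect's significance is assessed with bootstrap confidence intervals rather than the point estimate alone.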
Lattice models and conformal field theories
International Nuclear Information System (INIS)
Saleur, H.
1988-01-01
Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the representation theory of the Virasoro algebra are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
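Several of the classical-test-theory descriptive checks listed here (floor and ceiling effects, the relationship between item responses and the total score) are simple to compute. A minimal sketch on synthetic data (the 5-item, 0-4 scale and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5-item PRO scale scored 0-4, 200 respondents (synthetic data).
responses = rng.integers(0, 5, size=(200, 5))
totals = responses.sum(axis=1)

# Floor/ceiling effects: fraction of respondents at the scale minimum/maximum.
floor_effect = float(np.mean(totals == 0))
ceiling_effect = float(np.mean(totals == 20))

# Corrected item-total correlation: each item against the sum of the others.
item_total_r = []
for j in range(responses.shape[1]):
    rest = totals - responses[:, j]
    item_total_r.append(np.corrcoef(responses[:, j], rest)[0, 1])

print("floor:", floor_effect, "ceiling:", ceiling_effect)
print("corrected item-total r:", [round(r, 2) for r in item_total_r])
```

With real data, low corrected item-total correlations or large floor/ceiling fractions would flag items or scales for closer qualitative review.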
Testing the causal theory of reference.
Domaneschi, Filippo; Vignolo, Massimiliano; Di Paola, Simona
2017-04-01
Theories of reference are a crucial research topic in analytic philosophy. Since the publication of Kripke's Naming and Necessity, most philosophers have endorsed the causal/historical theory of reference. The goal of this paper is twofold: (i) to discuss a method for testing experimentally the causal theory of reference for proper names by investigating linguistic usage and (ii) to present the results from two experiments conducted with that method. Data collected in our experiments confirm the causal theory of reference for people proper names and for geographical proper names. A secondary but interesting result is that the semantic domain affects reference assignment: while with people proper names speakers tend to assign the semantic reference, with geographical proper names they are prompted to assign the speaker's reference. Copyright © 2016 Elsevier B.V. All rights reserved.
Advances in the application of decision theory to test-based decision making
van der Linden, Willem J.
This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical
An evaluation of iced bridge hanger vibrations through wind tunnel testing and quasi-steady theory
DEFF Research Database (Denmark)
Gjelstrup, Henrik; Georgakis, Christos T.; Larsen, A.
2012-01-01
…roughness is also examined. The static force coefficients are used to predict parameter regions where aerodynamic instability of the iced bridge hanger might be expected to occur, through use of an adapted theoretical 3-DOF quasi-steady galloping instability model, which accounts for sectional axial rotation. A comparison between the 3-DOF model and the instabilities found through two degree-of-freedom (2-DOF) dynamic tests is presented. It is shown that, although there is good agreement between the instabilities found through use of the quasi-steady theory and the dynamic tests, discrepancies exist, indicating the possible inability of quasi-steady theory to fully predict these vibrational instabilities.
Theory for the three-dimensional Mercedes-Benz model of water
Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.
2009-11-01
The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
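The "right answer" baseline described here comes from Metropolis Monte Carlo sampling. As a rough illustration of that machinery (not the MB model itself, which adds a tetrahedral hydrogen-bonding term to the Lennard-Jones core), here is a minimal canonical-ensemble Metropolis sweep for Lennard-Jones spheres in reduced units; particle number, box size, temperature, and step size are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

N, L, T = 20, 10.0, 2.0                      # particles, box edge, reduced temperature
sites = [(i, j, k) for i in range(3) for j in range(3) for k in range(3)]
pos = np.array(sites[:N], dtype=float) * (L / 3.0) + 1.0   # non-overlapping start

def lj_energy(p):
    """Total Lennard-Jones energy with minimum-image periodic boundaries."""
    e = 0.0
    for i in range(N - 1):
        d = p[i + 1:] - p[i]
        d -= L * np.round(d / L)             # minimum-image convention
        inv6 = 1.0 / ((d ** 2).sum(axis=1)) ** 3
        e += float(np.sum(4.0 * (inv6 ** 2 - inv6)))
    return e

energy = lj_energy(pos)
accepted = 0
n_moves = 2000
for _ in range(n_moves):
    i = rng.integers(N)
    trial = pos.copy()
    trial[i] = (trial[i] + rng.normal(0.0, 0.2, 3)) % L    # random displacement
    de = lj_energy(trial) - energy
    if de < 0.0 or rng.random() < np.exp(-de / T):         # Metropolis rule
        pos, energy = trial, energy + de
        accepted += 1

print(f"acceptance rate: {accepted / n_moves:.2f}, energy: {energy:.2f}")
```

An isothermal-isobaric version, as used in the paper, would add volume-change trial moves with the corresponding acceptance criterion.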
Comparison of Classical Test Theory and Item Response Theory in Individual Change Assessment
Jabrayilov, Ruslan; Emons, Wilco H. M.; Sijtsma, Klaas
2016-01-01
Clinical psychologists are advised to assess clinical and statistical significance when assessing change in individual patients. Individual change assessment can be conducted using either the methodologies of classical test theory (CTT) or item response theory (IRT). Researchers have been optimistic
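One standard CTT-based tool for individual change assessment of the kind compared here is the Jacobson-Truax reliable change index. A minimal sketch (the pretest/posttest scores, norm SD, and reliability below are hypothetical):

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax RCI: change divided by the standard error of the difference."""
    sem = sd_pre * math.sqrt(1.0 - reliability)       # standard error of measurement
    se_diff = math.sqrt(2.0) * sem                    # SE of a difference score
    return (post - pre) / se_diff

# Hypothetical patient: pretest 40, posttest 30, norm SD 10, test reliability 0.85.
rci = reliable_change_index(40, 30, 10.0, 0.85)
print(f"RCI = {rci:.2f}")                             # |RCI| > 1.96 -> reliable change
```

An IRT-based alternative would instead compare the change in estimated latent trait against its (person-specific) standard error, which is one source of the differences the study examines.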
Theory Testing Using Case Studies
DEFF Research Database (Denmark)
Sørensen, Pernille Dissing; Løkke, Ann-Kristina
2006-01-01
…design. Finally, we discuss the epistemological logic, i.e., the value to larger research programmes, of such studies and, following Lakatos, conclude that the value of theory-testing case studies lies beyond naïve falsification and in their contribution to developing research programmes in a progressive…
A test theory of special relativity
International Nuclear Information System (INIS)
Mansouri, R.; Sexl, R.U.
1977-01-01
Various second-order optical tests of special relativity are discussed within the framework of a test theory developed previously. Owing to the low accuracy of the Kennedy-Thorndike experiment, the Lorentz contraction is known by direct experiments only to an accuracy of a few percent. To improve this accuracy several experiments are suggested. (author)
Numerical Test of Different Approximations Used in the Transport Theory of Energetic Particles
Qin, G.; Shalchi, A.
2016-05-01
Recently developed theories for perpendicular diffusion work remarkably well. The diffusion coefficients they provide agree with test-particle simulations performed for different turbulence setups ranging from slab and slab-like models to two-dimensional and noisy reduced MHD turbulence. However, such theories are still based on different analytical approximations. In the current paper we use a test-particle code to explore the different approximations used in diffusion theory. We benchmark different guiding center approximations, simplifications of higher-order correlations, and the Taylor-Green-Kubo formula. We demonstrate that guiding center approximations work very well as long as the particle's unperturbed Larmor radius is smaller than the perpendicular correlation length of the turbulence. Furthermore, the Taylor-Green-Kubo formula and the definition of perpendicular diffusion coefficients via mean square displacements provide the same results. The only approximation that was used in the past in nonlinear diffusion theory that fails is to replace fourth-order correlations by a product of two second-order correlation functions. In more advanced nonlinear theories, however, this type of approximation is no longer used. Therefore, we confirm the validity of modern diffusion theories as a result of the work presented in the current paper.
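The equivalence confirmed here between the Taylor-Green-Kubo formula and the mean-square-displacement definition of the diffusion coefficient can be demonstrated on a toy process. The sketch below uses an uncorrelated 1D random walk (an illustrative stand-in for the test-particle trajectories, with unit time step and unit velocity variance), for which both routes should give D = 1/2:

```python
import numpy as np

rng = np.random.default_rng(2)

# 1D toy: 5000 walkers, uncorrelated unit-variance "velocities", dt = 1.
n_walkers, n_steps = 5000, 200
v = rng.normal(0.0, 1.0, size=(n_walkers, n_steps))
x = np.cumsum(v, axis=1)

# Einstein route: MSD(t) = 2 D t, so D is half the fitted slope.
t = np.arange(1, n_steps + 1)
msd = np.mean(x ** 2, axis=0)
D_msd = np.polyfit(t, msd, 1)[0] / 2.0

# Taylor-Green-Kubo route (discrete time): D = C(0)/2 + sum_{k>=1} C(k),
# where C(k) is the velocity autocorrelation function at lag k.
vacf = np.array([np.mean(v[:, :n_steps - k] * v[:, k:]) for k in range(20)])
D_tgk = vacf[0] / 2.0 + vacf[1:].sum()

print(f"D (mean square displacement): {D_msd:.3f}")
print(f"D (Taylor-Green-Kubo):        {D_tgk:.3f}")
```

For correlated velocities, as in real turbulence simulations, the lag sum must extend well past the velocity decorrelation time.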
Perla, Rocco J.; Carifio, James
2011-01-01
Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporate and formalize both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…
A Proposed Model of Jazz Theory Knowledge Acquisition
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
Experiences gained in testing a theory for modelling groundwater flow in heterogeneous media
DEFF Research Database (Denmark)
Christensen, Steen; R.L., Cooley
2003-01-01
Usually, small-scale model error is present in groundwater modelling because the model only represents average system characteristics having the same form as the drift, and small-scale variability is neglected. These errors cause the true errors of a regression model to be correlated. Theory and an example show that the errors also contribute to bias in the estimates of model parameters. This bias originates from model nonlinearity. In spite of this bias, predictions of hydraulic head are nearly unbiased if the model intrinsic nonlinearity is small. Individual confidence and prediction intervals are accurate if the t-statistic is multiplied by a correction factor. The correction factor can be computed from the true error second moment matrix, which can be determined when the stochastic properties of the system characteristics are known.
Testing Modified Gravity Theories via Wide Binaries and GAIA
Pittordis, Charalambos; Sutherland, Will
2018-06-01
The standard ΛCDM model based on General Relativity (GR) including cold dark matter (CDM) is very successful at fitting cosmological observations, but recent non-detections of candidate dark matter (DM) particles mean that various modified-gravity theories remain of significant interest. The latter generally involve modifications to GR below a critical acceleration scale ~10^{-10} m s^{-2}. Wide-binary (WB) star systems with separations ≳ 5 kAU provide an interesting test for modified gravity, due to being in or near the low-acceleration regime and presumably containing negligible DM. Here, we explore the prospects for new observations pending from the GAIA spacecraft to provide tests of GR against MOND or TeVeS-like theories in a regime only partially explored to date. In particular, we find that a histogram of (3D) binary relative velocities, relative to the equilibrium circular velocity predicted from the (2D) projected separation, shows a rather sharp feature in this distribution for standard gravity, with an 80th (90th) percentile value close to 1.025 (1.14) with rather weak dependence on the eccentricity distribution. However, MOND/TeVeS theories produce a shifted distribution, with a significant increase in these upper percentiles. In MOND-like theories without an external field effect, there are large shifts of order unity. With the external field effect included, the shifts are considerably reduced to ~0.04-0.08, but are still potentially detectable statistically given reasonably large samples and good control of contaminants. In principle, follow-up of GAIA-selected wide binaries with ground-based radial velocities accurate to ≲ 0.03 km s^{-1} should be able to produce an interesting new constraint on modified-gravity theories.
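The velocity-ratio statistic used here can be illustrated with a toy Newtonian Monte Carlo. For a bound Kepler orbit, vis-viva gives v² = GM(2/r − 1/a), and the circular speed at separation r is v_c² = GM/r, so (v/v_c)² = 2 − r/a independently of mass and semi-major axis. The sketch below samples orbital phases uniformly in time (via Kepler's equation) and an assumed uniform eccentricity distribution; it uses the true 3D separation, not the projected separation of the paper, so the percentiles are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

e = rng.uniform(0.0, 0.9, n)            # eccentricities (assumed distribution)
M = rng.uniform(0.0, 2.0 * np.pi, n)    # mean anomalies (uniform in time)

# Solve Kepler's equation E - e sin E = M by fixed-point iteration.
E = M.copy()
for _ in range(50):
    E = M + e * np.sin(E)

r_over_a = 1.0 - e * np.cos(E)          # instantaneous separation / semi-major axis
# Vis-viva: (v / v_circ(r))^2 = 2 - r/a, independent of GM and a.
ratio = np.sqrt(2.0 - r_over_a)

print("80th percentile:", round(float(np.percentile(ratio, 80)), 3))
print("90th percentile:", round(float(np.percentile(ratio, 90)), 3))
```

In a modified-gravity variant, the circular-velocity normalisation in the low-acceleration regime changes, shifting these upper percentiles upward — the signature the paper targets.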
Gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Witten, E.
1989-01-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)
The Effects of Test Length and Sample Size on Item Parameters in Item Response Theory
Sahin, Alper; Anil, Duygu
2017-01-01
This study investigates the effects of sample size and test length on item-parameter estimation in test development utilizing three unidimensional dichotomous models of item response theory (IRT). For this purpose, a real language test comprised of 50 items was administered to 6,288 students. Data from this test was used to obtain data sets of…
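Simulation studies of this kind start by generating item responses under an IRT model. A minimal sketch of 2PL response generation (the parameter distributions are illustrative assumptions; only the sample size and test length are taken from the study):

```python
import numpy as np

rng = np.random.default_rng(4)

n_persons, n_items = 6288, 50          # sizes taken from the study
theta = rng.normal(0, 1, n_persons)    # latent abilities
a = rng.uniform(0.5, 2.0, n_items)     # discriminations (assumed range)
b = rng.normal(0, 1, n_items)          # difficulties (assumed distribution)

# 2PL: P(correct) = 1 / (1 + exp(-a * (theta - b)))
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b[None, :])))
responses = (rng.random((n_persons, n_items)) < p).astype(int)

# Sanity check: harder items (larger b) attract fewer correct answers.
p_values = responses.mean(axis=0)      # classical item difficulty (p-values)
print("corr(b, p-value):", round(float(np.corrcoef(b, p_values)[0, 1]), 2))
```

Parameter-recovery studies then re-estimate a and b from subsamples of varying size and test length and compare the estimates with these generating values.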
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures…
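The core idea of the friction theory is that viscosity beyond the dilute-gas term is linear in the repulsive and attractive pressure contributions of a cubic equation of state. A toy sketch using the simpler van der Waals pressure split (the cited work uses SRK/PR/PRSV, and every number below is illustrative, not fitted to any real oil):

```python
R = 8.314  # J/(mol K)

def viscosity_ftheory(T, v, a_eos, b_eos, eta0, kappa_r, kappa_a):
    """Friction theory: dilute-gas viscosity plus friction terms linear in the
    repulsive and attractive pressure contributions of a cubic EOS."""
    p_rep = R * T / (v - b_eos)          # repulsive pressure term
    p_att = -a_eos / v ** 2              # attractive pressure term (vdW form)
    return eta0 + kappa_r * p_rep + kappa_a * p_att

# Illustrative numbers only (SI units, not fitted to any fluid):
eta = viscosity_ftheory(T=350.0, v=1.2e-4, a_eos=1.5, b_eos=9.0e-5,
                        eta0=2.0e-5, kappa_r=1.0e-12, kappa_a=5.0e-13)
print(f"viscosity ~ {eta:.3e} Pa*s")
```

In the actual models the friction coefficients are temperature-dependent functions characterized per fluid family, which is what the tuning procedure adjusts.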
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice, knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Leventhal, Brian C.; Stone, Clement A.
2018-01-01
Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…
Quiver gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Yagi, Junya
2015-01-01
We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
Economic Modelling in Institutional Economic Theory
Directory of Open Access Journals (Sweden)
Wadim Strielkowski
2017-06-01
Our paper is centered around the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that have been postulated by the classics of institutional theory and have found their way into the foundations of institutional economics. We propose scientific ideas concerning new, innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research to formalize our results and maximise the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.
Kochukhov, O.; Ryabchikova, T. A.
2018-02-01
A series of recent theoretical atomic diffusion studies has addressed the challenging problem of predicting inhomogeneous vertical and horizontal chemical element distributions in the atmospheres of magnetic ApBp stars. Here we critically assess the most sophisticated of such diffusion models - based on a time-dependent treatment of the atomic diffusion in a magnetized stellar atmosphere - by direct comparison with observations as well as by testing the widely used surface mapping tools with the spectral line profiles predicted by this theory. We show that the mean abundances of Fe and Cr are grossly underestimated by the time-dependent theoretical diffusion model, with discrepancies reaching a factor of 1000 for Cr. We also demonstrate that Doppler imaging inversion codes, based either on modelling of individual metal lines or line-averaged profiles simulated according to theoretical three-dimensional abundance distribution, are able to reconstruct correct horizontal chemical spot maps despite ignoring the vertical abundance variation. These numerical experiments justify a direct comparison of the empirical two-dimensional Doppler maps with theoretical diffusion calculations. This comparison is generally unfavourable for the current diffusion theory, as very few chemical elements are observed to form overabundance rings in the horizontal field regions as predicted by the theory and there are numerous examples of element accumulations in the vicinity of radial field zones, which cannot be explained by diffusion calculations.
Testing rank-dependent utility theory for health outcomes.
Oliver, Adam
2003-10-01
Systematic violations of expected utility theory (EU) have been reported in the context of both money and health outcomes. Rank-dependent utility theory (RDU) is currently the most popular and influential alternative theory of choice under circumstances of risk. This paper reports a test of the descriptive performance of RDU compared to EU in the context of health. When one of the options is certain, violations of EU that can be explained by RDU are found. When both options are risky, no evidence that RDU is a descriptive improvement over EU is found, though this finding may be due to the low power of the tests. Copyright 2002 John Wiley & Sons, Ltd.
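The RDU functional being tested here ranks outcomes and applies decision weights as differences of a transformed cumulative probability. A minimal sketch (the square-root utility and the simple power weighting function are illustrative assumptions; empirical work typically uses inverse-S weighting):

```python
def rdu(outcomes, probs, u=lambda x: x ** 0.5, w=lambda p: p ** 0.7):
    """Rank-dependent utility: rank outcomes from best to worst and weight each
    by the difference of the transformed cumulative probability."""
    ranked = sorted(zip(outcomes, probs), key=lambda t: -t[0])
    value, cum = 0.0, 0.0
    for x, p in ranked:
        weight = w(cum + p) - w(cum)   # decision weight for this rank position
        value += weight * u(x)
        cum += p
    return value

# Lottery: 50% chance of 100 (say, health-utility units), 50% chance of 0.
outcomes, probs = [100.0, 0.0], [0.5, 0.5]
eu = sum(p * (x ** 0.5) for x, p in zip(outcomes, probs))   # EU: w(p) = p
print("EU :", round(eu, 3))
print("RDU:", round(rdu(outcomes, probs), 3))
```

With w(p) = p the function reduces exactly to expected utility, which is the nested comparison the descriptive tests in this paper exploit.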
The individual-oriented and social-oriented Chinese bicultural self: testing the theory.
Lu, Luo
2008-06-01
The author proposes a bicultural self theory for contemporary Chinese individuals, encompassing 2 main components: the individual-oriented self and the social-oriented self. The social orientation is rooted in traditional Chinese conceptualization of the self, whereas the individual orientation has evolved and developed under Western influences along with recent societal modernization. The author conducted a series of 5 studies to test the theory and relate the model to important issues in current personality and social psychological research, such as cultural individualism-collectivism, self-construals, motivation, cognition, emotion, and well-being. A total of 977 university students in Taiwan participated. The author found that contrasting self-aspects were differentially associated with the aforementioned constructs, as theoretically predicted. This evidence thus generally supported the bicultural self model.
Warped models in string theory
International Nuclear Information System (INIS)
Acharya, B.S.; Benini, F.; Valandro, R.
2006-12-01
Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)
Early Tests of Piagetian Theory Through World War II.
Beins, Bernard C
2016-01-01
Psychologists recognized the importance of Jean Piaget's theory from its inception. Within a year of the appearance of his first book translated into English, The Language and Thought of the Child (J. Piaget, 1926), it had been reviewed and welcomed; shortly thereafter, psychologists began testing the tenets of the theory empirically. The author traces the empirical testing of his theory in the 2 decades following publication of his initial book. A review of the published literature through the World War II era reveals that the research resulted in consistent failure to support the theoretical mechanisms that Piaget proposed. Nonetheless, the theory ultimately gained traction to become the bedrock of developmental psychology. Reasons for its persistence may include a possible lack of awareness by psychologists about the lack of empirical support, its breadth and complexity, and a lack of a viable alternate theory. As a result, the theory still exerts influence in psychology even though its dominance has diminished.
Song, Chen; Corry, Ben
2011-01-01
The macroscopic Nernst-Planck (NP) theory has often been used for predicting ion channel currents in recent years, but the validity of this theory at the microscopic scale has not been tested. In this study we systematically tested the ability of the NP theory to accurately predict channel currents by combining and comparing the results with those of Brownian dynamics (BD) simulations. To thoroughly test the theory in a range of situations, calculations were made in a series of simplified cylindrical channels with radii ranging from 3 to 15 Å, in a more complex 'catenary' channel, and in a realistic model of the mechanosensitive channel MscS. The extensive tests indicate that the NP equation is applicable in narrow ion channels provided that accurate concentrations and potentials can be input as the currents obtained from the combination of BD and NP match well with those obtained directly from BD simulations, although some discrepancies are seen when the ion concentrations are not radially uniform. This finding opens a door to utilising the results of microscopic simulations in continuum theory, something that is likely to be useful in the investigation of a range of biophysical and nano-scale applications and should stimulate further studies in this direction.
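In one dimension at steady state, the Nernst-Planck flux tested here has a closed-form integrating-factor solution once the potential profile is known. A minimal sketch (the barrier shape, channel length, diffusion coefficient, and boundary concentrations are all illustrative assumptions, not values from the study):

```python
import numpy as np

# 1D steady-state Nernst-Planck flux through a channel with a known
# dimensionless potential profile u(x) = z*e*phi(x)/kT (shape assumed here).
L = 1.0e-9                         # channel length, m
x = np.linspace(0.0, L, 1001)
u = 4.0 * np.sin(np.pi * x / L)    # illustrative energy barrier, ~4 kT high

D = 1.0e-9                         # diffusion coefficient, m^2/s
c0, cL = 100.0, 10.0               # boundary concentrations, mol/m^3

# Integrating-factor solution of J = -D * (dc/dx + c * du/dx):
#   J = D * (c0*exp(u(0)) - cL*exp(u(L))) / integral_0^L exp(u) dx
dx = x[1] - x[0]
integral = float(np.sum(0.5 * (np.exp(u[:-1]) + np.exp(u[1:]))) * dx)
J = D * (c0 * np.exp(u[0]) - cL * np.exp(u[-1])) / integral
print(f"flux J = {J:.3e} mol m^-2 s^-1")
```

The study's combined BD/NP approach effectively supplies accurate concentration and potential inputs of this kind from microscopic simulation rather than assuming them.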
An elastic-visco-plastic damage model: from theory to application
International Nuclear Information System (INIS)
Wang, X.C.; Habraken, A.M.
1996-01-01
An energy-based two-variable damage theory is applied to Bodner's model, yielding an elastic-viscoplastic damage model. Some theoretical details are described in this paper. The parameter identification procedure is discussed and a complete set of parameters for an aluminium alloy is presented. Numerical modelling of the laboratory tests is used to validate the model. An industrial aeronautic rod fabrication process is simulated and some numerical results are presented in this paper. (orig.)
Oliveira, Arnaldo
2007-01-01
This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.
Methods for testing transport models
International Nuclear Information System (INIS)
Singer, C.; Cox, D.
1993-01-01
This report documents progress to date under a three-year contract for developing "Methods for Testing Transport Models." The work described includes (1) choice of best methods for producing "code emulators" for analysis of very large global energy confinement databases, (2) recent applications of stratified regressions for treating individual measurement errors as well as calibration/modeling errors randomly distributed across various tokamaks, (3) Bayesian methods for utilizing prior information due to previous empirical and/or theoretical analyses, (4) extension of code emulator methodology to profile data, (5) application of nonlinear least squares estimators to simulation of profile data, (6) development of more sophisticated statistical methods for handling profile data, (7) acquisition of a much larger experimental database, and (8) extensive exploratory simulation work on a large variety of discharges using recently improved models for transport theories and boundary conditions. From all of this work, it has been possible to define a complete methodology for testing new sets of reference transport models against much larger multi-institutional databases.
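The "code emulator" idea in item (1) can be illustrated by fitting a cheap surrogate to the output of an expensive simulation over a parameter scan. The sketch below uses a mock power-law confinement scaling; the functional form, parameter names, and exponents are assumptions for illustration, not values from the report.

```python
import numpy as np

# Illustrative "code emulator": fit a cheap log-linear surrogate to a
# (here, mock) expensive transport code scanned over its inputs.
rng = np.random.default_rng(0)

def expensive_code(I_p, n_e):
    """Stand-in for a transport simulation: tau_E ~ I_p^0.9 * n_e^0.4."""
    return 0.05 * I_p**0.9 * n_e**0.4

# One-time scan of the parameter space with the "expensive" code
I_p = rng.uniform(0.5, 2.0, 200)   # plasma current (MA), assumed range
n_e = rng.uniform(1.0, 10.0, 200)  # density (10^19 m^-3), assumed range
tau = expensive_code(I_p, n_e)

# Log-linear regression recovers a power-law emulator of the code output
X = np.column_stack([np.ones_like(I_p), np.log(I_p), np.log(n_e)])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
exponents = coef[1:]   # scaling exponents recovered by the emulator
```

Once fitted, the emulator can be evaluated millions of times against a database at negligible cost, which is the role such surrogates play in database-wide model testing.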
On low rank classical groups in string theory, gauge theory and matrix models
International Nuclear Information System (INIS)
Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun
2004-01-01
We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature.
Powell, Rachael; Pattison, Helen M; Francis, Jill J
2016-01-01
Chlamydia is a common sexually transmitted infection that has potentially serious consequences unless detected and treated early. The health service in the UK offers clinic-based testing for chlamydia but uptake is low. Identifying the predictors of testing behaviours may inform interventions to increase uptake. Self-tests for chlamydia may facilitate testing and treatment in people who avoid clinic-based testing. Self-testing and being tested by a health care professional (HCP) involve two contrasting contexts that may influence testing behaviour. However, little is known about how predictors of behaviour differ as a function of context. In this study, theoretical models of behaviour were used to assess factors that may predict intention to test in two different contexts: self-testing and being tested by a HCP. Individuals searching for or reading about chlamydia testing online were recruited using Google Adwords. Participants completed an online questionnaire that addressed previous testing behaviour and measured constructs of the Theory of Planned Behaviour and Protection Motivation Theory, which propose a total of eight possible predictors of intention. The questionnaire was completed by 310 participants. Sufficient data for multiple regression were provided by 102 and 118 respondents for self-testing and testing by a HCP respectively. Intention to self-test was predicted by vulnerability and self-efficacy, with a trend-level effect for response efficacy. Intention to be tested by a HCP was predicted by vulnerability, attitude and subjective norm. Thus, intentions to carry out two testing behaviours with very similar goals can have different predictors depending on test context. We conclude that interventions to increase self-testing should be based on evidence specifically related to test context.
Testing the Neutral Theory of Biodiversity with Human Microbiome Datasets
Li, Lianwei; Ma, Zhanshan (Sam)
2016-01-01
The human microbiome project (HMP) has made it possible to test important ecological theories for arguably the most important ecosystem for human health: the human microbiome. The limited number of existing studies has reported conflicting evidence in the case of the neutral theory; the present study aims to comprehensively test the neutral theory with extensive HMP datasets covering all five major body sites inhabited by the human microbiome...
The ovenbird (Seiurus aurocapilla) as a model for testing food-value theory
Streby, Henry M.; Peterson, Sean M.; Scholtens, Brian; Monroe, Adrian; Andersen, David
2013-01-01
Food-value theory states that territorial animals space themselves such that each territory contains adequate food for rearing young. The ovenbird (Seiurus aurocapilla) is often cited as a species for which this hypothesis is supported because ovenbird territory size is inversely related to ground-invertebrate abundance within territories. However, little is known about juvenile ovenbird diet and whether food availability is accurately assessed using ground-sampling methods. We examined the relationship between ground-litter food availability and juvenile ovenbird diet in mixed northern hardwood-coniferous forests of north-central Minnesota. We sampled food availability with pitfall traps and litter samples, and concurrently sampled diet of juvenile ovenbirds from stomach samples. We found that juvenile ovenbirds were fed selectively from available food resources. In addition, we found that both ground-sampling methods greatly under-sampled forest caterpillars and snails, which together comprised 63% of juvenile ovenbird diet by mass. Combined with recent radio-telemetry findings that spot-mapping methods can poorly estimate territory size for forest songbirds, our results suggest that comparisons of spot-mapped ovenbird territories with ground-sampled invertebrate availability may not be reliable tests of food-value theory.
How to test the special theory of relativity on rotating earth
International Nuclear Information System (INIS)
Abolghasem, H.; Khadjehpoor, M.R.; Mansouri, R.
1988-02-01
In the framework of a one-parameter test theory of special relativity, the difference between transport and Einstein synchronization on the rotating earth is calculated. For the special theory of relativity this difference vanishes. Therefore, experiments in which these synchronization procedures are compared test the special theory of relativity. (author). 8 refs
Doping Among Professional Athletes in Iran: A Test of Akers's Social Learning Theory.
Kabiri, Saeed; Cochran, John K; Stewart, Bernadette J; Sharepour, Mahmoud; Rahmati, Mohammad Mahdi; Shadmanfaat, Syede Massomeh
2018-04-01
The use of performance-enhancing drugs (PED) is common among Iranian professional athletes. As this phenomenon is a social problem, the main purpose of this research is to explain why athletes engage in "doping" activity, using social learning theory. For this purpose, a sample of 589 professional athletes from Rasht, Iran, was used to test assumptions related to social learning theory. The results showed that there are positive and significant relationships between the components of social learning theory (differential association, differential reinforcement, imitation, and definitions) and doping behavior (past, present, and future use of PED). The structural modeling analysis indicated that the components of social learning theory account for 36% of the variance in past doping behavior, 35% of the variance in current doping behavior, and 32% of the variance in future use of PED.
Frisch on Testing of Business Cycle Theories
Boumans, M.
1995-01-01
An important identifying assumption for business cycle models is contained in the mathematical form of the model, which determines the nature of its possible movements. Tinbergen's and Frisch's original understanding of business cycle theories was that of a closed model, containing only endogenous
Trifiletti, L B; Gielen, A C; Sleet, D A; Hopkins, K
2005-06-01
Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury prevention research. The authors conducted a systematic review to evaluate the published literature from 1980 to 2001 on behavioral and social science theory applications to unintentional injury prevention and control. Electronic database searches in PubMed and PsycINFO identified articles that combined behavioral and social sciences theories and models and injury causes. The authors identified some articles that examined behavioral and social science theories and models and unintentional injury topics, but found that several important theories have never been applied to unintentional injury prevention. Among the articles identified, the PRECEDE PROCEED Model was cited most frequently, followed by the Theory of Reasoned Action/Theory of Planned Behavior and Health Belief Model. When behavioral and social sciences theories and models were applied to unintentional injury topics, they were most frequently used to guide program design, implementation or develop evaluation measures; few examples of theory testing were found. Results suggest that the use of behavioral and social sciences theories and models in unintentional injury prevention research is only marginally represented in the mainstream, peer-reviewed literature. Both the fields of injury prevention and behavioral and social sciences could benefit from greater collaborative research to enhance behavioral approaches to injury control.
A course on basic model theory
Sarbadhikari, Haimanti
2017-01-01
This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute an introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Complete Model-Based Equivalence Class Testing for the ETCS Ceiling Speed Monitor
DEFF Research Database (Denmark)
Braunstein, Cécile; Haxthausen, Anne Elisabeth; Huang, Wen-ling
2014-01-01
In this paper we present a new test model written in SysML and an associated blackbox test suite for the Ceiling Speed Monitor (CSM) of the European Train Control System (ETCS). The model is publicly available and intended to serve as a novel benchmark for investigating new testing theories...
Testing students' e-learning via Facebook through Bayesian structural equation modeling.
Salarzadeh Jenatabadi, Hashem; Moghavvemi, Sedigheh; Wan Mohamed Radzi, Che Wan Jasimah Bt; Babashamsi, Parastoo; Arashi, Mohammad
2017-01-01
Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and technology use in the context of e-learning via Facebook are re-examined in this study using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.
Generalization of the test theory of relativity to noninertial frames
International Nuclear Information System (INIS)
Abolghasem, G.H.; Khajehpour, M.R.H.; Mansouri, R.
1988-08-01
We present a generalized test theory of special relativity, using a noninertial frame. Within the framework of the special theory of relativity the transport- and Einstein-synchronizations are equivalent on a rigidly rotating disk. But in any theory with a preferred frame such an equivalence does not hold. The time difference resulting from the two synchronization procedures is a measurable quantity within the reach of existing clock systems on the earth. The final result contains a term which depends on the angular velocity of the rotating system, and hence measures an absolute effect. This term is of crucial importance in our test theory of the special relativity. (author). 13 refs
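The size of the rotation-dependent term can be illustrated with the standard Sagnac-type estimate 2ωA/c² for a signal path enclosing the equatorial disc. This is a textbook order-of-magnitude figure, not a value quoted in the paper, and the constants below are rounded.

```python
import math

# Order-of-magnitude estimate of the rotation-dependent synchronization
# term: Sagnac-type time difference dt = 2*omega*A/c^2 around the equator.
omega = 7.292e-5   # Earth's angular velocity (rad/s)
R = 6.378e6        # equatorial radius (m)
c = 2.998e8        # speed of light (m/s)

A = math.pi * R**2             # area enclosed by the equatorial path
dt = 2.0 * omega * A / c**2    # time difference (s)
dt_ns = dt * 1e9               # roughly 2e2 ns, within clock resolution
```

A term of this size is what makes the comparison experimentally accessible with existing clock systems on the earth, as the abstract notes.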
Students Working Online for Group Projects: A Test of an Extended Theory of Planned Behaviour Model
Cheng, Eddie W. L.
2017-01-01
This study examined an extended theory of planned behaviour (TPB) model that specified factors affecting students' intentions to collaborate online for group work. Past behaviour, past experience and actual behavioural control were incorporated in the extended TPB model. The mediating roles of attitudes, subjective norms and perceived behavioural…
Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons.
Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang
2017-07-06
Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
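As a hedged illustration of the Gaussian MCS picture discussed above, Highland's parametrization (a simpler relative of the Molière/Fano/Hanson calculation cited in the abstract) can be evaluated for a radiotherapy-energy proton. The beam energy, target thickness, and water radiation length below are assumed example values, not figures from the paper.

```python
import math

def highland_theta0(T_MeV, L_cm, X0_cm=36.08, m_MeV=938.272):
    """Highland's parametrization of the Gaussian MCS angle theta0 (rad):
    theta0 = (14.1 MeV / pv) * sqrt(L/X0) * (1 + (1/9) log10(L/X0))."""
    pv = T_MeV * (T_MeV + 2.0 * m_MeV) / (T_MeV + m_MeV)  # MeV
    t = L_cm / X0_cm                                      # thickness in radiation lengths
    return (14.1 / pv) * math.sqrt(t) * (1.0 + (1.0 / 9.0) * math.log10(t))

# Assumed example: 160 MeV proton traversing 10 cm of water
theta0 = highland_theta0(160.0, 10.0)   # a few tens of mrad
```

Comparing a Monte Carlo model's outgoing-angle width against a closed form like this is the kind of theory-based check the paper performs (with the more accurate Molière-based variants).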
Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan
2016-12-01
This study investigated the multiple-choice test of understanding of vectors (TUV), by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the PARSCALE program. The TUV ability is an ability parameter, here estimated assuming unidimensionality and local independence. Moreover, all distractors of the TUV were analyzed from item response curves (IRC) that represent simplified IRT. Data were gathered on 2392 science and engineering freshmen from three universities in Thailand. The results revealed IRT analysis to be useful in assessing the test since its item parameters are independent of the ability parameters. The IRT framework reveals item-level information, and indicates appropriate ability ranges for the test. Moreover, the IRC analysis can be used to assess the effectiveness of the test's distractors. Both IRT and IRC approaches reveal test characteristics beyond those revealed by the classical analysis methods of tests. Test developers can apply these methods to diagnose and evaluate the features of items at various ability levels of test takers.
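The three-parameter logistic (3PL) model used in the analysis can be written P(θ) = c + (1 − c)/(1 + e^(−a(θ−b))), with discrimination a, difficulty b, and guessing parameter c. A minimal sketch with assumed item parameters (not the fitted TUV values):

```python
import numpy as np

# 3PL item response function; a, b, c below are illustrative, not TUV fits.
def p_correct(theta, a, b, c):
    """Probability of a correct response at ability theta under the 3PL model."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                # ability grid
p = p_correct(theta, a=1.2, b=0.5, c=0.2)    # one hypothetical item
```

At very low ability the curve flattens at the guessing floor c, and at θ = b it passes through c + (1 − c)/2; these are the item-level features an IRT analysis reads off each curve.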
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...
Constraint theory multidimensional mathematical model management
Friedman, George J
2017-01-01
Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...
I can do that: the impact of implicit theories on leadership role model effectiveness.
Hoyt, Crystal L; Burnette, Jeni L; Innella, Audrey N
2012-02-01
This research investigates the role of implicit theories in influencing the effectiveness of successful role models in the leadership domain. Across two studies, the authors test the prediction that incremental theorists ("leaders are made") compared to entity theorists ("leaders are born") will respond more positively to being presented with a role model before undertaking a leadership task. In Study 1, measuring people's naturally occurring implicit theories of leadership, the authors showed that after being primed with a role model, incremental theorists reported greater leadership confidence and less anxious-depressed affect than entity theorists following the leadership task. In Study 2, the authors demonstrated the causal role of implicit theories by manipulating participants' theory of leadership ability. They replicated the findings from Study 1 and demonstrated that identification with the role model mediated the relationship between implicit theories and both confidence and affect. In addition, incremental theorists outperformed entity theorists on the leadership task.
Determinants of choice of delivery place: Testing rational choice theory and habitus theory.
Broda, Anja; Krüger, Juliane; Schinke, Stephanie; Weber, Andreas
2018-05-07
The current study uses two antipodal social science theories, the rational choice theory and the habitus theory, and applies these to describe how women choose between intraclinical (i.e., hospital-run birth clinics) and extraclinical (i.e., midwife-led birth centres or home births) delivery places. Data were collected in a cross-sectional questionnaire-based survey among 189 women. A list of 22 determinants, conceptualized to capture the two theoretical concepts, were rated on a 7-point Likert scale with 1 = unimportant to 7 = very important. The analytic method was structural equation modelling. A model was built, in which the rational choice theory and the habitus theory as latent variables predicted the choice of delivery place. With regards to the choice of delivery place, 89.3% of the women wanted an intraclinical and 10.7% an extraclinical delivery place at the time of their last child's birth. Significant differences between women with a choice of an intraclinical or extraclinical delivery place were found for 14 of the 22 determinants. In the structural equation model, rational choice theory determinants predicted a choice of intraclinical delivery and habitus theory determinants predicted a choice of extraclinical delivery. The two theories had diametrically opposed effects on the choice of delivery place. Women are more likely to decide on intraclinical delivery when arguments such as high medical standards, positive evaluations, or good advanced information are rated important. In contrast, women are more likely to decide on extraclinical delivery when factors such as family atmosphere during birth, friendliness of health care professionals, or consideration of the woman's interests are deemed important. A practical implication of our study is that intraclinical deliveries may be promoted by providing comprehensive information, data and facts on various delivery-related issues, while extraclinical deliveries may be fostered by healthcare
Neutron polarimetric test of Leggett's contextual model of quantum mechanics
International Nuclear Information System (INIS)
Schmitzer, C.; Bartosik, H.; Klepp, J.; Sponar, S.; Badurek, G.; Hasegawa, J.
2009-01-01
Full text: The Einstein-Podolsky-Rosen (EPR) argument attempted to dispute quantum theory. With the Bell inequality it was possible to set up an experimental test of the EPR argument. Here, we describe the rebuilding of the measurement station at the tangential beam exit of the TRIGA reactor of the Atominstitut in Vienna. A new polarimeter setup was constructed and adjusted to generate Bell states by entangling a neutron's energy and spin. After accomplishing visibilities of up to 98.7 %, it was possible to test a Leggett-type inequality, which challenges a 'contextual' hidden variable theory. Such a contextual model would have been capable of reproducing former Bell inequality violations. Measurement results of this Leggett inequality and a generalized Clauser-Horne-Shimony-Holt (CHSH) inequality show violations of this hidden variable model. Hence noncontextual and contextual hidden variable theories can be excluded simultaneously and quantum mechanical predictions are confirmed. (author)
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
A model of the precaution adoption process: evidence from home radon testing
International Nuclear Information System (INIS)
Weinstein, N.D.; Sandman, P.M.
1992-01-01
The authors present the precaution adoption process model: a stage theory consisting of seven distinct stages between ignorance and completed preventive action. The stages are 'unaware of the issue,' 'aware of the issue but not personally engaged,' 'engaged and deciding what to do,' 'planning to act but not yet having acted,' 'having decided not to act,' 'acting,' and 'maintenance.' The theory asserts that these stages represent qualitatively different patterns of behavior, beliefs, and experience, and that the factors that produce transitions between stages vary depending on the specific transition being considered. Data from seven studies of home radon testing are examined to test some of the claims made by this model. Stage theories of protective behavior are contrasted with theories that see precaution adoption in terms of movement along a single continuum of action likelihood. 32 references.
Malloch, Douglas C.; Michael, William B.
1981-01-01
This study was designed to determine whether an unweighted linear combination of community college students' scores on standardized achievement tests and a measure of motivational constructs derived from Vroom's expectancy theory model of motivation was predictive of academic success (grade point average earned during one quarter of an academic…
Energy Technology Data Exchange (ETDEWEB)
Cruz-Dombriz, Álvaro de la; Dunsby, Peter K.S.; Luongo, Orlando; Reverberi, Lorenzo, E-mail: alvaro.delacruzdombriz@uct.ac.za, E-mail: peter.dunsby@uct.ac.za, E-mail: luongo@na.infn.it, E-mail: lorenzo.reverberi@uct.ac.za [Department of Mathematics and Applied Mathematics, University of Cape Town, Rondebosch 7701, Cape Town (South Africa)
2016-12-01
The onset of dark energy domination depends on the particular gravitational theory driving the cosmic evolution. Model-independent techniques are crucial for testing both the present ΛCDM cosmological paradigm and alternative theories, making the least possible number of assumptions about the Universe. In this paper we investigate whether cosmography is able to distinguish between different gravitational theories, by determining bounds on model parameters for three different extensions of General Relativity, namely quintessence, f(T) and f(R) gravitational theories. We expand each class of theories in powers of redshift z around the present time, making no additional assumptions. This procedure is an extension of previous work and can be seen as the most general approach for testing extended theories of gravity through the use of cosmography. In the case of f(T) and f(R) theories, we show that some assumptions on model parameters often made in previous works are superfluous or even unjustified. We use data from the Union 2.1 supernovae catalogue, baryonic acoustic oscillation data and H(z) differential age compilations, which probe cosmology on different scales of the cosmological evolution. We perform a Monte Carlo analysis using a Metropolis-Hastings algorithm with a Gelman-Rubin convergence criterion, reporting 1-σ and 2-σ confidence levels. To do so, we perform two distinct fits, first assuming only data within z < 1 and then without limitations in redshift. We obtain the corresponding numerical intervals spanned by the coefficients, and find that the data are compatible with the ΛCDM limit of all three theories at the 1-σ level, while still compatible with quite a large portion of parameter space. We compare our results to the truncated ΛCDM paradigm, demonstrating that our bounds differ from the expectations of previous works, showing that the permitted regions of coefficients are significantly modified and in general widened with respect to
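A minimal Metropolis-Hastings sampler of the kind used in the analysis can be sketched as follows. The toy one-dimensional Gaussian "posterior" and the step size are assumptions for illustration only, not the paper's likelihood or parameter set.

```python
import numpy as np

# Minimal Metropolis-Hastings sketch on a toy Gaussian log-posterior.
rng = np.random.default_rng(42)

def log_post(q):
    """Toy log-posterior: a Gaussian centred at 1.0 with width 0.2."""
    return -0.5 * (q - 1.0) ** 2 / 0.2**2

def metropolis(n_steps, x0=0.0, step=0.3):
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        prop = x + step * rng.normal()       # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

chain = metropolis(20000)[5000:]        # discard burn-in
mean, sigma = chain.mean(), chain.std() # posterior summaries
```

In practice one runs several such chains and compares within-chain to between-chain variance (the Gelman-Rubin criterion mentioned in the abstract) before reporting confidence intervals.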
Hullett, Craig R
2006-01-01
This study tests the utility of the functional theory of attitudes and arousal of fear in motivating college students to get tested for HIV. It is argued from the perspective of functional theory that value-expressive appeals to get tested for the purpose of taking care of one's own health could be effective if that goal is desired by message targets who are sexually active and unaware of their sexually transmitted disease status. As part of the process, the effectiveness of these appeals is increased by the arousal of uncertainty and fear. A model detailing the mediating processes is proposed and found to be consistent with the data. Overall, messages advocating testing for the self-interested reason of one's own health were more effective than messages advocating testing for the goal of protecting one's partners.
Testing and inference in nonlinear cointegrating vector error correction models
DEFF Research Database (Denmark)
Kristensen, D.; Rahbek, A.
2013-01-01
We analyze estimators and tests for a general class of vector error correction models that allows for asymmetric and nonlinear error correction. For a given number of cointegration relationships, general hypothesis testing is considered, where testing for linearity is of particular interest. Under the null of linearity, parameters of nonlinear components vanish, leading to a nonstandard testing problem. We apply so-called sup-tests to resolve this issue, which requires development of new (uniform) functional central limit theory and results for convergence of stochastic integrals. We provide a full asymptotic theory for estimators and test statistics. The derived asymptotic results prove to be nonstandard compared to results found elsewhere in the literature due to the impact of the estimated cointegration relations. This complicates implementation of tests, motivating the introduction of bootstrap…
System Dynamics as Model-Based Theory Building
Schwaninger, Markus; Grösser, Stefan N.
2008-01-01
This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects which have focused on model-based theory building, specifically the construction of a mi...
Five roles for using theory and evidence in the design and testing of behavior change interventions.
Bartholomew, L Kay; Mullen, Patricia Dolan
2011-01-01
The prevailing wisdom in the field of health-related behavior change is that well-designed and effective interventions are guided by theory. Using the framework of intervention mapping, we describe and provide examples of how investigators can effectively select and use theory to design, test, and report interventions. We propose five roles for theory and evidence about theories: a) identification of behavior and determinants of behavior related to a specified health problem (i.e., the logic model of the problem); b) explication of a causal model that includes theoretical constructs for producing change in the behavior of interest (i.e., the logic model of change); c) selection of intervention methods and delivery of practical applications to achieve changes in health behavior; d) evaluation of the resulting intervention including theoretical mediating variables; and e) reporting of the active ingredients of the intervention together with the evaluation results. In problem-driven applied behavioral or social science, researchers use one or multiple theories, empiric evidence, and new research, both to assess a problem and to solve or prevent a problem. Furthermore, the theories for description of the problem may differ from the theories for its solution. In an applied approach, the main focus is on solving problems regarding health behavior change and improvement of health outcomes, and the criteria for success are formulated in terms of the problem rather than the theory. Resulting contributions to theory development may be quite useful, but they are peripheral to the problem-solving process.
Conklin, Amanda M.; Dahling, Jason J.; Garcia, Pablo A.
2013-01-01
The authors tested a model based on the satisfaction model of social cognitive career theory (SCCT) that links college students' affective commitment to their major (the emotional identification that students feel toward their area of study) with career decision self-efficacy (CDSE) and career outcome expectations. Results indicate that CDSE…
Dual-Process Theories of Reasoning: The Test of Development
Barrouillet, Pierre
2011-01-01
Dual-process theories have become increasingly influential in the psychology of reasoning. Though the distinction they introduced between intuitive and reflective thinking should have strong developmental implications, the developmental approach has rarely been used to refine or test these theories. In this article, I review several contemporary…
Arantes, Joana; Machado, Armando
2008-07-01
Pigeons were trained on two temporal bisection tasks, which alternated every two sessions. In the first task, they learned to choose a red key after a 1-s signal and a green key after a 4-s signal; in the second task, they learned to choose a blue key after a 4-s signal and a yellow key after a 16-s signal. Then the pigeons were exposed to a series of test trials in order to contrast two timing models, Learning-to-Time (LeT) and Scalar Expectancy Theory (SET). The models made substantially different predictions particularly for the test trials in which the sample duration ranged from 1 s to 16 s and the choice keys were Green and Blue, the keys associated with the same 4-s samples: LeT predicted that preference for Green should increase with sample duration, a context effect, but SET predicted that preference for Green should not vary with sample duration. The results were consistent with LeT. The present study adds to the literature the finding that the context effect occurs even when the two basic discriminations are never combined in the same session.
Proposed experimental test of the theory of hole superconductivity
Energy Technology Data Exchange (ETDEWEB)
Hirsch, J.E., E-mail: jhirsch@ucsd.edu
2016-06-15
Highlights:
• The conventional theory of superconductivity predicts no charge flow when the normal-superconductor phase boundary moves.
• The theory of hole superconductivity predicts flow and counterflow of charge.
• An experiment to measure a voltage is proposed.
• No voltage will be measured if the conventional theory is correct.
• A voltage will be measured if the theory of hole superconductivity is correct.
Abstract: The theory of hole superconductivity predicts that in the reversible transition between normal and superconducting phases in the presence of a magnetic field there is charge flow in the direction perpendicular to the normal-superconductor phase boundary. In contrast, the conventional BCS-London theory of superconductivity predicts no such charge flow. Here we discuss an experiment to test these predictions.
Test theory of special relativity: What it is and why we need it
International Nuclear Information System (INIS)
Mansouri, R.
1988-03-01
After a critical overview of the traditional way of expressing the accuracy of experiments testing the postulates of the special theory of relativity, the four-parameter test theory is briefly introduced. The existing experiments are then classified and their accuracies are expressed in terms of the parameters of the test theory. By changing the convention for the synchronization of distant clocks, it is shown how different equivalent theories can be formulated. (author). 23 refs
International Nuclear Information System (INIS)
Schlingemann, D.
1996-10-01
Several two-dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon and the φ⁴₂-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which sufficient conditions must a pair of vacuum states fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)₂-models. We identify a large class of vacuum states, including the vacua of the P(φ)₂-models, the Yukawa₂-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)
Penningroth, Suzanna L.; Scott, Walter D.
2012-01-01
Two prominent theories of lifespan development, socioemotional selectivity theory and selection, optimization, and compensation theory, make similar predictions for differences in the goal representations of younger and older adults. Our purpose was to test whether the goals of younger and older adults differed in ways predicted by these two…
Klaassen, Ger; Nentjes, Andries; Smith, Mark
2005-01-01
Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon emitting industrialized regions. Two experiments use auctions, the first a
Morrison, Diane M; Golder, Seana; Keller, Thomas E; Gillmore, Mary Rogers
2002-09-01
The theory of reasoned action (TRA) is used to model decisions about substance use among young mothers who became premaritally pregnant at age 17 or younger. The results of structural equation modeling to test the TRA indicated that most relationships specified by the model were significant and in the predicted direction. Attitude was a stronger predictor of intention than norm, but both were significantly related to intention, and intention was related to actual marijuana use 6 months later. Outcome beliefs were bidimensional, and positive outcome beliefs, but not negative beliefs, were significantly related to attitude. Prior marijuana use was only partially mediated by the TRA variables; it also was directly related to intentions to use marijuana and to subsequent use.
Stamovlasis, Dimitrios; Tsaparlis, Georgios
2012-01-01
In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
International Nuclear Information System (INIS)
Wang, Lin; Liu, Xiongwei; Renevier, Nathalie; Stables, Matthew; Hall, George M.
2014-01-01
Due to the increasing size and flexibility of large wind turbine blades, accurate and reliable aeroelastic modelling plays an important role in the design of large wind turbines. Most existing aeroelastic models are linear models based on the assumption of small blade deflections. This assumption is no longer valid for very flexible blade designs, because such blades often experience large deflections. In this paper, a novel nonlinear aeroelastic model for large wind turbine blades has been developed by combining BEM (blade element momentum) theory and the mixed-form formulation of GEBT (geometrically exact beam theory). The nonlinear aeroelastic model takes account of large blade deflections and thus greatly improves the accuracy of aeroelastic analysis of wind turbine blades. The nonlinear aeroelastic model is implemented in COMSOL Multiphysics and validated with a series of benchmark calculation tests. The results show that good agreement is achieved when compared with experimental data, and its capability of handling large deflections is demonstrated. Finally, the nonlinear aeroelastic model is applied to aeroelastic modelling of the parked WindPACT 1.5 MW baseline wind turbine, and reduced flapwise deflection from the nonlinear aeroelastic model is observed compared to the linear aeroelastic code FAST (Fatigue, Aerodynamics, Structures, and Turbulence).
Highlights:
• A novel nonlinear aeroelastic model for wind turbine blades is developed.
• The model takes account of large blade deflections and geometric nonlinearities.
• The model is reliable and efficient for aeroelastic modelling of wind turbine blades.
• The accuracy of the model is verified by a series of benchmark calculation tests.
• The model provides more realistic aeroelastic modelling than FAST (Fatigue, Aerodynamics, Structures, and Turbulence).
Analysis of North Korea's Nuclear Tests under Prospect Theory
International Nuclear Information System (INIS)
Lee, Han Myung; Ryu, Jae Soo; Lee, Kwang Seok; Lee, Dong Hoon; Jun, Eunju; Kim, Mi Jin
2013-01-01
North Korea has chosen nuclear weapons as the means to protect its sovereignty. Despite international society's endeavors and sanctions to encourage North Korea to abandon its nuclear ambition, North Korea has repeatedly conducted nuclear testing. In this paper, the reason for North Korea's addiction to a nuclear arsenal is addressed within the framework of cognitive psychology. Prospect theory offers an epistemological approach usually overlooked in rational choice theories. It provides useful implications for why North Korea, under a crisis situation, has thrown out a stable choice and taken on a risky one such as nuclear testing. From the viewpoint of prospect theory, the nuclear tests by North Korea can be understood as follows: the first nuclear test in 2006 is seen as a trial to escape from loss areas such as financial sanctions and regime threats; the second test in 2009 is interpreted as a consequence of the strategy to recover losses by making a direct confrontation against the United States; and the third test in 2013 is understood as an attempt to strengthen internal solidarity after Kim Jong-eun inherited the dynasty, as well as to enhance bargaining power against the United States. Thus, it can be summarized that Pyongyang repeated its nuclear tests to escape from a negative domain and to settle into a positive one. In addition, in the future, North Korea may not be willing to readily give up its nuclear capabilities to ensure the survival of its own regime.
Testing simulation and structural models with applications to energy demand
Wolff, Hendrik
2007-12-01
This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand and examples illustrate the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis theory restricts the shape as well as other characteristics of functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity preserving point estimates, (c) to avoid biases existent in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory to a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, this is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters used in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations. Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
Non-linear σ-models and string theories
International Nuclear Information System (INIS)
Sen, A.
1986-10-01
The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs
Validity Theory: Reform Policies, Accountability Testing, and Consequences
Chalhoub-Deville, Micheline
2016-01-01
Educational policies such as Race to the Top in the USA affirm a central role for testing systems in government-driven reform efforts. Such reform policies are often referred to as the global education reform movement (GERM). Changes observed with the GERM style of testing demand socially engaged validity theories that include consequential…
Standard Model theory calculations and experimental tests
International Nuclear Information System (INIS)
Cacciari, M.; Hamel de Monchenault, G.
2015-01-01
To present knowledge, all the physics at the Large Hadron Collider (LHC) can be described in the framework of the Standard Model (SM) of particle physics. Indeed, the newly discovered Higgs boson with a mass close to 125 GeV seems to confirm the predictions of the SM. Thus, besides looking for direct manifestations of physics beyond the SM, one of the primary missions of the LHC is to perform ever more stringent tests of the SM. This requires not only improved theoretical developments to produce testable predictions and provide experiments with reliable event generators, but also sophisticated analysis techniques to overcome the formidable experimental environment of the LHC and perform precision measurements. In the first section, we describe the state of the art of the theoretical tools and event generators that are used to provide predictions for the production cross sections of the processes of interest. In section 2, inclusive cross section measurements with jets, leptons and vector bosons are presented. Examples of differential cross sections, charge asymmetries and the study of lepton pairs are given in section 3. Finally, in section 4, we report studies on the multiple production of gauge bosons and constraints on anomalous gauge couplings.
Bayes Factor Covariance Testing in Item Response Models.
Fox, Jean-Paul; Mulder, Joris; Sinharay, Sandip
2017-12-01
Two marginal one-parameter item response theory models are introduced, by integrating out the latent variable or random item parameter. It is shown that both marginal response models are multivariate (probit) models with a compound symmetry covariance structure. Several common hypotheses concerning the underlying covariance structure are evaluated using (fractional) Bayes factor tests. The support for a unidimensional factor (i.e., assumption of local independence) and differential item functioning are evaluated by testing the covariance components. The posterior distribution of common covariance components is obtained in closed form by transforming latent responses with an orthogonal (Helmert) matrix. This posterior distribution is defined as a shifted-inverse-gamma, thereby introducing a default prior and a balanced prior distribution. Based on that, an MCMC algorithm is described to estimate all model parameters and to compute (fractional) Bayes factor tests. Simulation studies are used to show that the (fractional) Bayes factor tests have good properties for testing the underlying covariance structure of binary response data. The method is illustrated with two real data studies.
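The compound symmetry covariance structure and Helmert transformation mentioned above can be illustrated directly. This is a minimal sketch with made-up variance components, not the paper's estimation procedure: it only shows why an orthogonal Helmert rotation diagonalizes a compound symmetry matrix, which is what makes a closed-form posterior for the covariance components possible.

```python
import numpy as np

def compound_symmetry(k, sigma2, tau2):
    """k x k covariance with sigma2 + tau2 on the diagonal and a
    shared covariance tau2 off the diagonal."""
    return sigma2 * np.eye(k) + tau2 * np.ones((k, k))

def helmert(k):
    """Orthonormal Helmert matrix: first row is the constant vector,
    remaining rows are contrasts orthogonal to it."""
    H = np.zeros((k, k))
    H[0] = 1.0 / np.sqrt(k)
    for i in range(1, k):
        H[i, :i] = 1.0 / np.sqrt(i * (i + 1))
        H[i, i] = -i / np.sqrt(i * (i + 1))
    return H

S = compound_symmetry(5, sigma2=1.0, tau2=0.3)
H = helmert(5)
D = H @ S @ H.T  # rotated covariance: diagonal if S is compound symmetric
off_diagonal = D - np.diag(np.diag(D))
print(np.allclose(off_diagonal, 0))  # True: the Helmert rotation decorrelates
```

After the rotation, the first diagonal entry carries sigma2 + k * tau2 and the rest carry sigma2, so the covariance components can be handled independently.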
Generalizability Theory and Classical Test Theory
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
Lectures on algebraic model theory
Hart, Bradd
2001-01-01
In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.
Threshold Theory Tested in an Organizational Setting
DEFF Research Database (Denmark)
Christensen, Bo T.; Hartmann, Peter V. W.; Hedegaard Rasmussen, Thomas
2017-01-01
A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no correlation. Support for the threshold theory of creativity was found, in that the correlation between IQ and innovativeness was positive and significant below a cutoff point of IQ 120. Above the cutoff, no significant relation was identified, and the two correlations differed significantly. The finding was stable across distinct parts of the sample, providing support for the theory, although the correlations in all subsamples were small. The findings lend support to the existence of threshold effects using perceptual measures of behavior in real…
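The threshold test described above amounts to estimating the IQ-innovativeness correlation separately below and above the cutoff and comparing the two. A minimal sketch on synthetic data follows; the cutoff of 120 matches the abstract, but the sample, the effect sizes and the Fisher z-comparison are illustrative assumptions, not the study's data or exact method.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
iq = rng.normal(100, 15, n)
# Simulate a threshold effect: innovativeness tracks IQ only below 120.
innov = np.where(iq < 120, 0.04 * iq, 0.04 * 120) + rng.normal(0, 1, n)

below, above = iq < 120, iq >= 120
r_below = np.corrcoef(iq[below], innov[below])[0, 1]
r_above = np.corrcoef(iq[above], innov[above])[0, 1]

def fisher_z_diff(r1, n1, r2, n2):
    """Fisher z-test statistic for the difference between two
    independent correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se

z = fisher_z_diff(r_below, below.sum(), r_above, above.sum())
print(r_below, r_above, z)  # positive correlation below the cutoff, ~zero above
```

A significant z for the difference, together with a non-significant correlation above the cutoff, is the pattern the threshold theory predicts.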
Internal Universes in Models of Homotopy Type Theory
DEFF Research Database (Denmark)
Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.
2018-01-01
We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo… that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Testing ‘cultural reproduction theory’ against relative risk aversion theory – some remarks
DEFF Research Database (Denmark)
Munk, Martin David; Jakobsen, Anders
2015-01-01
The aim of this research note is to discuss inherent limitations in certain established, but problematic, conventions for operationalizing and testing Pierre Bourdieu's theory of cultural reproduction. These conventions entail a selective focus on the concept of capital at the expense of the concept of habitus. Our point is that blinding out the important concept of habitus amputates the theory, and that a test built upon this limitation is not a test of Bourdieu's theory as a whole, but rather a straw man construction ignoring important parts of the theory. This has strong implications when seeking to test statistically the viability of Bourdieu's theory, particularly vis-a-vis rational choice alternatives, and especially where these limitations are not adequately reflected in the interpretation of results and in conclusions.
Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model
International Nuclear Information System (INIS)
Szabo, Richard J; Tierz, Miguel
2010-01-01
We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S². We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S³, and hence to q-deformed Yang-Mills theory on S². In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
A test of the theory of nonrenewable resources. Controlling for exploration and market power
Energy Technology Data Exchange (ETDEWEB)
Malischek, Raimund [Koeln Univ. (Germany). Inst. of Energy Economics; Tode, Christian [Koeln Univ. (Germany). Inst. of Energy Economics; Koeln Univ. (Germany). Dept. of Economics
2015-05-15
Despite the central role of the Hotelling model within the theory of nonrenewable resources, tests of the model are rarely found. If existent, these tests tend to ignore two key features, namely market power and exploration. We therefore suggest an extension of the basic Hotelling framework to incorporate exploration activity and market power and propose an implicit price behavior test of the model to indicate whether firms undergo inter-temporal optimization. When applied to a newly constructed data set for the uranium mining industry, the null hypothesis of the firm optimizing inter-temporally is rejected in all settings. However, parameter estimates of the model still yield valuable information on cost structure, resource scarcity and market power. Our results suggest that the shadow price of the resource in situ is comparably small and may be overshadowed by market power, which may serve as an explanation for the firm failing to optimize inter-temporally.
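The inter-temporal optimization at the heart of the Hotelling model can be stated in one line: under a constant marginal extraction cost c and discount rate r, the scarcity rent (price minus cost) grows at rate r. A minimal sketch with illustrative numbers, not the paper's uranium data:

```python
import numpy as np

def hotelling_price(p0, c, r, t):
    """Hotelling rule with constant marginal cost c: the scarcity rent
    (p - c) grows at the discount rate r, so p_t = c + (p0 - c)*(1 + r)**t."""
    return c + (p0 - c) * (1.0 + r) ** t

years = np.arange(10)
prices = hotelling_price(p0=50.0, c=30.0, r=0.05, t=years)
rents = prices - 30.0
growth = rents[1:] / rents[:-1] - 1.0
print(np.allclose(growth, 0.05))  # rent grows at exactly the discount rate
```

A price-behavior test of the kind proposed in the abstract checks whether observed rents are consistent with this growth path; rejecting it, as the authors do, indicates that firms are not optimizing inter-temporally in the basic sense of the model.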
A test of the theory of nonrenewable resources. Controlling for exploration and market power
International Nuclear Information System (INIS)
Malischek, Raimund; Tode, Christian; Koeln Univ.
2015-01-01
Despite the central role of the Hotelling model within the theory of nonrenewable resources, tests of the model are rarely found. If existent, these tests tend to ignore two key features, namely market power and exploration. We therefore suggest an extension of the basic Hotelling framework to incorporate exploration activity and market power and propose an implicit price behavior test of the model to indicate whether firms undergo inter-temporal optimization. When applied to a newly constructed data set for the uranium mining industry, the null hypothesis of the firm optimizing inter-temporally is rejected in all settings. However, parameter estimates of the model still yield valuable information on cost structure, resource scarcity and market power. Our results suggest that the shadow price of the resource in situ is comparably small and may be overshadowed by market power, which may serve as an explanation for the firm failing to optimize inter-temporally.
Halo modelling in chameleon theories
Energy Technology Data Exchange (ETDEWEB)
Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)
2014-03-01
We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
Halo modelling in chameleon theories
International Nuclear Information System (INIS)
Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu
2014-01-01
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the
Application of Item Response Theory to Tests of Substance-related Associative Memory
Shono, Yusuke; Grenard, Jerry L.; Ames, Susan L.; Stacy, Alan W.
2015-01-01
A substance-related word association test (WAT) is one of the commonly used indirect tests of substance-related implicit associative memory and has been shown to predict substance use. This study applied an item response theory (IRT) modeling approach to evaluate psychometric properties of the alcohol- and marijuana-related WATs and their items among 775 ethnically diverse at-risk adolescents. After examining the IRT assumptions, item fit, and differential item functioning (DIF) across gender and age groups, the original 18 WAT items were reduced to 14 and 15 items in the alcohol- and marijuana-related WATs, respectively. Thereafter, unidimensional one- and two-parameter logistic models (1PL and 2PL models) were fitted to the revised WAT items. The results demonstrated that both alcohol- and marijuana-related WATs have good psychometric properties. These results were discussed in light of the framework of a unified concept of construct validity (Messick, 1975, 1989, 1995). PMID:25134051
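For reference, the 1PL and 2PL models fitted in the study have a standard logistic form; a minimal sketch (a generic textbook formulation, not the authors' code; parameter names are conventional):

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic (2PL) model: probability of a correct (or
    endorsed) response for ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def irt_1pl(theta, b):
    """One-parameter logistic (Rasch-type) model: the 2PL with a fixed at 1."""
    return irt_2pl(theta, 1.0, b)
```

At theta = b the response probability is exactly 0.5, which is what makes b interpretable as item difficulty.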
Narrative theories as computational models: reader-oriented theory and artificial intelligence
Energy Technology Data Exchange (ETDEWEB)
Galloway, P.
1983-12-01
In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory share many common interests and problems and both studies might benefit from an exchange of ideas. 11 references.
Directory of Open Access Journals (Sweden)
Alexandra Isabel Cabral da Silva Gomes
2018-01-01
Our goal was to contribute to the prediction of condom use using socio-cognitive models, comparing classic theories to an extended model. A cross-sectional study was conducted using a questionnaire of self-reported measures. Of the students who agreed to participate, 140 were eligible for the full study. A confirmatory analysis was used to assess the predictive value of the researched model. The model tested had slightly better fit indexes and predictive value than the classic Theories of Reasoned Action and Planned Behaviour. Despite these results, discussion continues on the gap between intention and behaviour, and further investigation is necessary to fully understand the reasons for inconsistent condom use.
Situated learning theory: adding rate and complexity effects via Kauffman's NK model.
Yuan, Yu; McKelvey, Bill
2004-01-01
For many firms, producing information, knowledge, and enhancing learning capability have become the primary basis of competitive advantage. A review of organizational learning theory identifies two approaches: (1) those that treat symbolic information processing as fundamental to learning, and (2) those that view the situated nature of cognition as fundamental. After noting that the former is inadequate because it focuses primarily on behavioral and cognitive aspects of individual learning, this paper argues the importance of studying learning as interactions among people in the context of their environment. It contributes to organizational learning in three ways. First, it argues that situated learning theory is to be preferred over traditional behavioral and cognitive learning theories, because it treats organizations as complex adaptive systems rather than mere information processors. Second, it adds rate and nonlinear learning effects. Third, following model-centered epistemology, it uses an agent-based computational model, in particular a "humanized" version of Kauffman's NK model, to study the situated nature of learning. Using simulation results, we test eight hypotheses extending situated learning theory in new directions. The paper ends with a discussion of possible extensions of the current study to better address key issues in situated learning.
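Kauffman's NK model underlying the paper's simulations can be sketched compactly: each of N loci contributes a random fitness value that depends on its own state and the states of K neighbours. This is an illustrative toy (cyclic neighbourhoods and seeding are assumptions here, not the authors' "humanized" variant):

```python
import random

def make_contributions(n, k, rng):
    """Draw a random fitness table: one value per locus per neighbourhood state."""
    tables = []
    for _ in range(n):
        table = {}
        for state in range(2 ** (k + 1)):
            bits = tuple((state >> j) & 1 for j in range(k + 1))
            table[bits] = rng.random()
        tables.append(table)
    return tables

def nk_fitness(genome, contributions, k):
    """Average the contribution of each locus; locus i's contribution depends
    on its own state and the states of its K neighbours (next K loci, cyclic)."""
    n = len(genome)
    total = 0.0
    for i in range(n):
        neighbourhood = tuple(genome[(i + j) % n] for j in range(k + 1))
        total += contributions[i][neighbourhood]
    return total / n
```

Increasing K strengthens the epistatic coupling between loci, making the fitness landscape more rugged, which is how the model introduces the rate and complexity effects discussed above.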
Liou, Shwu-Ru
2009-01-01
To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.
Generalized algebra-valued models of set theory
Löwe, B.; Tarafder, S.
2015-01-01
We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.
van der Linden, Willem J.
1995-01-01
Dichotomous item response theory (IRT) models can be viewed as families of stochastically ordered distributions of responses to test items. This paper explores several properties of such distributions. The focus is on the conditions under which stochastic order in families of conditional
Modelling welded material for ultrasonic testing using MINA: Theory and applications
Moysan, J.; Corneloup, G.; Chassignole, B.; Gueudré, C.; Ploix, M. A.
2012-05-01
Austenitic steel multi-pass welds exhibit a heterogeneous and anisotropic structure that causes difficulties in ultrasonic testing. Increasing the material knowledge is a long-term research field for the LCND laboratory and EDF Les Renardières in France. A specific model has been developed: the MINA model (Modelling anIsotropy from Notebook of Arc welding). Welded material is described in 2D for flat-position arc welding with shielded electrode (SMAW) at a functional scale for UT modelling. The grain growth is the result of three physical phenomena: epitaxial growth, influence of the temperature gradient, and competition between the grains. The model uses phenomenological rules to combine these three phenomena. A limited number of parameters is used to make the modelling possible from the information written down in a notebook of arc welding. We present all these principles with 10 years' hindsight. To illustrate the model's use, we present conclusions obtained with two recent applications. In conclusion, we also give insights into other research topics around this model: the inverse problem using an F.E.M. code simulating ultrasonic propagation, in-position welding, 3D prospects, and GTAW.
Graphical Model Theory for Wireless Sensor Networks
International Nuclear Information System (INIS)
Davis, William B.
2002-01-01
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth to illustrate the junction tree algorithm.
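The junction tree algorithm reduces to simple message passing on a chain-structured model; a minimal sketch for binary variables a -> b -> c (a toy illustration of the decentralized-inference idea, not the paper's sensor models):

```python
def eliminate_chain(p_a, p_b_given_a, p_c_given_b):
    """Compute the marginal p(c) on the chain a -> b -> c by summing out one
    variable at a time (the chain special case of junction-tree message passing)."""
    # message from a to b: p(b) = sum_a p(a) * p(b|a)
    p_b = {b: sum(p_a[a] * p_b_given_a[a][b] for a in p_a) for b in (0, 1)}
    # message from b to c: p(c) = sum_b p(b) * p(c|b)
    return {c: sum(p_b[b] * p_c_given_b[b][c] for b in p_b) for c in (0, 1)}
```

Each "message" needs only the tables held at one node, which is the decentralization property the abstract emphasizes.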
Seigniorage Revenue and Inflation Tax: Testing Optimal Seigniorage Theory for Turkish Economy
dogru, bulent
2013-01-01
The goal of this study is to test the implication of Mankiw's (1987) optimal seigniorage theory, which suggests that in the long run higher tax rates are associated with higher inflation rates and higher nominal interest rates, for the Turkish economy, using a time series dataset for the period 1980-2011. We examine the long-run relationship between nominal interest rates, inflation, and tax revenue. For this purpose, we estimate Mankiw's optimal seigniorage model for the Turkish economy with t...
'Distorted structure modelling' - a more physical approach to Rapid Distortion Theory
International Nuclear Information System (INIS)
Savill, A.M.
1979-11-01
Rapid Distortion Theory is reviewed in the light of the modern mechanistic approach to turbulent motion. The apparent failure of current models, based on this theory, to predict stress intensity ratios accurately in distorted shear flows is attributed to their oversimplistic assumptions concerning the inherent turbulence structure of such flows. A more realistic picture of this structure and the manner in which it responds to distortion is presented in terms of interactions between the mean flow and three principal types of eddies. If Rapid Distortion Theory is modified to account for this it is shown that the stress intensity ratios can be accurately predicted in three test flows. It is concluded that a computational scheme based on Rapid Distortion Theory might ultimately be capable of predicting turbulence parameters in the highly complex geometries of reactor cooling systems. (author)
Yield surface investigation of alloys during model disk spin tests
Directory of Open Access Journals (Sweden)
E. P. Kuzmin
2014-01-01
Gas-turbine engines operate under heavy, subsequently static loading conditions. Disks of a gas-turbine engine are highly loaded parts of irregular shape with intensive stress concentrators, wherein a 3D stress-strain state occurs. The loss of load-carrying capability or burst of a disk can lead to a severe accident or disaster. Therefore, the development of methods to assess deformations and to predict burst is one of the most important problems. Strength assessment approaches are used at all levels of engine creation. In recent years, due to actively developing numerical methods, particularly FEA, it became possible to investigate the load-carrying capability of irregular-shape disks and to use 3D computing schemes, including flow theory and different options of force and deformation failure criteria. In spite of wide progress and practical use of strength assessment approaches, there is a lack of detailed research data on the yield surface of disk alloys. The main purpose of this work is to validate the use of the basic hypotheses of flow theory and to investigate the yield surface of disk alloys during disk spin tests. The results of quasi-static numerical simulation of spin tests of a model disk made from a high-temperature forged alloy are presented. Finite element analysis is used to determine the stress-strain state of the disk during loading. Simulation of elastic-plastic strain fields was carried out using the incremental theory of plasticity with isotropic hardening. The hardening function was taken from the results of specimen tensile tests. Specimens were cut from the sinkhead of the model disk. The paper investigates the model sensitivity to the von Mises and Tresca yield criteria, as well as the Hosford model. To identify the material model parameters, eddy current sensors were used in the experimental approach to measure rim radial displacements during the load-unload cycle of the spin test. The results of calculation made using different material models were compared with the
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
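The core Bayesian learning mechanism described above can be shown in a few lines: an update over competing causal hypotheses given observed outcomes (the hypothesis names and probabilities here are invented purely for illustration, in the spirit of blicket-detector experiments):

```python
def posterior(prior, likelihoods, data):
    """Bayes update over competing causal hypotheses: multiply each prior by
    the likelihood of the observed data sequence, then renormalize."""
    unnorm = {}
    for h, p in prior.items():
        like = 1.0
        for d in data:
            like *= likelihoods[h][d]
        unnorm[h] = p * like
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}
```

With a "strong cause" hypothesis (activation probability 0.9) against a "weak cause" one (0.1), three observed activations already push nearly all posterior mass onto the strong-cause hypothesis.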
Lockman, Jennifer D; Servaty-Seib, Heather L
2018-04-01
There is a lack of empirically supported theories explaining suicidal ideation and few theories describe how suicidal ideation can be prevented in the context of normative human development. Rogers (2001) proposed an existential constructivist theory of suicide (ECTS) wherein existential distress and the inability to reconstruct meaning from adverse life events contribute to suicidal ideation. The ECTS includes a distinct focus on meaning reconstruction from adverse life events, which is congruent with existing research on college students and developmental frameworks used by counseling psychologists. Thus, in the present study, we tested the predictions of the ECTS in a college student sample. We collected data online from 195 college students (i.e., ages 18-25) attending a large, Midwestern university and analyzed the data using structural equation modeling. Findings provided partial support for the original ECTS. Post hoc analyses of an alternate ECTS model indicated that existential distress mediated the negative association between meaning reconstruction and suicidal ideation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Towards a Theory for Testing Non-terminating Programs
DEFF Research Database (Denmark)
Gotlieb, Arnaud; Petit, Matthieu
2009-01-01
Non-terminating programs are programs that legally perform unbounded computations. Though they are ubiquitous in real-world applications, testing these programs requires new theoretic developments, as usual definitions of test data adequacy criteria ignore infinite paths. This paper develops a theory of program-based structural testing based on operational semantics. Reasoning at the program semantics level permits coping with infinite paths (and non-feasible paths) when defining test data adequacy criteria. As a result, our criteria respect Weyuker's first property of finite applicability, even for non-terminating programs. We discuss the consequences of this re-interpretation of test data adequacy criteria with respect to existing test coverage criteria.
Vibration tests and analyses of the reactor building model on a small scale
International Nuclear Information System (INIS)
Tsuchiya, Hideo; Tanaka, Mitsuru; Ogihara, Yukio; Moriyama, Ken-ichi; Nakayama, Masaaki
1985-01-01
The purpose of this paper is to describe the vibration tests and the simulation analyses of a reactor building model on a small scale. The model vibration tests were performed to investigate the vibrational characteristics of the combined super-structure and to verify the computer code based on Dr. H. Tajimi's Thin Layered Element Theory, using a uniaxial shaking table (60 cm x 60 cm). The specimens consist of a ground model, three structural models (prestressed concrete containment vessel, inner concrete structure, and enclosure building), a combined structural model, and a combined structure-soil interaction model. These models are made of silicon rubber and have a scale of 1:600. Harmonic step-by-step excitation of 40 gals was performed to investigate the vibrational characteristics of each structural model. The responses of the specimens to harmonic excitation were measured by optical displacement meters and analyzed by a real-time spectrum analyzer. The resonance and phase-lag curves of the specimens relative to the shaking table were obtained. As for the tests of the combined structure-soil interaction model, three predominant frequencies were observed in the resonance curves. These values were in good agreement with the analytical transfer function curves from the computer code. From the vibration tests and the simulation analyses, it is concluded that the silicon-rubber model test is useful for the fundamental study of structural problems, and that the computer code based on the Thin Layered Element Theory can simulate the test results well. (Kobozono, M.)
Testing the Grandchildren's Received Affection Scale using Affection Exchange Theory.
Mansson, Daniel H
2013-04-01
The purpose of this study was to test the Grandchildren's Received Affection Scale (GRAS) using Affection Exchange Theory (Floyd, 2006). In accordance with Affection Exchange Theory, it was hypothesized that grandchildren's scores on the Trait Affection Received Scale (i.e., the extent to which individuals by nature receive affection) would be related significantly and positively to their reports of received affection from their grandparents (i.e., their scores on the GRAS). Additionally, a research question was asked to explore whether grandchildren's received affection from their grandparents is dependent on their grandparents' biological sex or lineage (i.e., maternal vs. paternal). Thus, young adult grandchildren (N = 422) completed the GRAS and the Trait Affection Received Scale. The results of zero-order Pearson correlational analyses provided support for the hypothesis, whereas the results of MANOVA tests only partially support extant grandparent-grandchild theory and research. These findings broaden the scope of Affection Exchange Theory and also bolster the GRAS's utility in future grandparent-grandchild affectionate communication research.
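The zero-order Pearson correlations reported in the study are the plain bivariate coefficient; a minimal sketch (standard formula, with made-up illustrative data in the test, not the study's scale scores):

```python
import math

def pearson_r(xs, ys):
    """Zero-order Pearson correlation between two equal-length samples:
    covariance divided by the product of the standard deviations."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

"Zero-order" simply means no other variables are partialled out of the relationship.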
Duane, Barbara T; Satre, Maria E
2014-01-01
In nursing education, students participate in individual learner testing. This process follows the instructionist learning theory of a system model. However, in the practice of nursing, success depends upon collaboration with numerous people in different capacities, critical thinking, clinical reasoning, and the ability to communicate with others. Research has shown that collaborative testing, a constructivist learning activity and a form of collaborative learning, enhances students' abilities to master these areas. Collaborative testing is a clear, creative strategy which constructivists would say supports the socio-linguistic base of their learning theory. The test becomes an active implementation of peer-mediated learning, where individual knowledge is enhanced through problem solving or defense of an individual position. The method has been criticized, however, for its potential for grade inflation and for allowing students to receive grade benefits with little effort. After a review of various collaborative testing methods, this nursing faculty implemented a collaborative testing format that addresses both the positive and negative aspects of the process. Copyright © 2013 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Cooper, F.
1996-01-01
We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations.
Experimental test of the renormalizability consequences of the standard electroweak theory
International Nuclear Information System (INIS)
Bardin, D.Yu.
1984-01-01
The present status of one-loop radiative correction calculations in the standard electroweak theory is discussed. The possibilities of experimental tests of the higher-order predictions of the standard theory are analysed in view of recent data, including CERN p anti-p collider data on M_W and M_Z measurements. The prospects for these tests in near-future experiments are discussed.
SIERO, FW; DOOSJE, BJ
1993-01-01
An experiment was conducted to examine the influence of the perceived extremity of a message and motivation to elaborate upon the process of persuasion. The first goal was to test a model of attitude change relating Social Judgment Theory to the Elaboration Likelihood Model. The second objective was
Decorated tensor network renormalization for lattice gauge theories and spin foam models
International Nuclear Information System (INIS)
Dittrich, Bianca; Mizera, Sebastian; Steinhaus, Sebastian
2016-01-01
Tensor network techniques have proved to be powerful tools that can be employed to explore the large-scale dynamics of lattice systems. Nonetheless, the redundancy of degrees of freedom in lattice gauge theories (and related models) poses a challenge for standard tensor network algorithms. We accommodate such systems by introducing an additional structure decorating the tensor network. This allows us to explicitly preserve the gauge symmetry of the system under coarse graining and to straightforwardly interpret the fixed point tensors. We propose and test (for models with finite Abelian groups) a coarse graining algorithm for lattice gauge theories based on decorated tensor networks. We also point out that decorated tensor networks are applicable to other models as well, where they provide the advantage of giving immediate access to certain expectation values and correlation functions. (paper)
Decorated tensor network renormalization for lattice gauge theories and spin foam models
Dittrich, Bianca; Mizera, Sebastian; Steinhaus, Sebastian
2016-05-01
A Test of Durkheim's Theory of Suicide in Primitive Societies.
Lester, David
1992-01-01
Classified primitive societies as high, moderate, or low on independent measures of social integration and social regulation to test Durkheim's theory of suicide. Estimated frequency of suicide did not differ between those societies predicted to have high, moderate, and low suicide rates. Durkheim's theory was not confirmed. (Author/NB)
M-Theory Model-Building and Proton Stability
Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.
1998-01-01
We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
M-theory model-building and proton stability
International Nuclear Information System (INIS)
Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens
1997-09-01
The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
Extended Nambu models: Their relation to gauge theories
Escobar, C. A.; Urrutia, L. F.
2017-05-01
Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.
Supersymmetry and String Theory: Beyond the Standard Model
International Nuclear Information System (INIS)
Rocek, Martin
2007-01-01
When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)
Development of a theory of implementation and integration: Normalization Process Theory
Directory of Open Access Journals (Sweden)
May Carl R
2009-05-01
Full Text Available Abstract Background Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
Item Response Theory Analyses of the Cambridge Face Memory Test (CFMT)
Cho, Sun-Joo; Wilmer, Jeremy; Herzmann, Grit; McGugin, Rankin; Fiset, Daniel; Van Gulick, Ana E.; Ryan, Katie; Gauthier, Isabel
2014-01-01
We evaluated the psychometric properties of the Cambridge face memory test (CFMT; Duchaine & Nakayama, 2006). First, we assessed the dimensionality of the test with a bi-factor exploratory factor analysis (EFA). This analysis revealed a general factor and three specific factors clustered by targets of the CFMT. However, the three specific factors appeared to be minor factors that can be ignored. Second, we fit a unidimensional item response model. This model showed that the CFMT items could discriminate individuals at different ability levels and covered a wide range of the ability continuum, making the test particularly precise across that range. Third, we implemented item response theory (IRT) differential item functioning (DIF) analyses for each gender group and two age groups (Age ≤ 20 versus Age > 21). This DIF analysis suggested little evidence of consequential differential functioning on the CFMT for these groups, supporting the use of the test to compare older to younger, or male to female, individuals. Fourth, we tested for a gender difference on the latent face recognition ability with an explanatory item response model. We found a significant but small gender difference on the latent ability for face recognition, which was higher for women than men by 0.184, at the mean age of 23.2 years, controlling for linear and quadratic age effects. Finally, we discuss the practical considerations of the use of total scores versus IRT scale scores in applications of the CFMT. PMID:25642930
An Ar threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories
International Nuclear Information System (INIS)
Schiappa, Ricardo; Wyllard, Niclas
2010-01-01
We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.
Matrix models as non-commutative field theories on R^3
International Nuclear Information System (INIS)
Livine, Etera R
2009-01-01
In the context of spin foam models for quantum gravity, group field theories are a useful tool allowing on the one hand a non-perturbative formulation of the partition function and on the other hand admitting an interpretation as generalized matrix models. Focusing on 2d group field theories, we review their explicit relation to matrix models and show their link to a class of non-commutative field theories invariant under a quantum-deformed 3d Poincare symmetry. This provides a simple relation between matrix models and non-commutative geometry. Moreover, we review the derivation of effective 2d group field theories with non-trivial propagators from Boulatov's group field theory for 3d quantum gravity. Besides the fact that this gives a simple and direct derivation of non-commutative field theories for the matter dynamics coupled to (3d) quantum gravity, these effective field theories can be expressed as multi-matrix models with a non-trivial coupling between matrices of different sizes. It should be interesting to analyze this new class of theories, both from the point of view of matrix models as integrable systems and for the study of non-commutative field theories.
MODELS AND THE DYNAMICS OF THEORIES
Directory of Open Access Journals (Sweden)
Paulo Abrantes
2007-12-01
Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.
Tests of the power PC theory of causal induction with negative contingencies.
Shanks, David R
2002-01-01
The power PC theory of causal induction (Cheng, 1997) proposes that causal estimates are based on the power p of a potential cause, where p is the contingency between the cause and effect normalized by the base rate of the effect. Previous tests of this theory have concentrated on generative causes that have positive contingencies with their associated outcomes. Here we empirically test this theory in two experiments using preventive causes that have negative contingencies for their outcomes. Contrary to the power PC theory, the results show that causal judgments vary with contingency across conditions of constant power p. This pattern is consistent, however, with several alternative accounts of causal judgment.
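The normalization at the heart of the theory can be made concrete in a short sketch. The formulas are the standard power PC definitions for generative and preventive causes; the probability values below are hypothetical, chosen so that two preventive conditions share the same power p while their contingencies differ, which is exactly the kind of design the experiments use.

```python
# Sketch of the power PC computation (Cheng, 1997): causal power p is
# the contingency dP normalized by the base rate of the effect.

def delta_p(p_e_c, p_e_nc):
    """Contingency: P(effect | cause) - P(effect | no cause)."""
    return p_e_c - p_e_nc

def causal_power(p_e_c, p_e_nc):
    """Causal power p of a potential cause.

    Generative (positive contingency):  p =  dP / (1 - P(e|~c))
    Preventive (negative contingency):  p = -dP / P(e|~c)
    """
    dp = delta_p(p_e_c, p_e_nc)
    if dp >= 0:
        return dp / (1.0 - p_e_nc)
    return -dp / p_e_nc

# Two preventive conditions with different contingencies but equal power:
print(causal_power(0.25, 0.50))  # dP = -0.25, p = 0.25 / 0.50 = 0.5
print(causal_power(0.50, 1.00))  # dP = -0.50, p = 0.50 / 1.00 = 0.5
```

Under the theory, judgments in the two printed conditions should be equal; the finding that judgments instead track the differing contingencies is what the abstract reports as evidence against power PC.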
Generalisation of the test theory of special relativity to non-inertial frames
International Nuclear Information System (INIS)
Abolghasem, G.H.; Khajehpour, M.R.H.; Mansouri, R.
1989-01-01
We present a generalised test theory of special relativity, using a non-inertial frame. Within the framework of the special theory of relativity the transport and Einstein synchronisations are equivalent on a rigidly rotating disc. But in any theory with a preferred frame, such an equivalence does not hold. The time difference resulting from the two synchronisation procedures is a measurable quantity within the reach of existing clock systems on the Earth. The final result contains a term which depends on the angular velocity of the rotating system, and hence measures an absolute effect. This term is of crucial importance in our test theory of special relativity. (Author)
International Nuclear Information System (INIS)
Johnson, C.R.
1986-01-01
In a previous paper (paper I), we developed a method for finding the exact equations of structure and motion of multipole test particles in Einstein's unified field theory: the theory of the nonsymmetric field. In that paper we also applied the method and found in Einstein's unified field theory the equations of structure and motion of neutral pole-dipole test particles possessing no electromagnetic multipole moments. In a second paper (paper II), we applied the method and found in Einstein's unified field theory the exact equations of structure and motion of charged test particles possessing no magnetic monopole moments. In the present paper (paper III), we apply the method and find in Einstein's unified field theory the exact equations of structure and motion of charged test particles possessing magnetic monopole moments. It follows from the form of these equations of structure and motion that in general in Einstein's unified field theory a test particle possessing a magnetic monopole moment in a background electromagnetic field must also possess spin.
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x_t, is embedded within the larger set of m+k candidate variables, (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.
Topological quantum theories and integrable models
International Nuclear Information System (INIS)
Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.
1991-01-01
The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit
Polyacetylene and relativistic field-theory models
International Nuclear Information System (INIS)
Bishop, A.R.; Campbell, D.K.; Fesser, K.
1981-01-01
Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the φ^4 model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed.
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
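The approach the abstract describes can be roughed out in a short sketch: train a first-order, discrete-time, discrete-space Markov chain on an action sequence, then score how predictable the observed play is under the trained model. The action alphabet and the choice of average surprisal (in bits) as the information-theoretic error measure are illustrative assumptions, not details taken from the paper.

```python
# Sketch: modeling routinization with a first-order Markov chain.
# A fully routinized action sequence is highly predictable, so the
# model's average surprisal over observed transitions approaches zero.
from collections import defaultdict
import math

def train_markov_chain(actions):
    """Estimate first-order transition probabilities from an action sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, cur in zip(actions, actions[1:]):
        counts[prev][cur] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

def mean_surprisal(model, actions, eps=1e-9):
    """Average -log2 P(next | current); low values indicate routinized play."""
    bits = [-math.log2(model.get(p, {}).get(c, eps))
            for p, c in zip(actions, actions[1:])]
    return sum(bits) / len(bits)

routine = list("ABABABABAB")           # hypothetical, highly regular play
model = train_markov_chain(routine)
print(mean_surprisal(model, routine))  # ~0 bits: fully predictable
```

In a study design like the one sketched here, the chain would be trained dynamically on a player's running action history and the surprisal of each new action tracked over time, with a downward trend read as evidence of routinization.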
Electroweak theory and the Standard Model
CERN. Geneva; Giudice, Gian Francesco
2004-01-01
There is a natural splitting of the theory of the ElectroWeak (EW) interactions into four sectors, at quite different levels of development and testing. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.
Analytical Model for the End-Bearing Capacity of Tapered Piles Using Cavity Expansion Theory
Directory of Open Access Journals (Sweden)
Suman Manandhar
2012-01-01
Full Text Available On the basis of evidence from model tests showing improved end-bearing behavior of tapered piles in the load-settlement curve, this paper proposes an analytical spherical cavity expansion theory to evaluate the end-bearing capacity. The angle of tapering is incorporated into the proposed model. Results of the proposed model in different types of sand and at different relative densities show improved performance compared with conventional straight piles, and the end-bearing capacity increases with the tapering angle. The paper then extends the model to prototype and full-scale pile tests to predict and validate the end-bearing capacity.
Xu, Jian
2017-01-01
The present study investigated test-taking motivation in L2 listening testing context by applying Expectancy-Value Theory as the framework. Specifically, this study was intended to examine the complex relationships among expectancy, importance, interest, listening anxiety, listening metacognitive awareness, and listening test score using data from a large-scale and high-stakes language test among Chinese first-year undergraduates. Structural equation modeling was used to examine the mediating effect of listening metacognitive awareness on the relationship between expectancy, importance, interest, listening anxiety, and listening test score. According to the results, test takers' listening scores can be predicted by expectancy, interest, and listening anxiety significantly. The relationship between expectancy, interest, listening anxiety, and listening test score was mediated by listening metacognitive awareness. The findings have implications for test takers to improve their test taking motivation and listening metacognitive awareness, as well as for L2 teachers to intervene in L2 listening classrooms.
Banas, Kasia; Lyimo, Ramsey A; Hospers, Harm J; van der Ven, Andre; de Bruin, Marijn
2017-10-01
Combination antiretroviral therapy (cART) for HIV is widely available in sub-Saharan Africa. Adherence is crucial to successful treatment. This study aimed to apply an extended theory of planned behaviour (TPB) model to predict objectively measured adherence to cART in Tanzania. Prospective observational study (n = 158) where patients completed questionnaires on demographics (Month 0), socio-cognitive variables including intentions (Month 1), and action planning and self-regulatory processes hypothesised to mediate the intention-behaviour relationship (Month 3), to predict adherence (Month 5). Taking adherence was measured objectively using the Medication Events Monitoring System (MEMS) caps. Model tests were conducted using regression and bootstrap mediation analyses. Perceived behavioural control (PBC) was positively (β = .767, p behavioural measure, identified PBC as the main driver of adherence intentions. The effect of intentions on adherence was only indirect through self-regulatory processes, which were the main predictor of objectively assessed adherence.
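A percentile-bootstrap test of an indirect effect of the kind reported here (intentions acting on adherence only through self-regulatory processes) can be sketched as follows. The data are simulated and the variable roles are illustrative; this is a generic mediation bootstrap, not the study's actual analysis.

```python
# Sketch: bootstrap test of an indirect (mediated) effect a*b, where
# x -> m (path a) and m -> y controlling for x (path b).
import numpy as np

def indirect_effect(x, m, y):
    """a*b from the two OLS mediation regressions m ~ x and y ~ x + m."""
    a = np.polyfit(x, m, 1)[0]
    Xm = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(Xm, y, rcond=None)[0][2]
    return a * b

def bootstrap_ci(x, m, y, reps=2000, seed=1):
    """95% percentile CI for the indirect effect via case resampling."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = [indirect_effect(x[idx], m[idx], y[idx])
           for idx in (rng.integers(0, n, n) for _ in range(reps))]
    return np.percentile(est, [2.5, 97.5])

rng = np.random.default_rng(0)
x = rng.normal(size=200)                        # e.g. intentions
m = 0.5 * x + rng.normal(scale=0.5, size=200)   # e.g. self-regulation
y = 0.6 * m + rng.normal(scale=0.5, size=200)   # e.g. adherence
lo, hi = bootstrap_ci(x, m, y)
print(lo, hi)  # CI excluding zero -> significant indirect effect
```

A CI that excludes zero while the direct x-to-y path is negligible is the pattern the abstract reports: intentions affect the behaviour only indirectly, through the mediating process.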
Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models
DEFF Research Database (Denmark)
Kristensen, Dennis; Rahbek, Anders
In this paper, we consider a general class of vector error correction models which allow for asymmetric and non-linear error correction. We provide asymptotic results for (quasi-)maximum likelihood (QML) based estimators and tests. General hypothesis testing is considered, where testing...... of non-stationary non-linear time series models. Thus the paper provides a full asymptotic theory for estimators as well as standard and non-standard test statistics. The derived asymptotic results prove to be new compared to results found elsewhere in the literature due to the impact of the estimated...... symmetric non-linear error correction considered. A simulation study shows that the finite sample properties of the bootstrapped tests are satisfactory with good size and power properties for reasonable sample sizes....
Monte Carlo tests of the Rasch model based on scalability coefficients
DEFF Research Database (Denmark)
Christensen, Karl Bang; Kreiner, Svend
2010-01-01
For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence......
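A scalability coefficient of the kind this work builds on can be sketched for a single item pair: Loevinger's H is one minus the ratio of observed Guttman errors (passing the harder item while failing the easier one) to the errors expected under independence. The toy response matrix below is invented; the Monte Carlo step the paper uses would resample data from a fitted Rasch model and recompute such coefficients to obtain a p-value.

```python
# Sketch: pairwise Loevinger's H from binary item responses.
# H = 1 - (observed Guttman errors) / (errors expected under independence).
import numpy as np

def pairwise_H(x, y):
    """H for binary scores on an easier item x and a harder item y."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    observed = np.sum((x == 0) & (y == 1))            # Guttman errors
    expected = np.mean(x == 0) * np.mean(y == 1) * n  # under independence
    return 1.0 - observed / expected

# Perfect Guttman pattern: nobody passes the hard item but fails the easy one.
easy = [1, 1, 1, 1, 0, 0]
hard = [1, 1, 0, 0, 0, 0]
print(pairwise_H(easy, hard))  # 1.0: no Guttman errors
```

H = 1 indicates a perfect Guttman scale for the pair, H = 0 indicates independence, and values in between count scale errors relative to chance; the Monte Carlo test asks whether the observed coefficients are consistent with their distribution under the Rasch model.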
Methodological issues in testing the marginal productivity theory
P.T. Gottschalk (Peter); J. Tinbergen (Jan)
1982-01-01
textabstractPrevious tests of the marginal productivity theory have been criticized on several grounds reviewed by the authors. One important deficiency has been the small number of factor inputs entered in the production functions. In 1978 Gottschalk suggested a method to estimate production
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its stru...... Moreover, it is demonstrated how other controversial hypotheses such as Rational Expectations can be formulated directly as restrictions on the CVAR-parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well...... Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.
Toda theories, W-algebras, and minimal models
International Nuclear Information System (INIS)
Mansfield, P.; Spence, B.
1991-01-01
We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)
Disrupted cortical connectivity theory as an explanatory model for autism spectrum disorders
Kana, Rajesh K.; Libero, Lauren E.; Moore, Marie S.
2011-12-01
Recent findings of neurological functioning in autism spectrum disorder (ASD) point to altered brain connectivity as a key feature of its pathophysiology. The cortical underconnectivity theory of ASD (Just et al., 2004) provides an integrated framework for addressing these new findings. This theory suggests that weaker functional connections among brain areas in those with ASD hamper their ability to accomplish complex cognitive and social tasks successfully. We will discuss this theory, but will modify the term underconnectivity to ‘disrupted cortical connectivity’ to capture patterns of both under- and over-connectivity in the brain. In this paper, we will review the existing literature on ASD to marshal supporting evidence for hypotheses formulated on the disrupted cortical connectivity theory. These hypotheses are: 1) underconnectivity in ASD is manifested mainly in long-distance cortical as well as subcortical connections rather than in short-distance cortical connections; 2) underconnectivity in ASD is manifested only in complex cognitive and social functions and not in low-level sensory and perceptual tasks; 3) functional underconnectivity in ASD may be the result of underlying anatomical abnormalities, such as problems in the integrity of white matter; 4) the ASD brain adapts to underconnectivity through compensatory strategies such as overconnectivity mainly in frontal and in posterior brain areas. This may be manifested as deficits in tasks that require frontal-parietal integration. While overconnectivity can be tested by examining the cortical minicolumn organization, long-distance underconnectivity can be tested by cognitively demanding tasks; and 5) functional underconnectivity in brain areas in ASD will be seen not only during complex tasks but also during task-free resting states. We will also discuss some empirical predictions that can be tested in future studies, such as: 1) how disrupted connectivity relates to cognitive impairments in skills
Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee
2013-07-01
Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large sample distribution of the residual is proved to be standardized normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
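The general idea of an item-fit residual can be sketched as follows: compare the observed proportion correct in an ability group with the model-implied probability and standardize by the binomial standard error. This is a simplified residual of the Hambleton et al. type against a 2PL item characteristic curve, not the authors' specific maximum-likelihood-versus-ratio estimator, and the item parameters and group counts are invented.

```python
# Sketch: a standardized item-fit residual for a unidimensional IRT model.
import math

def icc_2pl(theta, a, b):
    """2PL item characteristic curve: P(correct | theta)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def standardized_residual(n_correct, n_total, theta, a, b):
    """(observed proportion - model probability) / binomial SE."""
    p = icc_2pl(theta, a, b)
    se = math.sqrt(p * (1 - p) / n_total)
    return (n_correct / n_total - p) / se

# 100 examinees near theta = 0 on an item with a = 1, b = 0: the model
# predicts 50 correct, so 57 correct gives (0.57 - 0.5) / 0.05 ≈ 1.4.
print(standardized_residual(57, 100, theta=0.0, a=1.0, b=0.0))
```

When the model fits, such residuals are approximately standard normal across ability groups (as the paper proves for its residual), so values far outside ±2 flag misfitting items.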
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations.
Precision tests of quantum chromodynamics and the standard model
International Nuclear Information System (INIS)
Brodsky, S.J.; Lu, H.J.
1995-06-01
The authors discuss three topics relevant to testing the Standard Model to high precision: commensurate scale relations, which relate observables to each other in perturbation theory without renormalization scale or scheme ambiguity; the relationship of compositeness to anomalous moments; and new methods for measuring the anomalous magnetic and quadrupole moments of the W and Z.
International Nuclear Information System (INIS)
Langacker, P.
1981-01-01
In this talk I discuss the present status of these theories and of their observational and experimental implications. In section II, I briefly review the standard SU(3)_c x SU(2) x U(1) model of the strong and electroweak interactions. Although phenomenologically successful, the standard model leaves many questions unanswered. Some of these questions are addressed by grand unified theories, which are defined and discussed in Section III. The Georgi-Glashow SU(5) model is described, as are theories based on larger groups such as SO(10), E_6, or SO(16). It is emphasized that there are many possible grand unified theories and that it is an experimental problem not only to test the basic ideas but to discriminate between models. (orig./HSI)
Plotnikoff, Ronald C; Lubans, David R; Penfold, Chris M; Courneya, Kerry S
2014-05-01
Theory-based interventions to promote physical activity (PA) are more effective than atheoretical approaches; however, the comparative utility of theoretical models is rarely tested in longitudinal designs with multiple time points. Further, there is limited research that has simultaneously tested social-cognitive models with self-report and objective PA measures. The primary aim of this study was to test the predictive ability of three theoretical models (social cognitive theory, theory of planned behaviour, and protection motivation theory) in explaining PA behaviour. Participants were adults with type 2 diabetes (n = 287, 53.8% males, mean age = 61.6 ± 11.8 years). Theoretical constructs across the three theories were tested to prospectively predict PA behaviour (objective and self-report) across three 6-month time intervals (baseline-6, 6-12, 12-18 months) using structural equation modelling. PA outcomes were steps/3 days (objective) and minutes of MET-weighted PA/week (self-report). The mean proportion of variance in PA explained by these models was 6.5% for objective PA and 8.8% for self-report PA. Direct pathways to PA outcomes were stronger for self-report compared with objective PA. These theories explained a small proportion of the variance in longitudinal PA studies. Theory development to guide interventions for increasing and maintaining PA in adults with type 2 diabetes requires further research with objective measures. Theory integration across social-cognitive models and the inclusion of ecological levels are recommended to further explain PA behaviour change in this population. Statement of contribution What is already known on this subject? Social-cognitive theories are able to explain partial variance for physical activity (PA) behaviour. What does this study add? The testing of three theories in a longitudinal design over 3, 6-month time intervals. The parallel use and comparison of both objective and self-report PA measures in testing these
Applying complexity theory: A primer for identifying and modeling firm anomalies
Directory of Open Access Journals (Sweden)
Arch G. Woodside
2018-01-01
Full Text Available This essay elaborates on the usefulness of embracing complexity theory, modeling outcomes rather than directionality, and modeling complex rather than simple outcomes in strategic management. Complexity theory includes the tenet that most antecedent conditions are neither sufficient nor necessary for the occurrence of a specific outcome. Identifying a firm by individual antecedents (i.e., non-innovative versus highly innovative, small versus large size in sales or number of employees, or serving local versus international markets) provides shallow information in modeling specific outcomes (e.g., high sales growth or high profitability), even if directional analyses (e.g., regression analysis, including structural equation modeling) indicate that the independent (main) effects of the individual antecedents relate to outcomes directionally, because firm (case) anomalies almost always occur as exceptions to main effects. Examples: a number of highly innovative firms have low sales while others have high sales, and a number of non-innovative firms have low sales while others have high sales. Breaking away from the current dominant logic of directionality testing, null hypothesis significance testing (NHST), to embrace somewhat precise outcome testing (SPOT) is necessary for extracting highly useful information about the causes of anomalies: associations opposite to expected and "statistically significant" main effects. The study of anomalies extends to identifying the occurrences of four-corner strategy outcomes: firms doing well in favorable circumstances, firms doing badly in favorable circumstances, firms doing well in unfavorable circumstances, and firms doing badly in unfavorable circumstances. Models of four-corner strategy outcomes advance strategic management beyond the current dominant logic of directional modeling of single outcomes.
Testing quantity theory of money for the Turkish economy
Levent, Korap
2007-01-01
In this paper, the main assumptions of the Quantity Theory of Money are tested for the Turkish economy. Using contemporaneous estimation techniques to examine the long-run stationary economic relationships on which the quantity theory is constructed, it is found that the stationary characteristics of the velocities of narrowly and broadly defined monetary aggregates cannot be rejected. However, monetary aggregates seem to be endogenous to the long-run evolution of prices and rea...
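The stationarity claim at the heart of the quantity theory can be made concrete with a toy computation. The sketch below uses invented figures (not Turkish data) and only the equation-of-exchange identity M·V = P·Y:

```python
# The equation of exchange M*V = P*Y implies an income velocity V = P*Y / M.
# The quantity theory's long-run claim is that V is stationary.
def velocity(nominal_gdp, money_stock):
    """Income velocity of money implied by M*V = P*Y."""
    return [gdp / m for gdp, m in zip(nominal_gdp, money_stock)]

# Hypothetical series: money grows in step with nominal GDP,
# so the implied velocity stays constant at 2.0.
py = [100.0, 110.0, 120.0, 130.0]   # nominal GDP, P*Y
m = [50.0, 55.0, 60.0, 65.0]        # money aggregate, M
v = velocity(py, m)
print(v)  # [2.0, 2.0, 2.0, 2.0]
```

A formal analysis would instead apply a unit-root test to the (log) velocity series; the point here is only the identity from which velocity is computed.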
Halbwachs and Durkheim: a test of two theories of suicide.
Travis, R
1990-06-01
The social integration hypothesis forms the basis of this study. It was first asserted by Durkheim in late nineteenth-century France and many of his assumptions are based on a social disorganizational model. This model tended to equate social change with the breakdown of social control and many of Durkheim's notions about anomie are derived from this view of industrial society. Halbwachs, on the other hand, proposed a social psychological theory of suicide. His model specifies more clearly the conditions under which lack of social integration may induce suicide. This study shows that among a population in transition, the Alaska Natives, the suicide rate was explained by the Halbwachsian model at least as well as the Durkheimian one and sometimes better. The Durkheimian model is shown to reflect a Cartesian dualism, which accounts only for that which is observable, thus making for biased studies of suicide. Moreover, psychopathological research confirms the Halbwachsian model. These findings restore the social isolation theory, once long neglected, to its rightful place among theories of suicide and open up an important field for researchers seeking to understand high rates of suicide.
Testing In College Admissions: An Alternative to the Traditional Predictive Model.
Lunneborg, Clifford E.
1982-01-01
A decision-making or utility theory model (which deals effectively with affirmative action goals and allows standardized tests to be placed in the service of those goals) is discussed as an alternative to traditional predictive admissions. (Author/PN)
Conceptual Models and Theory-Embedded Principles on Effective Schooling.
Scheerens, Jaap
1997-01-01
Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…
Contribution to the study of conformal theories and integrable models
International Nuclear Information System (INIS)
Sochen, N.
1992-05-01
The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes the 2-D models that have translation, rotation, and dilatation symmetries at their critical point. The extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the singular vector form in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A program of rational theory classification is described, linking rational conformal theories and spin integrable models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models can be proved by direct calculation of their Boltzmann weights.
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
DEFF Research Database (Denmark)
Hirst, Andrew G.; Glazier, Douglas S.; Atkinson, David
2014-01-01
Metabolism fuels all of life’s activities, from biochemical reactions to ecological interactions. According to two intensely debated theories, body size affects metabolism via geometrical influences on the transport of resources and wastes. However, these theories differ crucially in whether the size dependence of metabolism is derived from material transport across external surfaces, or through internal resource-transport networks. We show that when body shape changes during growth, these models make opposing predictions. These models are tested using pelagic invertebrates, because these animals exhibit highly variable intraspecific scaling relationships for metabolic rate and body shape. Metabolic scaling slopes of diverse integument-breathing species were significantly positively correlated with degree of body flattening or elongation during ontogeny, as expected from surface area...
Staircase Models from Affine Toda Field Theory
Dorey, P; Dorey, Patrick; Ravanini, Francesco
1993-01-01
We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.
Directory of Open Access Journals (Sweden)
Ntogwa Ng'habi Bundala
2012-01-01
Full Text Available This empirical study focused predominantly on validity tests of three theories of capital structure in the Tanzanian context: the static trade-off theory, the pecking order theory (information asymmetry theory), and agency cost theory. The study used secondary data from eight of the non-financial companies listed on the Dar es Salaam Stock Exchange (DSE) from 2006-2012. The study used a descriptive (quantitative) approach to test the practicality of the theories in Tanzania. A multiple regression model was used to test the theoretical relationship between financial leverage and the characteristics of the company. The research found that there is no strong evidence for validation of the static trade-off theory and little support for the pecking order theory, but the agency cost theory is confirmed to be valid and practiced in Tanzania. It is recommended that Tanzanian companies adhere to the determinants of capital structure in the Tanzanian context found by this study.
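The leverage regression described above can be sketched as follows; the regressors (a size proxy and a profitability proxy) and the data are hypothetical, not the DSE sample:

```python
import numpy as np

# Toy cross-section: leverage_i = b0 + b1*size_i + b2*profit_i, noiseless,
# so ordinary least squares recovers the coefficients exactly.
size = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # e.g. log of sales
profit = np.array([0.10, 0.12, 0.08, 0.15, 0.11]) # e.g. return on assets
leverage = 0.2 + 0.05 * size - 0.3 * profit       # made-up true relation

X = np.column_stack([np.ones_like(size), size, profit])
beta, *_ = np.linalg.lstsq(X, leverage, rcond=None)
print(beta)  # close to [0.2, 0.05, -0.3]
```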
The director task: A test of Theory-of-Mind use or selective attention?
Rubio-Fernández, Paula
2017-08-01
Over two decades, the director task has increasingly been employed as a test of the use of Theory of Mind in communication, first in psycholinguistics and more recently in social cognition research. A new version of this task was designed to test two independent hypotheses. First, optimal performance in the director task, as established by the standard metrics of interference, is possible by using selective attention alone, and not necessarily Theory of Mind. Second, pragmatic measures of Theory-of-Mind use can reveal that people actively represent the director's mental states, contrary to recent claims that they only use domain-general cognitive processes to perform this task. The results of this study support both hypotheses and provide a new interactive paradigm to reliably test Theory-of-Mind use in referential communication.
Irreducible integrable theories form tensor products of conformal models
International Nuclear Information System (INIS)
Mathur, S.D.; Warner, N.P.
1991-01-01
By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)
Simple Theory for the Dynamics of Mean-Field-Like Models of Glass-Forming Fluids
Szamel, Grzegorz
2017-10-01
We propose a simple theory for the dynamics of model glass-forming fluids, which should be solvable using a mean-field-like approach. The theory is based on transparent physical assumptions, which can be tested in computer simulations. The theory predicts an ergodicity-breaking transition that is identical to the so-called dynamic transition predicted within the replica approach. Thus, it can provide the missing dynamic component of the random first order transition framework. In the large-dimensional limit the theory reproduces the result of a recent exact calculation of Maimbourg et al. [Phys. Rev. Lett. 116, 015902 (2016), 10.1103/PhysRevLett.116.015902]. Our approach provides an alternative, physically motivated derivation of this result.
Hermes, Matthew R.; Dukelsky, Jorge; Scuseria, Gustavo E.
2017-06-01
The failures of single-reference coupled-cluster theory for strongly correlated many-body systems are flagged at the mean-field level by the spontaneous breaking of one or more physical symmetries of the Hamiltonian. Restoring the symmetry of the mean-field determinant by projection reveals that coupled-cluster theory fails because it factorizes high-order excitation amplitudes incorrectly. However, symmetry-projected mean-field wave functions do not account sufficiently for dynamic (or weak) correlation. Here we pursue a merger of symmetry projection and coupled-cluster theory, following previous work along these lines that utilized the simple Lipkin model system as a test bed [J. Chem. Phys. 146, 054110 (2017), 10.1063/1.4974989]. We generalize the concept of a symmetry-projected mean-field wave function to the concept of a symmetry-projected state, in which the factorization of high-order excitation amplitudes in terms of low-order ones is guided by symmetry projection and is not exponential, and combine them with coupled-cluster theory in order to model the ground state of the Agassi Hamiltonian. This model has two separate channels of correlation and two separate physical symmetries which are broken under strong correlation. We show how the combination of symmetry collective states and coupled-cluster theory is effective in obtaining correlation energies and order parameters of the Agassi model throughout its phase diagram.
Testing oligopoly theory in the Lab
Georgantzis, Nikolaos
2006-01-01
Previous experimental results are reviewed to address the extent to which oligopolistic equilibria are good predictors of behavior observed in laboratory experiments with human agents. Although the theory is unrealistically demanding with respect to the agents' informational and rational endowments, experimental results obtained in more realistic settings with subjects using trial-and-error decision mechanisms tend to confirm predictions of simple symmetric theoretical models. However, in the...
The Birth of Model Theory: Löwenheim's Theorem in the Frame of the Theory of Relatives
Badesa, Calixto
2008-01-01
Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali
Models and theories of prescribing decisions: A review and suggested a new model.
Murshid, Mohsen Ali; Mohaidin, Zurina
2017-01-01
To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than theoretical approach. Therefore, this review attempts to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as persuasion theory (the elaboration likelihood model), the stimulus-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.
An experimental test of control theory-based interventions for physical activity.
Prestwich, Andrew; Conner, Mark; Hurling, Robert; Ayres, Karen; Morris, Ben
2016-11-01
To provide an experimental test of control theory to promote physical activity. Parallel groups, simple randomized design with an equal chance of allocation to any group. Participants not meeting recommended levels of physical activity but physically safe to do so (N = 124) were recruited on a UK university campus and randomized to goal-setting + self-monitoring + feedback (GS + SM + F, n = 40), goal-setting + self-monitoring (GS + SM, n = 40), or goal-setting only (GS, n = 44) conditions that differentially tapped the key features of control theory. Accelerometers assessed physical activity (primary outcome) as well as self-report over a 7-day period directly before/after the start of the intervention. The participants in the GS + SM + F condition significantly outperformed those in the GS condition, d = 0.62, 95% CI d = 0.15-1.08, and marginally outperformed those in the GS + SM condition in terms of total physical activity at follow-up on the accelerometer measure, d = 0.33, 95% CI d = -0.13 to 0.78. The feedback manipulation (GS + SM + F vs. GS + SM and GS) was most effective when baseline intentions were weak. These patterns did not emerge on the self-report measure but, on the basis of this measure, the feedback manipulation increased the risk that participants coasted in relation to their goal in the first few days of the intervention period. Using behaviour change techniques consistent with control theory can lead to significant short-term improvements on objectively assessed physical activity. Further research is needed to examine the underlying theoretical principles of the model. Statement of contribution What is already known on this subject? Interventions incorporating more techniques that are consistent with control theory are associated with larger positive changes in health behaviours and related outcomes (see reviews by Dombrowski et al. and Michie et al.). However, none of the studies included in these
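The effect sizes reported above are Cohen's d values; a minimal sketch of the standard pooled-SD computation, with illustrative numbers rather than the study's data:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference with a pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Two hypothetical groups with equal SDs and means one SD apart: d = 1.0
print(cohens_d(60.0, 10.0, 40, 50.0, 10.0, 44))  # 1.0
```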
Directory of Open Access Journals (Sweden)
Michael H. Birnbaum
2012-07-01
Full Text Available Individual true and error theory assumes that responses by the same person to the same choice problem within a block of trials are based on the same true preferences but may show preference reversals due to random error. Between blocks, a person's true preferences may differ or stay the same. This theory is illustrated with studies testing two critical properties that distinguish models of risky decision making: (1) restricted branch independence, which is implied by original prospect theory and violated in a specific way by both cumulative prospect theory and the priority heuristic; and (2) stochastic dominance, which is implied by cumulative prospect theory. Corrected for random error, most individuals systematically violated stochastic dominance, ruling out cumulative prospect theory. Furthermore, most people violated restricted branch independence in the way opposite to that predicted by that theory and the priority heuristic. Both violations are consistent with the transfer of attention exchange model. No one was found whose data were compatible with cumulative prospect theory, except for those that were also compatible with expected utility, and no one satisfied the priority heuristic.
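The error-correction idea can be sketched for the simplest case: a fixed true preference for option A and two repetitions of the same choice within a block, each subject to an assumed independent error rate e. This is a generic illustration of the model's logic, not Birnbaum's estimation procedure:

```python
# With true preference A and per-response error rate e, the four possible
# repeat-response patterns have simple product probabilities:
def response_pattern_probs(e):
    return {
        "AA": (1 - e) ** 2,   # consistent responding
        "AB": e * (1 - e),    # one reversal, attributable to error
        "BA": (1 - e) * e,
        "BB": e ** 2,         # two errors masquerade as a stable B preference
    }

p = response_pattern_probs(0.1)
print(round(p["AA"], 2), round(p["AB"] + p["BA"], 2))  # 0.81 0.18
```

Fitting these pattern probabilities across blocks is what lets the theory separate genuine changes in true preference from mere response error.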
Testing the Neutral Theory of Biodiversity with Human Microbiome Datasets.
Li, Lianwei; Ma, Zhanshan Sam
2016-08-16
The human microbiome project (HMP) has made it possible to test important ecological theories for arguably the ecosystem most important to human health: the human microbiome. The limited number of existing studies have reported conflicting evidence in the case of the neutral theory; the present study aims to comprehensively test the neutral theory with extensive HMP datasets covering all five major body sites inhabited by the human microbiome. Utilizing 7437 datasets of bacterial community samples, we discovered that only 49 communities (less than 1%) satisfied the neutral theory, and concluded that human microbial communities are not neutral in general. The 49 positive cases, although only a tiny minority, do demonstrate the existence of neutral processes. We realize that the traditional doctrine of microbial biogeography "Everything is everywhere, but the environment selects", first proposed by Baas-Becking, resolves the apparent contradiction. The first part of the Baas-Becking doctrine states that microbes are not dispersal-limited and therefore are neutral prone, and the second part reiterates that the freely dispersed microbes must endure selection by the environment. Therefore, in most cases, it is the host environment that ultimately shapes the community assembly and tips the human microbiome to the niche regime.
Astrophysical tests for the Novello-De Lorenci-Luciane theory of gravity
International Nuclear Information System (INIS)
Mosquera Cuesta, H.J.
2001-01-01
The Novello-DeLorenci-Luciane (NDL) field theory of gravitation predicts that gravitational waves (GWs) follow geodesics of a modified (effective) geometry with a speed lower than the velocity of light. The theory also demonstrates that GWs exhibit the phenomenon of birefringence, formerly believed to be exclusive to electromagnetic waves. Here, prospective astrophysical tests of these predictions are proposed. I point out that future measurements of gravitational waves in coincidence with a non-gravitational process such as a neutrino burst (and likely a burst of gamma-rays) may prove useful to discriminate among all the existing theories of gravity. It is also stressed that microlensing of gravitational waves emitted by known galactic sources (i.e., pulsars) in the bulge, lensed by either the Galaxy's central black hole (Sgr A*) or a MACHO object adrift among the Milky Way's stars, may provide a clean test of the birefringence phenomenon implied by the NDL gravity theory. (author)
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
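As a rough illustration of the fluid-analogy approach (the equation form, gain, and time constant below are expository assumptions, not the paper's fitted model), one SCT construct can be treated as a reservoir filled by an input and drained by a first-order leak:

```python
# Euler simulation of dx/dt = (gain * u - x) / tau: a "reservoir" of, say,
# self-efficacy driven by a constant input u (e.g., mastery experience).
def simulate(u, gain=1.0, tau=4.0, steps=40, x0=0.0):
    x = x0
    trace = []
    for _ in range(steps):
        x += (gain * u - x) / tau   # inflow minus first-order leakage
        trace.append(x)
    return trace

trace = simulate(u=10.0)
print(round(trace[-1], 2))  # approaches the steady state gain * u = 10.0
```

Coupling several such reservoirs, with the output of one feeding the input of another, is what turns reciprocal determinism into a testable dynamical system.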
Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang
2018-03-12
Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike classical test theory, IRT models aggregate the item-level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach by adding a graphical component to a multidimensional IRT model that can offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both a unidimensional latent trait and multidimensional latent traits.
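For context, the item-level building block of IRT is a logistic response function. The sketch below is a generic two-parameter logistic (2PL) item, not the paper's fused latent-and-graphical model:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct) given ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

print(p_correct(0.0, 1.0, 0.0))            # 0.5 when ability equals difficulty
print(round(p_correct(2.0, 1.0, 0.0), 3))  # 0.881: higher ability, higher P
```

Local independence means responses to different items are independent conditional on theta; the paper's graphical component models departures from exactly this assumption.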
Forced vibration tests of a model foundation on rock ground
International Nuclear Information System (INIS)
Kisaki, N.; Siota, M.; Yamada, M.; Ikeda, A.; Tsuchiya, H.; Kitazawa, K.; Kuwabara, Y.; Ogiwara, Y.
1983-01-01
The response of very stiff structures, such as nuclear reactor buildings, to earthquake ground motion is significantly affected by radiation damping due to the soil-structure interaction. The radiation damping can be computed by vibration admittance theory or dynamical ground compliance theory. In order to apply the values derived from these theories to the practical problems, comparative studies between theoretical results and experimental results concerning the soil-structure interaction, especially if the ground is rock, are urgently needed. However, experimental results for rock are less easily obtained than theoretical ones. The purpose of this paper is to describe the harmonic excitation tests of a model foundation on rock and to describe the results of comparative studies. (orig./HP)
Towards strong field tests of beyond Horndeski gravity theories
Sakstein, Jeremy; Babichev, Eugeny; Koyama, Kazuya; Langlois, David; Saito, Ryo
2017-03-01
Theories of gravity in the beyond Horndeski class encompass a wide range of scalar-tensor theories that will be tested on cosmological scales over the coming decade. In this work, we investigate the possibility of testing them in the strong field regime by looking at the properties of compact objects (neutron, hyperon, and quark stars) embedded in an asymptotically de Sitter space-time, for a specific subclass of theories. We extend previous works to include slow rotation and find a relation between the dimensionless moment of inertia (Ī = Ic²/(G_N M³)) and the compactness C = G_N M/(Rc²) (an Ī-C relation), independent of the equation of state, that is reminiscent of but distinct from the general relativity prediction. Several of our equations of state contain hyperons and free quarks, allowing us to revisit the hyperon puzzle. We find that the maximum mass of hyperon stars can be larger than 2 M⊙ for small values of the beyond Horndeski parameter, thus providing a resolution of the hyperon puzzle based on modified gravity. Moreover, stable quark stars exist when hyperonic stars are unstable, which means that the phase transition from hyperon to quark stars is predicted just as in general relativity (GR), albeit with larger quark star masses. Two important and potentially observable consequences of some of the theories we consider are the existence of neutron stars in a range of masses significantly higher than in GR and Ī-C relations that differ from their GR counterparts. In the former case, we find objects that, if observed, could not be accounted for in GR because they violate the usual GR causality condition. We end by discussing several difficult technical issues that remain to be addressed in order to reach more realistic predictions that may be tested using gravitational wave searches or neutron star observations.
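The compactness entering the Ī-C relation is straightforward to evaluate; a toy computation with fiducial values (a 1.4 M⊙ star of 12 km radius, not the paper's results), using the definition C = G_N M/(Rc²) from the abstract:

```python
# Fiducial constants (rounded values)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

M = 1.4 * M_sun    # a typical neutron-star mass
R = 12.0e3         # a typical radius, 12 km

C = G * M / (R * c**2)
print(round(C, 3))  # ~0.17, comfortably below GR's causal upper bound
```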
Holman, Gordon D.
1989-01-01
The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.
Optimal closed-loop identification test design for internal model control
Zhu, Y.; Bosch, van den P.P.J.
2000-01-01
In this work, optimal closed-loop test design for control is studied. Simple design formulas are derived based on the asymptotic theory of Ljung. The control scheme used is internal model control (IMC) and the design constraint is the power of the process output or that of the reference signal. The
Liu, Xun
2010-01-01
This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…
Are white evangelical Protestants lower class? A partial test of church-sect theory.
Schwadel, Philip
2014-07-01
Testing hypotheses derived from church-sect theory and contemporary research about changes in evangelical Protestants' social status, I use repeated cross-sectional survey data spanning almost four decades to examine changes in the social-class hierarchy of American religious traditions. While there is little change in the social-class position of white evangelical Protestants from the early 1970s to 2010, there is considerable change across birth cohorts. Results from hierarchical age-period-cohort models show: (1) robust, across-cohort declines in social-class differences between white evangelical Protestants and liberal Protestants, affiliates of "other" religions, and the unaffiliated; (2) stability in social-class differences between white evangelical Protestants and moderate, Pentecostal, and nondenominational Protestants; (3) moderate across-cohort growth in social-class differences between white evangelical Protestants and Catholics; and (4) variation in these patterns across indicators of social class. The findings in this article provide partial support for church-sect theory as well as other theories of social change that emphasize the pivotal role of generations.
A general diagnostic model applied to language testing data.
von Davier, Matthias
2008-11-01
Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.
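One of the GDM special cases named above, the generalized partial credit model, can be sketched as follows. This is an illustrative implementation of the standard GPCM category probabilities, not code from the paper; the item parameters are assumed values.

```python
import math

def gpcm_probs(theta, a, b):
    """P(X = k | theta) under the generalized partial credit model,
    for an item with discrimination a and step difficulties
    b = [b_1, ..., b_m]; returns probabilities for k = 0..m."""
    # cumulative sums of a*(theta - b_j); the empty sum gives k = 0
    z = [0.0]
    for bj in b:
        z.append(z[-1] + a * (theta - bj))
    denom = sum(math.exp(v) for v in z)
    return [math.exp(v) / denom for v in z]

# illustrative 4-category item (assumed parameters)
probs = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.0])
print(probs)  # four category probabilities summing to 1
```

Setting a = 1 recovers the partial credit (Rasch-family) special case, matching the abstract's point that many well-known models nest inside the GDM.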
A 'theory of everything'? [Extending the Standard Model]
International Nuclear Information System (INIS)
Ross, G.G.
1993-01-01
The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate "theory of everything". In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently we discuss the standard model. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.
Modeling the fluid/soil interface erosion in the Hole Erosion Test
Directory of Open Access Journals (Sweden)
Kissi B.
2012-07-01
Soil erosion is a complex phenomenon whose final stage yields insidious fluid leakages under hydraulic infrastructures, known as piping, which are the main cause of their rupture. The Hole Erosion Test is commonly used to quantify the rate of piping erosion. In this work, the Hole Erosion Test is modelled using the Fluent software package. The aim is to predict the erosion rate of soil during the hole erosion test. The renormalization group theory-based k-ε turbulence model equations are used. This modelling makes it possible to describe the effect of the clay concentration in flowing water on erosion. Unlike the usual one-dimensional models, the proposed modelling shows that erosion is not uniform along the hole length. In particular, the concentration of clay is found to increase the erosion rate noticeably.
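The one-dimensional interpretation of Hole Erosion Test data that the abstract contrasts with its CFD model commonly rests on a linear excess-shear erosion law. The sketch below illustrates that law only; it is not the paper's Fluent model, and the coefficient values are assumptions for illustration.

```python
def erosion_rate(tau, tau_c, k_er):
    """Mass removal rate per unit area under the linear excess-shear law:
    k_er * (tau - tau_c) above the critical shear stress tau_c,
    zero below it (no erosion without excess shear)."""
    return k_er * max(tau - tau_c, 0.0)

# illustrative values: wall shear tau in Pa, erosion coefficient k_er in s/m
print(erosion_rate(tau=80.0, tau_c=50.0, k_er=1e-4))  # ≈ 0.003 kg/m^2/s
print(erosion_rate(tau=40.0, tau_c=50.0, k_er=1e-4))  # 0.0 (below threshold)
```

In the one-dimensional picture, tau is uniform along the hole; the paper's point is that a resolved turbulent flow yields a non-uniform tau, and hence non-uniform erosion, along the hole length.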
Further tests of belief-importance theory.
Directory of Open Access Journals (Sweden)
K V Petrides
Belief-importance (belimp) theory hypothesizes that personality traits confer a propensity to perceive convergences or divergences between the belief that we can attain certain goals and the importance that we place on these goals. Belief and importance are conceptualized as two coordinates, together defining the belimp plane. We tested fundamental aspects of the theory using four different planes based on the life domains of appearance, family, financial security, and friendship, as well as a global plane combining these four domains. The criteria were from the areas of personality (Big Five and trait emotional intelligence) and learning styles. Two hundred and fifty-eight participants were allocated to the four quadrants of the belimp plane (Hubris, Motivation, Depression, and Apathy) according to their scores on four reliable instruments. Most hypotheses were supported by the data. Results are discussed with reference to the stability of the belimp classifications under different life domains and the relationship of the quadrants with the personality traits that are hypothesized to underpin them.
Effective field theory and the quark model
International Nuclear Information System (INIS)
Durand, Loyal; Ha, Phuoc; Jaczko, Gregory
2001-01-01
We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections.
Integrated Moral Conviction Theory of Student Cheating: An Empirical Test
Roberts, Foster; Thomas, Christopher H.; Novicevic, Milorad M.; Ammeter, Anthony; Garner, Bart; Johnson, Paul; Popoola, Ifeoluwa
2018-01-01
In this article, we develop an "integrated moral conviction theory of student cheating" by integrating moral conviction with (a) the dual-process model of Hunt-Vitell's theory that gives primacy to individual ethical philosophies when moral judgments are made and (b) the social cognitive conceptualization that gives primacy to moral…
Testing Theories of Recognition Memory by Predicting Performance Across Paradigms
Smith, David G.; Duncan, Matthew J. J.
2004-01-01
Signal-detection theory (SDT) accounts of recognition judgments depend on the assumption that recognition decisions result from a single familiarity-based process. However, fits of a hybrid SDT model, called dual-process theory (DPT), have provided evidence for the existence of a second, recollection-based process. In 2 experiments, the authors…
Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.
Marsh, K L; Johnson, B T; Carey, M P
2001-09-01
Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely describing the effects of health promotion; rather, it should be on understanding and explaining phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., mere effect size estimation or, preferably, theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.
Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Hsu, Chia-Ning
2011-01-01
This study intends to investigate factors affecting business employees' behavioral intentions to use the e-learning system. Combining the innovation diffusion theory (IDT) with the technology acceptance model (TAM), the present study proposes an extended technology acceptance model. The proposed model was tested with data collected from 552…
Minisuperspace models in histories theory
International Nuclear Information System (INIS)
Anastopoulos, Charis; Savvidou, Ntina
2005-01-01
We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context
Theories of coalition formation: An empirical test using data from Danish local government
DEFF Research Database (Denmark)
Skjæveland, Asbjørn; Serritzlew, Søren; Blom-Hansen, Jens
2007-01-01
Theories of coalition formation represent a diverse set of arguments about why some government coalitions form while others do not. In this article, the authors present a systematic empirical test of the relative importance of the various arguments. The test is designed to avoid a circularity problem present in many coalition studies - namely that the theories are tested on data of national government coalitions in postwar Europe: the very data that gave rise to the theories in the first place. Instead, the authors focus on government coalitions at the municipal level. They base their analysis on office and policy motives. At the same time, the analysis raises the question of whether actors really seek minimal coalitions.
Proposed experimental test of an alternative electrodynamic theory of superconductors
Energy Technology Data Exchange (ETDEWEB)
Hirsch, J.E., E-mail: jhirsch@ucsd.edu
2015-01-15
Highlights: • A new experimental test of electric screening in superconductors is proposed. • The electric screening length is predicted to be much larger than in normal metals. • The reason this was not seen in earlier experiments is explained. • This is not predicted by the conventional BCS theory of superconductivity. - Abstract: An alternative form of London’s electrodynamic theory of superconductors predicts that the electrostatic screening length is the same as the magnetic penetration depth. We argue that experiments performed to date do not rule out this alternative formulation and propose an experiment to test it. Experimental evidence in its favor would have fundamental implications for the understanding of superconductivity.
Quantum Link Models and Quantum Simulation of Gauge Theories
International Nuclear Information System (INIS)
Wiese, U.J.
2015-01-01
This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing, and pioneers of quantum computing and quantum simulations of quantum spin systems are introduced. The second part is about high-temperature superconductors versus QCD, Wilson's lattice QCD and Abelian quantum link models. The third part deals with quantum simulators for Abelian lattice gauge theories and non-Abelian quantum link models. The last part discusses quantum simulators mimicking 'nuclear' physics and the continuum limit of D-theory models. (nowak)
Item response theory - A first approach
Nunes, Sandra; Oliveira, Teresa; Oliveira, Amílcar
2017-07-01
The Item Response Theory (IRT) has become one of the most popular scoring frameworks for measurement data, frequently used in computerized adaptive testing, cognitively diagnostic assessment and test equating. According to Andrade et al. (2000), IRT can be defined as a set of mathematical models (Item Response Models - IRM) constructed to represent the probability of an individual giving the right answer to an item of a particular test. The number of Item Response Models available for measurement analysis has increased considerably in the last fifteen years due to increasing computer power and a demand for accuracy and more meaningful inferences grounded in complex data. The developments in modeling with Item Response Theory were related to developments in estimation theory, most remarkably Bayesian estimation with Markov chain Monte Carlo algorithms (Patz & Junker, 1999). The popularity of Item Response Theory has also led to numerous overviews in books and journals, and many connections between IRT and other statistical estimation procedures, such as factor analysis and structural equation modeling, have been made repeatedly (van der Linden & Hambleton, 1997). As stated before, Item Response Theory covers a variety of measurement models, ranging from basic one-dimensional models for dichotomously and polytomously scored items and their multidimensional analogues to models that incorporate information about cognitive sub-processes which influence the overall item response process. The aim of this work is to introduce the main concepts associated with one-dimensional models of Item Response Theory, to specify the logistic models with one, two and three parameters, to discuss some properties of these models and to present the main estimation procedures.
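The one-, two- and three-parameter logistic models this abstract sets out to specify can be sketched in a few lines. This is the standard 3PL form with its nested special cases; the parameter values below are illustrative, not from the paper.

```python
import math

def p_correct(theta, b, a=1.0, c=0.0):
    """3PL: P(correct) = c + (1 - c) / (1 + exp(-a * (theta - b))),
    where theta is ability, b difficulty, a discrimination and
    c the guessing floor. a = 1, c = 0 reduces to the 1PL (Rasch)
    model; c = 0 with free a is the 2PL."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

theta, b = 0.0, 0.0
print(p_correct(theta, b))                 # 1PL (Rasch): 0.5 at theta = b
print(p_correct(theta, b, a=1.7))          # 2PL: still 0.5 at theta = b
print(p_correct(theta, b, a=1.7, c=0.2))   # 3PL: ≈ 0.6 (guessing raises the floor)
```

The guessing parameter c sets the lower asymptote: even a very low-ability test-taker answers correctly with probability approaching c.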
Matrix model as a mirror of Chern-Simons theory
International Nuclear Information System (INIS)
Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun
2004-01-01
Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)
A QCD Model Using Generalized Yang-Mills Theory
International Nuclear Information System (INIS)
Wang Dianfu; Song Heshan; Kou Lina
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.
Testing the non-locality of quantum theory in two-kaon systems
Energy Technology Data Exchange (ETDEWEB)
Eberhard, P.H. (California Univ., Berkeley (United States). Lawrence Berkeley Lab.)
1993-06-07
An idea for testing the non-local character of quantum theory in systems made of two neutral kaons is suggested. Such tests require detecting two long-lived or two short-lived neutral kaons in coincidence, when copper slabs are either interposed on or removed from their paths. They may be performed at an asymmetric Φ⁰-factory. They could answer some questions raised by the EPR paradox and Bell's inequalities. If such tests are performed and if predictions of quantum mechanics and the standard theory of kaon regeneration are verified experimentally, all descriptions of the relevant phenomena in terms of local interactions will be ruled out in principle, with the exception of very peculiar ones, which imply the existence of hidden variables, of different kinds of kaons corresponding to different values of the hidden variables, and, for some of these kaons, of regeneration probabilities enhanced by a factor of the order of 400 or more over the average. Of course, the experiment may also reveal a breakdown of quantum theory. (orig.)
Comment on a proposed "crucial experiment" to test Einstein's special theory of relativity
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, Jr, W A [Universidade Estadual de Campinas (Brazil); Buonamano, V [Universidade Estadual de Campinas (Brazil). Instituto de Matematica
1976-08-11
A proposed "crucial experiment" to test Einstein's special theory of relativity is analysed and it is shown that it falls into the set of unsatisfactory proposals that attempt to make an experimental distinction between Einstein's special theory of relativity and a "Lorentzian type" special theory of relativity.
A conceptual model of social entrepreneurial intention based on the social cognitive career theory
Directory of Open Access Journals (Sweden)
Anh T.P. Tran
2017-01-01
Purpose - Entrepreneurial intention plays a major role in entrepreneurship academia and practice. However, little is known about the intentions of entrepreneurs in the social area of venture creation. This paper aims to formulate a well-organized model of social entrepreneurial intention. Design/methodology/approach - The paper draws on intention models in entrepreneurship literature in general and social entrepreneurship in particular to identify gaps. Based on these findings, a new conceptual model is formulated. Findings - No research has been found that uses the social cognitive career theory (SCCT) to explain an individual's intention to become a social entrepreneur, although this theory has recently been suggested as an inclusive framework for entrepreneurial intention (Doan Winkel et al., 2011). It is also supported by the empirical research of Segal et al. (2002). Therefore, a conceptual model of entrepreneurial intention in the field of social entrepreneurship is formulated based on adapting and extending the SCCT. Originality/value - The paper contributes to the social entrepreneurship literature by providing new insights about social entrepreneurial intention. The result has important implications for theory and practice. In theory, it is the first model offering the SCCT as the basis for the formation of social entrepreneurial intention, with a distinct perspective on social entrepreneurship as a career. It raises a future direction for researchers to test this model. In practice, this framework provides a broad view of factors that could contribute to the success of the would-be social entrepreneur.
Acoustic results of the Boeing model 360 whirl tower test
Watts, Michael E.; Jordan, David
1990-09-01
An evaluation is presented of whirl tower test results for the Model 360 helicopter's advanced, high-performance four-bladed composite rotor system, intended to facilitate over-200-knot flight. During these performance measurements, acoustic data were acquired by seven microphones. A comparison of whirl-tower tests with theory indicates that theoretical prediction accuracies vary with both microphone position and the inclusion of ground reflection. Prediction errors varied from 0 to 40 percent of the measured signal-to-peak amplitude.
Disrupted cortical connectivity theory as an explanatory model for autism spectrum disorders.
Kana, Rajesh K; Libero, Lauren E; Moore, Marie S
2011-12-01
Recent findings of neurological functioning in autism spectrum disorder (ASD) point to altered brain connectivity as a key feature of its pathophysiology. The cortical underconnectivity theory of ASD (Just et al., 2004) provides an integrated framework for addressing these new findings. This theory suggests that weaker functional connections among brain areas in those with ASD hamper their ability to accomplish complex cognitive and social tasks successfully. We will discuss this theory, but will modify the term underconnectivity to 'disrupted cortical connectivity' to capture patterns of both under- and over-connectivity in the brain. In this paper, we will review the existing literature on ASD to marshal supporting evidence for hypotheses formulated on the disrupted cortical connectivity theory. These hypotheses are: 1) underconnectivity in ASD is manifested mainly in long-distance cortical as well as subcortical connections rather than in short-distance cortical connections; 2) underconnectivity in ASD is manifested only in complex cognitive and social functions and not in low-level sensory and perceptual tasks; 3) functional underconnectivity in ASD may be the result of underlying anatomical abnormalities, such as problems in the integrity of white matter; 4) the ASD brain adapts to underconnectivity through compensatory strategies such as overconnectivity mainly in frontal and in posterior brain areas. This may be manifested as deficits in tasks that require frontal-parietal integration. While overconnectivity can be tested by examining the cortical minicolumn organization, long-distance underconnectivity can be tested by cognitively demanding tasks; and 5) functional underconnectivity in brain areas in ASD will be seen not only during complex tasks but also during task-free resting states. We will also discuss some empirical predictions that can be tested in future studies, such as: 1) how disrupted connectivity relates to cognitive impairments in skills such
The window of opportunity: decision theory and the timing of prognostic tests for newborn infants.
Wilkinson, Dominic
2009-11-01
In many forms of severe acute brain injury there is an early phase when prognosis is uncertain, followed later by physiological recovery and the possibility of more certain predictions of future impairment. There may be a window of opportunity for withdrawal of life support early, but if decisions are delayed there is the risk that the patient will survive with severe impairment. In this paper I focus on the example of neonatal encephalopathy and the question of the timing of prognostic tests and decisions to continue or to withdraw life-sustaining treatment. Should testing be performed early or later; and how should parents decide what to do given the conflicting values at stake? I apply decision theory to the problem, using sensitivity analysis to assess how different features of the tests or different values would affect a decision to perform early or late prognostic testing. I draw some general conclusions from this model for decisions about the timing of testing in neonatal encephalopathy. Finally I consider possible solutions to the problem posed by the window of opportunity. Decision theory highlights the costs of uncertainty. This may prompt further research into improving prognostic tests. But it may also prompt us to reconsider our current attitudes towards the palliative care of newborn infants predicted to be severely impaired.
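The decision-theoretic comparison described above can be made concrete with a small expected-utility calculation. This is a generic sketch of the method, not the paper's actual model: the prevalence, test accuracies and utility values are all assumptions, and a full treatment would also include the cost of delay that creates the window-of-opportunity problem.

```python
def expected_utility(p_impair, sens, spec, u):
    """Expected utility of the policy 'withdraw life support iff the
    prognostic test is positive', given the prior probability of severe
    impairment and the test's sensitivity and specificity."""
    tp = p_impair * sens              # impaired, test positive  -> withdraw
    fn = p_impair * (1 - sens)        # impaired, test negative  -> continue
    fp = (1 - p_impair) * (1 - spec)  # unimpaired, false positive -> withdraw
    tn = (1 - p_impair) * spec        # unimpaired, true negative  -> continue
    return (tp * u["withdraw_impaired"] + fn * u["continue_impaired"]
            + fp * u["withdraw_unimpaired"] + tn * u["continue_unimpaired"])

# illustrative utilities: a false-positive withdrawal is weighted worst
u = {"withdraw_impaired": 0.0, "continue_impaired": -1.0,
     "withdraw_unimpaired": -10.0, "continue_unimpaired": 1.0}
early = expected_utility(0.3, sens=0.7, spec=0.9, u=u)    # early, less accurate test
late = expected_utility(0.3, sens=0.9, spec=0.99, u=u)    # late, more accurate test
print(early, late)  # with these assumptions the later test scores higher
```

Varying the utilities and accuracies, as in the paper's sensitivity analysis, shows how different value weightings tip the decision between early and late testing.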
Chakravarty, G. K.; Mohanty, S.; Lambiase, G.
theories when applied to inflation (a rapid expansion of the early universe in which primordial gravitational waves might be generated and might still be detectable by the imprint they left or by the ripples that persist today) can have distinct signatures in the Cosmic Microwave Background radiation temperature and polarization anisotropies. We give a review of ΛCDM cosmology and survey the theories of gravity beyond Einstein's General Relativity, especially those which arise from SUGRA, and study the consequences of these theories in the context of inflation, putting bounds on the theories and their parameters from observational experiments like PLANCK, Keck/BICEP, etc. The possibility of testing these theories in the near future in CMB observations, together with new data coming from colliders like the LHC, provides a unique opportunity for constructing verifiable models of particle physics and General Relativity.
Attitudes and exercise adherence: test of the Theories of Reasoned Action and Planned Behaviour.
Smith, R A; Biddle, S J
1999-04-01
Three studies of exercise adherence and attitudes are reported that tested the Theory of Reasoned Action and the Theory of Planned Behaviour. In a prospective study of adherence to a private fitness club, structural equation modelling path analysis showed that attitudinal and social normative components of the Theory of Reasoned Action accounted for 13.1% of the variance in adherence 4 months later, although only social norm significantly predicted intention. In a second study, the Theory of Planned Behaviour was used to predict both physical activity and sedentary behaviour. Path analyses showed that attitude and perceived control, but not social norm, predicted total physical activity. Physical activity was predicted from intentions and control over sedentary behaviour. Finally, an intervention study with previously sedentary adults showed that intentions to be active measured at the start and end of a 10-week intervention were associated with the planned behaviour variables. A multivariate analysis of variance revealed no significant multivariate effects for time on the planned behaviour variables measured before and after intervention. Qualitative data provided evidence that participants had a positive experience on the intervention programme and supported the role of social normative factors in the adherence process.
Analysis of North Korea's Nuclear Tests under Prospect Theory
Energy Technology Data Exchange (ETDEWEB)
Lee, Han Myung; Ryu, Jae Soo; Lee, Kwang Seok; Lee, Dong Hoon; Jun, Eunju; Kim, Mi Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2013-10-15
North Korea has chosen nuclear weapons as the means to protect its sovereignty. Despite international society's endeavors and sanctions to encourage North Korea to abandon its nuclear ambition, North Korea has repeatedly conducted nuclear testing. In this paper, the reason for North Korea's addiction to a nuclear arsenal is addressed within the framework of cognitive psychology. Prospect theory offers an epistemological approach usually overlooked in rational choice theories. It provides useful insight into why North Korea, under a crisis situation, discarded a stable choice and instead took a risky one such as nuclear testing. Under the viewpoint of prospect theory, nuclear tests by North Korea can be understood as follows: the first nuclear test in 2006 is seen as a trial to escape from loss areas such as financial sanctions and regime threats; the second test in 2009 was interpreted as a consequence of the strategy to recover losses by making a direct confrontation against the United States; and the third test in 2013 was understood as an attempt to strengthen internal solidarity after Kim Jong-eun inherited the dynasty, as well as to enhance bargaining power against the United States. Thus, it can be summarized that Pyongyang repeated its nuclear tests to escape from a negative domain and to settle into a positive one. In addition, in the future, North Korea may not be willing to readily give up its nuclear capabilities to ensure the survival of its own regime.
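The risk-seeking-in-losses logic invoked above comes from the prospect-theory value function. The sketch below uses the standard Kahneman-Tversky functional form with the parameter estimates from Tversky and Kahneman (1992); the payoff numbers are purely illustrative and have nothing to do with the paper's case study.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains,
    convex and steeper (loss-averse, lam > 1) for losses."""
    return x**alpha if x >= 0 else -lam * (-x)**beta

# In the loss domain, a risky gamble can look better than a sure loss:
sure_loss = value(-50)                       # value of losing 50 for certain
gamble = 0.5 * value(-100) + 0.5 * value(0)  # 50/50 lose 100 or lose nothing
print(gamble > sure_loss)  # True: risk-seeking over losses
```

This is the mechanism the abstract appeals to: an actor who frames its situation as a loss domain prefers the gamble (a risky provocation) over accepting a certain loss.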
Theory and experiment in gravitational physics
Will, C. M.
New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experiment. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and exogeneity in the economic sense corresponds to exogeneity in the econometric sense. Further correspondences are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
Reconstructing bidimensional scalar field theory models
International Nuclear Information System (INIS)
Flores, Gabriel H.; Svaiter, N.F.
2001-07-01
In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials, the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)
Directory of Open Access Journals (Sweden)
Wolf L Eiserhardt
Water and energy have emerged as the best contemporary environmental correlates of broad-scale species richness patterns. A corollary hypothesis of water-energy dynamics theory is that the influence of water decreases and the influence of energy increases with absolute latitude. We report the first use of geographically weighted regression for testing this hypothesis on a continuous species richness gradient that is entirely located within the tropics and subtropics. The dataset was divided into northern and southern hemispheric portions to test whether predictor shifts are more pronounced in the less oceanic northern hemisphere. American palms (Arecaceae, n = 547 spp.), whose species richness and distributions are known to respond strongly to water and energy, were used as a model group. The ability of water and energy to explain palm species richness was quantified locally at different spatial scales and regressed on latitude. Clear latitudinal trends in agreement with water-energy dynamics theory were found, but the results did not differ qualitatively between hemispheres. Strong inherent spatial autocorrelation in local modeling results and collinearity of water and energy variables were identified as important methodological challenges. We overcame these problems by using simultaneous autoregressive models and variation partitioning. Our results show that the ability of water and energy to explain species richness changes not only across large climatic gradients spanning tropical to temperate or arctic zones but also within megathermal climates, at least for strictly tropical taxa such as palms. This finding suggests that the predictor shifts are related to gradual latitudinal changes in ambient energy (related to solar flux input) rather than to abrupt transitions at specific latitudes, such as the occurrence of frost.
International Nuclear Information System (INIS)
Klaassen, Ger; Nentjes, Andries; Smith, Mark
2005-01-01
Simulation models and theory prove that emission trading converges to market equilibrium. This paper sets out to test these results using experimental economics. Three experiments are conducted for the six largest carbon-emitting industrialized regions. Two experiments use auctions, the first a single-bid auction and the second a Walrasian auction. The third relies on bilateral, sequential trading. The paper finds that, in line with the standard theory, both auctions and bilateral, sequential trading capture a significant part (88% to 99%) of the potential cost savings of emission trading. As expected from trade theory, all experiments show that the market price converges (although not fully) to the market equilibrium price. In contrast to the theory, the results also suggest that not every country might gain from trading. In both the bilateral trading experiment and the Walrasian auction, one country actually is worse off with trade. In particular, bilateral, sequential trading leads to a distribution of gains significantly different from the competitive market outcome. This is due to speculative behavior, imperfect foresight and market power.
Violato, Claudio; Gao, Hong; O'Brien, Mary Claire; Grier, David; Shen, E.
2018-01-01
The distinction between basic sciences and clinical knowledge which has led to a theoretical debate on how medical expertise is developed has implications for medical school and lifelong medical education. This longitudinal, population based observational study was conducted to test the fit of three theories--knowledge encapsulation, independent…
Turesson, Martin; Szparaga, Ryan; Ma, Ke; Woodward, Clifford E; Forsman, Jan
2014-05-14
A new classical density functional approach is developed to accurately treat a coarse-grained model of room temperature aromatic ionic liquids. Our major innovation is the introduction of charge-charge correlations, which are treated in a simple phenomenological way. We test this theory on a generic coarse-grained model for aromatic RTILs with oligomeric forms for both cations and anions, approximating 1-alkyl-3-methyl imidazoliums and BF₄⁻, respectively. We find that predictions by the new density functional theory for fluid structures at charged surfaces are very accurate, as compared with molecular dynamics simulations, across a range of surface charge densities and lengths of the alkyl chain. Predictions of interactions between charged surfaces are also presented.
Two-matrix models and c = 1 string theory
International Nuclear Information System (INIS)
Bonora, L.; Xiong Chuansheng
1994-05-01
We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely, we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)
Modeling in applied sciences a kinetic theory approach
Pulvirenti, Mario
2000-01-01
Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develops the foundations of kinetic models and discusses the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology of the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...
Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory
International Nuclear Information System (INIS)
Chung, S.; Tye, S.H.
1993-01-01
The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory.
Toward a General Research Process for Using Dubin's Theory Building Model
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
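The adaptive-testing machinery the abstract refers to can be illustrated with a minimal sketch of a two-parameter logistic (2PL) IRT model and maximum-information item selection. The item bank and parameter values below are invented for illustration and are not taken from the melodic discrimination test:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct response
    for ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def select_next_item(theta, items, administered):
    """Adaptive selection: pick the unadministered item that is
    most informative at the current ability estimate."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))

# Hypothetical item bank: (discrimination a, difficulty b) pairs.
bank = [(1.0, -1.0), (1.5, 0.0), (0.8, 1.0)]
best = select_next_item(0.0, bank, administered=set())  # index 1
```

With this toy bank, the item with high discrimination and difficulty near the current θ estimate is selected, which is the mechanism that makes adaptive tests efficient.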
Testing linear growth rate formulas of non-scale endogenous growth models
Ziesemer, Thomas
2017-01-01
Endogenous growth theory has produced formulas for steady-state growth rates of income per capita which are linear in the growth rate of the population. Depending on the details of the models, slopes and intercepts are positive, zero or negative. Empirical tests have taken over the assumption of
CERN. Geneva
2002-01-01
A theory with such mathematical beauty cannot be wrong: this is one of the main arguments in favour of string theory, which unifies all known physical theories of fundamental interactions in a single coherent description of the universe. But no one has ever observed strings, not even indirectly, nor the space of extra dimensions where they live. However, there are good reasons to believe that the 'hidden' dimensions of string theory may be much larger than what we thought in the past and that they may be within experimental reach in the near future - together with the strings themselves. In my talk, I will give an elementary introduction to string theory and describe the main experimental predictions.
Crisis in Context Theory: An Ecological Model
Myer, Rick A.; Moore, Holly B.
2006-01-01
This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…
Consensual decision-making model based on game theory for LNG processes
International Nuclear Information System (INIS)
Castillo, Luis; Dorao, Carlos A.
2012-01-01
Highlights: ► A Decision-Making (DM) approach for LNG projects based on game theory is presented. ► The DM framework was tested with two different cases, using analytical models and a simple LNG process. ► The problems were solved by using a Genetic Algorithm (GA) with binary coding and Nash-GA. ► Integrating market models with the design and optimization of the process could yield more realistic outcomes. ► The major challenge in such a framework is related to the uncertainties in the market models. - Abstract: Decision-Making (DM) in LNG projects is a quite complex process due to the number of actors, approval phases, large investments and capital return over the long term. Furthermore, due to the very high investment of an LNG project, a detailed and efficient DM process is required in order to minimize risks. In this work a Decision-Making (DM) approach for LNG projects is presented. The approach is based on a consensus algorithm that drives the outputs toward a common value using cost functions, within a framework based on game theory. The DM framework was tested with two different cases. The first case was used for evaluating the performance of the framework with analytical models, while the second corresponds to a simple LNG process. The problems were solved by using a Genetic Algorithm (GA) with binary coding and Nash-GA. The results of the DM framework in the LNG project indicate that by considering an integrated DM model and including the market's role from the design and optimization of the process onward, a more realistic outcome could be obtained. However, the major challenge in such a framework is related to the uncertainties in the market models.
Chern-Simons Theory, Matrix Models, and Topological Strings
International Nuclear Information System (INIS)
Walcher, J
2006-01-01
This book is a find. Mariño meets the challenge of filling in less than 200 pages the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may
Feeny, David; Eng, Ken
2005-01-01
Prospect theory (PT) hypothesizes that people judge states relative to a reference point, usually assumed to be their current health. States better than the reference point are valued on a concave portion of the utility function; worse states are valued on a convex portion. Using prospectively collected utility scores, the objective is to test empirically implications of PT. Osteoarthritis (OA) patients undergoing total hip arthroplasty periodically provided standard gamble scores for three OA hypothetical states describing mild, moderate, and severe OA as well as their subjectively defined current state (SDCS). Our hypothesis was that most patients improved between the pre- and postsurgery assessments. According to PT, scores for hypothetical states previously > SDCS but now < SDCS should be lower at the postsurgery assessment. Fourteen patients met the criteria for testing the hypothesis. Predictions were confirmed for 0 patients; there was no change or mixed results for 6 patients (42.9 percent); and scores moved in the direction opposite to that predicted by PT for 8 patients (57.1 percent). In general, the direction and magnitude of the changes in hypothetical-state scores do not conform to the predictions of PT.
P value and the theory of hypothesis testing: an explanation for new researchers.
Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël
2010-03-01
In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
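The two quantities the abstract contrasts can be computed side by side. This is a minimal sketch for a two-sided z-test, not code from the paper:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sided_p(z):
    """Fisher's p value: probability of an effect at least as
    extreme as the observed z, presuming H0 is true."""
    return 2.0 * (1.0 - normal_cdf(abs(z)))

def np_decision(z, z_crit=1.959963984540054):
    """Neyman-Pearson test: reject H0 iff z falls in the critical
    region (here the two-sided 5% region, |z| > 1.96)."""
    return abs(z) > z_crit

z = 2.4
p = two_sided_p(z)       # a continuous measure of evidence against H0
reject = np_decision(z)  # a binary accept/reject decision
```

The same observed statistic yields a graded evidence measure under Fisher's framework and a yes/no decision under Neyman-Pearson's, which is the distinction the paper urges researchers to keep straight.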
Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D
2016-08-01
Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
Introduction to conformal field theory. With applications to string theory
International Nuclear Information System (INIS)
Blumenhagen, Ralph; Plauschinn, Erik
2009-01-01
Based on class-tested notes, this text offers an introduction to Conformal Field Theory with a special emphasis on computational techniques of relevance for String Theory. It introduces Conformal Field Theory at a basic level, Kac-Moody algebras, one-loop partition functions, Superconformal Field Theories, Gepner Models and Boundary Conformal Field Theory. Eventually, the concept of orientifold constructions is explained in detail for the example of the bosonic string. In providing many detailed CFT calculations, this book is ideal for students and scientists intending to become acquainted with CFT techniques relevant for string theory but also for students and non-specialists from related fields. (orig.)
Basen-Engquist, Karen; Carmack, Cindy L; Perkins, Heidi; Hughes, Daniel; Serice, Susan; Scruggs, Stacie; Pinto, Bernardine; Waters, Andrew
2011-01-01
Physical activity has been shown to benefit cancer survivors' physical functioning, emotional well-being, and symptoms. Physical activity may be of particular benefit to survivors of endometrial cancer because they are more likely to be obese and sedentary than the general population, as these are risk factors for the disease, and thus experience a number of related co-morbid health problems. However, there is little research systematically studying mechanisms of physical activity adherence in cancer survivor populations. This paper describes the design of the Steps to Health study, which applies a Social Cognitive Theory-based model of endometrial cancer survivors' adoption and maintenance of exercise in the context of an intervention to increase walking or other moderate intensity cardiovascular activity. In Steps to Health we will test the influence of self-efficacy and outcome expectations on adherence to exercise recommendations, as well as studying the determinants of self-efficacy. Endometrial cancer survivors who are at least 6 months post-treatment are provided with an intervention involving print materials and telephone counseling, and complete assessments of fitness, activity, self-efficacy and outcome expectations, and determinants of self-efficacy every two months for a six month period. In addition to testing an innovative model, the Steps to Health study employs multiple assessment methods, including ecological momentary assessment, implicit tests of cognitive variables, and ambulatory monitoring of physical activity. The study results can be used to develop more effective interventions for increasing physical activity in sedentary cancer survivors by taking into account the full complement of sources of self-efficacy information and outcome expectations.
Measuring and modeling salience with the theory of visual attention.
Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid
2017-08-01
For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
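The power-function model of salience growth can be sketched with a toy least-squares fit. The data points and candidate parameter grids below are fabricated for illustration; the actual study estimated salience via a TVA parameter and compared models with Bayesian methods:

```python
def power_salience(contrast, k, a):
    """Power-function model: salience grows as k * contrast**a."""
    return k * contrast ** a

def fit_power(data, ks, exps):
    """Grid-search least-squares fit over candidate (k, a) pairs."""
    best, best_sse = None, float("inf")
    for k in ks:
        for a in exps:
            sse = sum((s - power_salience(c, k, a)) ** 2 for c, s in data)
            if sse < best_sse:
                best, best_sse = (k, a), sse
    return best

# Hypothetical (contrast, salience) observations generated by s = 2 * c**0.5
data = [(0.1, 0.632), (0.4, 1.265), (0.9, 1.897)]
k, a = fit_power(data, ks=[1.0, 2.0, 3.0], exps=[0.25, 0.5, 1.0])
```

The fit recovers the generating exponent, illustrating the compressive (a < 1) growth of salience with feature contrast that the study reports for both orientation and luminance.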
International Nuclear Information System (INIS)
Navratil, P.; Dobes, J.
1992-01-01
Methods of boson mapping are tested in calculations for a simple model system of four protons and four neutrons in single-j distinguishable orbits. Two-body terms in the boson images of the fermion operators are considered. Effects of the seniority v=4 states are thus included. The treatment of unphysical states and the influence of boson space truncation are particularly studied. Both the Dyson boson mapping and the seniority boson mapping as dictated by the similarity transformed Dyson mapping do not seem to be simply amenable to truncation. This situation improves when the one-body form of the seniority image of the quadrupole operator is employed. Truncation of the boson space is addressed by using the effective operator theory with a notable improvement of results
Money Buys Financial Security and Psychological Need Satisfaction: Testing Need Theory in Affluence
Howell, Ryan T.; Kurai, Mark; Tam, Leona
2013-01-01
The most prominent theory to explain the curvilinear relationship between income and subjective well-being (SWB) is need theory, which proposes that increased income and wealth can lead to increased well-being in poverty because money is used to satisfy basic physiological needs. The present study tests the tenets of need theory by proposing that…
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.
Cognitive-behavioural theories of helplessness/hopelessness: valid models of depression?
Henkel, V; Bussfeld, P; Möller, H-J; Hegerl, U
2002-10-01
Helplessness and hopelessness are central aspects of cognitive-behavioural explanations for the development and persistence of depression. In this article a general overview concerning the evolution of those approaches to depression is provided, including a critical examination of the theories. The review of the literature suggests that those cognitive models describing helplessness/hopelessness as trait factors mediating depression do not really have a strong empirical base. The majority of those studies had been conducted in healthy or only mildly depressed subjects. Thus, there seems to be little justification for broad generalisations beyond the populations studied. It seems that some of the reported studies have not tested the underlying theories adequately (e.g. correlation had sometimes been interpreted as causation; adequate prospective longitudinal study designs had seldom been applied). Moreover, the theoretical models are not generally prepared to explain all depressive features (e.g. the possibility of a spontaneous switch into a manic episode). Despite those limitations, there is a relevant impact of the learned helplessness paradigm on preclinical research in neurobiological correlates of depressive states. Last but not least, the models are of high interest with respect to the theoretical background of important modules of cognitive-behavioural therapy and its acute and prophylactic effects.
Tests of Cumulative Prospect Theory with graphical displays of probability
Directory of Open Access Journals (Sweden)
Michael H. Birnbaum
2008-10-01
Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).
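The two editing principles under test, combination and cancellation, are mechanical enough to sketch directly. The gamble representation below (a list of probability-outcome branches) is an illustrative assumption, not the study's stimuli:

```python
def combine(gamble):
    """Combination: merge branches leading to the same consequence
    by adding their probabilities."""
    merged = {}
    for prob, outcome in gamble:
        merged[outcome] = merged.get(outcome, 0.0) + prob
    return sorted((p, x) for x, p in merged.items())

def cancel(gamble_a, gamble_b):
    """Cancellation: delete (probability, outcome) branches that are
    identical in both alternatives before comparing them."""
    common = set(gamble_a) & set(gamble_b)
    return ([br for br in gamble_a if br not in common],
            [br for br in gamble_b if br not in common])

# Branches are (probability, outcome) pairs.
g = [(0.25, 100), (0.25, 100), (0.5, 0)]
edited = combine(g)  # the two 100-branches merge into one
```

The study's graphical formats were designed so that participants could perform exactly these merges and deletions visually; the data nevertheless continued to violate the principles.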
Applications of decision theory to test-based decision making
van der Linden, Willem J.
1987-01-01
The use of Bayesian decision theory to solve problems in test-based decision making is discussed. Four basic decision problems are distinguished: (1) selection; (2) mastery; (3) placement; and (4) classification, the situation where each treatment has its own criterion. Each type of decision can be
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the econo... Parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.
Cosmological consistency tests of gravity theory and cosmic acceleration
Ishak-Boushaki, Mustapha B.
2017-01-01
Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.
Optimal and Most Exact Confidence Intervals for Person Parameters in Item Response Theory Models
Doebler, Anna; Doebler, Philipp; Holling, Heinz
2013-01-01
The common way to calculate confidence intervals for item response theory models is to assume that the standardized maximum likelihood estimator for the person parameter θ is normally distributed. However, this approximation is often inadequate for short and medium test lengths. As a result, the coverage probabilities fall below the given…
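The normal-approximation (Wald) interval that the abstract criticizes can be sketched for a simple Rasch model. The response pattern and item difficulties below are invented for illustration:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def ml_theta(responses, difficulties, iters=50):
    """Newton-Raphson maximum likelihood estimate of theta.
    Returns the estimate and the test information at it.
    (Diverges for all-correct or all-wrong patterns.)"""
    theta = 0.0
    for _ in range(iters):
        ps = [rasch_p(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, ps))
        info = sum(p * (1.0 - p) for p in ps)
        theta += grad / info
    return theta, info

def wald_ci(theta, info, z=1.96):
    """Normal-approximation CI: theta_hat +/- z / sqrt(information).
    For short tests its coverage can fall below the nominal level,
    which is the problem the paper addresses."""
    se = 1.0 / math.sqrt(info)
    return theta - z * se, theta + z * se

responses = [1, 1, 0, 1, 0]
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
theta_hat, info = ml_theta(responses, difficulties)
lo, hi = wald_ci(theta_hat, info)
```

Note how wide the interval is for a five-item test: the information is small, the standard error large, and the normality assumption behind the interval is at its weakest.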
Thomas, Michael L
2012-03-01
There is growing evidence that psychiatric disorders maintain hierarchical associations where general and domain-specific factors play prominent roles (see D. Watson, 2005). Standard, unidimensional measurement models can fail to capture the meaningful nuances of such complex latent variable structures. The present study examined the ability of the multidimensional item response theory bifactor model (see R. D. Gibbons & D. R. Hedeker, 1992) to improve construct validity by serving as a bridge between measurement and clinical theories. Archival data consisting of 688 outpatients' psychiatric diagnoses and item-level responses to the Brief Symptom Inventory (BSI; L. R. Derogatis, 1993) were extracted from files at a university mental health clinic. The bifactor model demonstrated superior fit for the internal structure of the BSI and improved overall diagnostic accuracy in the sample (73%) compared with unidimensional (61%) and oblique simple structure (65%) models. Consistent with clinical theory, multiple sources of item variance were drawn from individual test items. Test developers and clinical researchers are encouraged to consider model-based measurement in the assessment of psychiatric distress.
Noar, Seth M; Myrick, Jessica Gall; Zeitany, Alexandra; Kelley, Dannielle; Morales-Pico, Brenda; Thomas, Nancy E
2015-01-01
The lack of a theory-based understanding of indoor tanning is a major impediment to the development of effective messages to prevent or reduce this behavior. This study applied the Comprehensive Indoor Tanning Expectations (CITE) scale in an analysis of indoor tanning behavior among sorority women (total N = 775). Confirmatory factor analyses indicated that CITE positive and negative expectations were robust, multidimensional factors and that a hierarchical structure fit the data well. Social cognitive theory-based structural equation models demonstrated that appearance-oriented variables were significantly associated with outcome expectations. Outcome expectations were, in turn, significantly associated with temptations to tan, intention to tan indoors, and indoor tanning behavior. The implications of these findings for the development of messages to prevent and reduce indoor tanning behavior are discussed in two domains: (a) messages that attempt to change broader societal perceptions about tan skin, and (b) messages that focus more narrowly on indoor tanning-challenging positive expectations, enhancing negative expectations, and encouraging substitution of sunless tanning products.
Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.
Lei, Wenwen; McKenzie, David R
2016-07-21
Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.
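As a rough companion to the low-humidity regime described above, the interdiffusive water-vapor flow through a single channel can be estimated from Fick's first law. All numbers below (diffusivity, membrane thickness, humidity difference) are generic assumptions for illustration, not the paper's measured values.

```python
import math

# Fickian estimate of water-vapor leakage through one cylindrical
# nanochannel; every number here is an illustrative assumption.
D = 2.5e-5        # diffusivity of water vapor in air, m^2/s (~25 degC)
L = 60e-6         # channel length = membrane thickness, m (assumed)
d = 100e-9        # channel diameter, m (upper end of the studied range)
A = math.pi * (d / 2.0) ** 2          # channel cross-section, m^2

# Saturation vapor concentration at ~25 degC is about 23 g/m^3; assume a
# 100% -> 0% relative-humidity difference across the membrane.
delta_c = 23e-3   # kg/m^3

Q = D * A * delta_c / L               # Fick's first law: kg/s per channel
print(f"{Q:.1e} kg/s per channel")
```

The resulting per-channel rate is tiny, which is why such leaks are measured over large channel arrays or long times.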
Test of virtual photon theory. [Cross sections, square-well potential, 0 to 300 MeV]
Energy Technology Data Exchange (ETDEWEB)
Dressler, E T; Tomusiak, E L [Saskatchewan Univ., Saskatoon (Canada). Saskatchewan Accelerator Lab.
1976-11-30
In order to extract a photodisintegration cross section in the c.m. system from an electrodisintegration cross section measured in the lab system, one has to assume that the electrons are mostly scattered forward and that the monopole contributions are negligible. To test the validity of these approximations, a square well potential was assumed for the initial and final states and the photodisintegration cross section was calculated exactly within this model. These results were then compared with the results one would obtain using the virtual photon theory approximations for electron energies of 0-300 MeV and excitation energies up to 40 MeV. In comparing the two results, it is shown how and why the virtual photon theory approximations fail in certain kinematical regions.
Radial distributions of arm-gas offsets as an observational test of spiral theories
Baba, Junichi; Morokuma-Matsui, Kana; Egusa, Fumi
2015-01-01
Theories of stellar spiral arms in disk galaxies can be grouped into two classes based on the longevity of a spiral arm. Although the quasi-stationary density wave theory supposes that spirals are rigidly-rotating, long-lived patterns, the dynamic spiral theory predicts that spirals are differentially-rotating, transient, recurrent patterns. In order to distinguish between the two spiral models from observations, we performed hydrodynamic simulations with steady and dynamic spiral models. Hyd...
Matthews, Russell A; Wayne, Julie Holliday; Ford, Michael T
2014-11-01
In the present study, we examine competing predictions of stress reaction models and adaptation theories regarding the longitudinal relationship between work-family conflict and subjective well-being. Based on data from 432 participants over 3 time points with 2 lags of varying lengths (i.e., 1 month, 6 months), our findings suggest that in the short term, consistent with prior theory and research, work-family conflict is associated with poorer subjective well-being. Counter to traditional work-family predictions but consistent with adaptation theories, after accounting for concurrent levels of work-family conflict as well as past levels of subjective well-being, past exposure to work-family conflict was associated with higher levels of subjective well-being over time. Moreover, evidence was found for reverse causation in that greater subjective well-being at 1 point in time was associated with reduced work-family conflict at a subsequent point in time. Finally, the pattern of results did not vary as a function of using different temporal lags. We discuss the theoretical, research, and practical implications of our findings. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
The cross-national pattern of happiness. Test of predictions implied in three theories of happiness
R. Veenhoven (Ruut); J.J. Ehrhardt (Joop)
1995-01-01
ABSTRACT. Predictions about level and dispersion of happiness in nations are derived from three theories of happiness: comparison-theory, folklore-theory and livability-theory. The predictions are tested on two cross-national data-sets: a comparative survey among university students in
The Standard-Model Extension and Gravitational Tests
Directory of Open Access Journals (Sweden)
Jay D. Tasson
2016-10-01
The Standard-Model Extension (SME) provides a comprehensive effective field-theory framework for the study of CPT and Lorentz symmetry. This work reviews the structure and philosophy of the SME and provides some intuitive examples of symmetry violation. The results of recent gravitational tests performed within the SME are summarized, including analysis of results from the Laser Interferometer Gravitational-Wave Observatory (LIGO), sensitivities achieved in short-range gravity experiments, constraints from cosmic-ray data, and results achieved by studying planetary ephemerides. Some proposals and ongoing efforts will also be considered, including gravimeter tests, tests of the Weak Equivalence Principle, and antimatter experiments. Our review of the above topics is augmented by several original extensions of the relevant work. We present new examples of symmetry violation in the SME and use the cosmic-ray analysis to place first-ever constraints on 81 additional operators.
A Leadership Identity Development Model: Applications from a Grounded Theory
Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.
2006-01-01
This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…
A review of organizational buyer behaviour models and theories ...
African Journals Online (AJOL)
Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...
Integrable theories that are asymptotically CFT
Evans, J M; Jonathan M Evans; Timothy J Hollowood
1995-01-01
A series of sigma models with torsion are analysed which generate their mass dynamically but whose ultra-violet fixed points are non-trivial conformal field theories -- in fact SU(2) WZW models at level k. In contrast to the more familiar situation of asymptotically free theories in which the fixed points are trivial, the sigma models considered here may be termed ``asymptotically CFT''. These theories have previously been conjectured to be quantum integrable; we confirm this by proposing a factorizable S-matrix to describe their infra-red behaviour and then carrying out a stringent test of this proposal. The test involves coupling the theory to a conserved charge and evaluating the response of the free-energy both in perturbation theory to one loop and directly from the S-matrix via the Thermodynamic Bethe Ansatz with a chemical potential at zero temperature. Comparison of these results provides convincing evidence in favour of the proposed S-matrix; it also yields the universal coefficients of the beta-func...
Latent factor modeling of four schizotypy dimensions with theory of mind and empathy.
Directory of Open Access Journals (Sweden)
Jeffrey S Bedwell
Preliminary evidence suggests that theory of mind and empathy relate differentially to factors of schizotypy. The current study assessed 686 undergraduate students and used structural equation modeling to examine links between a four-factor model of schizotypy and performance on measures of theory of mind (Reading the Mind in the Eyes Test [MIE]) and empathy (Interpersonal Reactivity Index [IRI]). Schizotypy was assessed using three self-report measures, which were simultaneously entered into the model. Results revealed that the Negative factor of schizotypy showed a negative relationship with the Empathy factor, which was primarily driven by the Empathic Concern subscale of the IRI and the No Close Friends and Constricted Affect subscales of the Schizotypal Personality Questionnaire. These findings are consistent with a growing body of literature suggesting a relatively specific relationship between negative schizotypy and empathy, and are consistent with several previous studies that found no relationship between MIE performance and schizotypy.
Chen, Zhenhua; Hoffmann, Mark R
2012-07-07
A unitary wave operator, exp(G), with G† = -G, is considered to transform a multiconfigurational reference wave function Φ to the potentially exact, within basis set limit, wave function Ψ = exp(G)Φ. To obtain a useful approximation, the Hausdorff expansion of the similarity transformed effective Hamiltonian, exp(-G)H exp(G), is truncated at second order and the excitation manifold is limited; an additional separate perturbation approximation can also be made. In the perturbation approximation, which we refer to as multireference unitary second-order perturbation theory (MRUPT2), the Hamiltonian operator in the highest order commutator is approximated by a Møller-Plesset-type one-body zero-order Hamiltonian. If a complete active space self-consistent field wave function is used as reference, then the energy is invariant under orbital rotations within the inactive, active, and virtual orbital subspaces for both the second-order unitary coupled cluster method and its perturbative approximation. Furthermore, the redundancies of the excitation operators are addressed in a novel way, which is potentially more efficient compared to the usual full diagonalization of the metric of the excited configurations. Despite the loss of rigorous size-extensivity, possibly due to the use of a variational approach rather than a projective one in the solution of the amplitudes, test calculations show that the size-extensivity errors are very small. Compared to other internally contracted multireference perturbation theories, MRUPT2 only needs reduced density matrices up to three-body, even with a non-complete active space reference wave function, when two-body excitations within the active orbital subspace are involved in the wave operator, exp(G). Both the coupled cluster and perturbation theory variants are amenable to large, incomplete model spaces. Applications to some widely studied model systems that can be problematic because of geometry dependent quasidegeneracy, H4, P4
Topos models for physics and topos theory
International Nuclear Information System (INIS)
Wolters, Sander
2014-01-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos
Finite Unification: Theory, Models and Predictions
Heinemeyer, S; Zoupanos, G
2011-01-01
All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...
Scattering and short-distance properties in field theory models
International Nuclear Information System (INIS)
Iagolnitzer, D.
1987-01-01
The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap, such as typically P(φ) in dimension 2, φ⁴ in dimension 3, and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non-renormalizable) theories that might be defined in a similar way via phase-space analysis
Testing Self-Determination Theory via Nigerian and Indian Adolescents
Sheldon, Kennon M.; Abad, Neetu; Omoile, Jessica
2009-01-01
We tested the generalizability of five propositions derived from Self-Determination Theory (SDT; Deci & Ryan, 2000) using school-aged adolescents living in India (N = 926) and Nigeria (N = 363). Consistent with past U.S. research, perceived teacher autonomy-support predicted students' basic need-satisfaction in the classroom and also predicted…
Adversarial life testing: A Bayesian negotiation model
International Nuclear Information System (INIS)
Rufo, M.J.; Martín, J.; Pérez, C.J.
2014-01-01
Life testing is a procedure intended for facilitating the process of making decisions in the context of industrial reliability. On the other hand, negotiation is a process of making joint decisions that has one of its main foundations in decision theory. A Bayesian sequential model of negotiation in the context of adversarial life testing is proposed. This model considers a general setting for which a manufacturer offers a product batch to a consumer. It is assumed that the reliability of the product is measured in terms of its lifetime. Furthermore, both the manufacturer and the consumer have to use their own information with respect to the quality of the product. Under these assumptions, two situations can be analyzed. For both of them, the main aim is to accept or reject the product batch based on the product reliability. This topic is related to a reliability demonstration problem. The procedure is applied to a class of distributions that belong to the exponential family. Thus, a unified framework addressing the main topics in the considered Bayesian model is presented. An illustrative example shows that the proposed technique can be easily applied in practice
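A minimal version of the accept/reject decision for exponential lifetimes can be sketched with a conjugate Gamma prior on the failure rate, one member of the exponential family the paper considers. The prior, test data, specification, and acceptance threshold below are all hypothetical.

```python
import random

random.seed(42)

# Conjugate Bayesian acceptance sketch for a life test with exponential
# lifetimes and a Gamma(shape=a0, rate=b0) prior on the failure rate.
# All priors, data, and thresholds are illustrative assumptions.
a0, b0 = 2.0, 2000.0
lifetimes = [950.0, 1200.0, 800.0, 1500.0, 1100.0]  # observed lifetimes (h)

# Posterior for the exponential rate lambda: Gamma(a0 + n, b0 + sum(t)).
a_post = a0 + len(lifetimes)
b_post = b0 + sum(lifetimes)

# The consumer accepts the batch if P(mean lifetime 1/lambda > spec) is
# high enough; estimate that probability by Monte Carlo.
spec = 900.0
draws = [random.gammavariate(a_post, 1.0 / b_post) for _ in range(20000)]
p_accept = sum(1 for lam in draws if 1.0 / lam > spec) / len(draws)

decision = "accept" if p_accept > 0.8 else "reject"
print(round(p_accept, 2), decision)
```

Note that `random.gammavariate` takes a scale parameter, hence the `1.0 / b_post` when the posterior is parameterized by a rate.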
Foundations of compositional model theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim
2011-01-01
Vol. 40, No. 6 (2011), pp. 623-678. ISSN 0308-1079. R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010. Institutional research plan: CEZ:AV0Z10750506. Keywords: multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.667, year: 2011. http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf
A comment on a proposed "crucial experiment" to test Einstein's special theory of relativity
International Nuclear Information System (INIS)
Rodrigues Jr, W.A.; Buonamano, V.
1976-01-01
A proposed "crucial experiment" to test Einstein's special theory of relativity is analysed and it is shown that it falls into the set of unsatisfactory proposals that attempt to make an experimental distinction between Einstein's special theory of relativity and a "Lorentzian type" special theory of relativity
Memory for performance feedback: a test of three self-motivation theories
Donlin, Joanne Mac
1990-01-01
The current study tests the adequacy of three self-motive theories to predict recall of performance feedback, memory sensitivity, and ratings of perceived accuracy. Self-enhancement (Jones, 1973) predicts individuals are motivated to maintain their self-esteem. Individuals will therefore recall positive relative to negative feedback and will rate positive feedback as more accurate. Self-consistency theory (Swann, 1985) predicts individuals are motivated to maintain their self-conceptions. The...
Robust global identifiability theory using potentials--Application to compartmental models.
Wongvanich, N; Hann, C E; Sirisena, H R
2015-04-01
This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3% which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens, to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.
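The core idea above, reformulating the model as input-output equations that are integrals of the measured data so that no differential equation needs to be simulated, can be illustrated on the simplest one-compartment model dx/dt = -kx. Everything below (decay rate, grid, noise level) is an assumed toy setup, not the paper's N2O or mastitis model.

```python
import numpy as np

# Simulation-free ("integral") identification sketch for dx/dt = -k*x.
k_true = 0.35
t = np.linspace(0.0, 10.0, 101)
rng = np.random.default_rng(1)
x = 5.0 * np.exp(-k_true * t) * (1 + 0.03 * rng.standard_normal(t.size))

def cumtrapz(f, grid):
    """Cumulative trapezoidal integral of samples f over grid."""
    out = np.zeros_like(f)
    out[1:] = np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(grid))
    return out

# Integrating the model gives x(t) - x(0) = -k * int_0^t x ds, so the
# unknown k enters linearly through an integral of the DATA and can be
# estimated by least squares without simulating the ODE.
X = -cumtrapz(x, t)
y = x - x[0]
k_hat = float(np.linalg.lstsq(X[:, None], y, rcond=None)[0][0])
print(round(k_hat, 3))
```

Because integration smooths the noise rather than amplifying it (as differentiation would), the estimate stays close to the true rate even with 3% measurement noise.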
A modified Lorentz theory as a test theory of special relativity
Chang, T.; Torr, D. G.; Gagnon, D. R.
1988-01-01
Attention has been given recently to a modified Lorentz theory (MLT) that is based on the generalized Galilean transformation. Some explicit formulas within the framework of MLT, dealing with the one-way velocity of light, slow-clock transport, and the Doppler effect, are derived. A number of typical experiments are analyzed on this basis. Results indicate that the empirical equivalence between MLT and special relativity is still maintained to second-order terms. The results of previous works, which predict that the MLT might be distinguished from special relativity at third order by Doppler centrifuge tests capable of a fractional frequency detection threshold of 10^-15, are confirmed.
A cluster randomized theory-guided oral hygiene trial in adolescents-A latent growth model.
Aleksejūnienė, J; Brukienė, V
2018-05-01
(i) To test whether theory-guided interventions are more effective than conventional dental instruction (CDI) for changing oral hygiene in adolescents and (ii) to examine whether such interventions equally benefit both genders and different socio-economic (SES) groups. A total of 244 adolescents were recruited from three schools, and cluster randomization allocated adolescents to one of three types of interventions: two were theory-based interventions (Precaution Adoption Process Model or Authoritative Parenting Model) and CDI served as an active control. Oral hygiene (OH) levels (%) were assessed at baseline, after 3 months and after 12 months. A complete data set was available for 166 adolescents (total follow-up rate: 69%). There were no significant differences in baseline OH between those who participated throughout the study and those who dropped out. Bivariate and multivariate analyses showed that theory-guided interventions produced significant improvements in oral hygiene and that there were no significant gender or socio-economic differences. Theory-guided interventions produced more positive changes in OH than CDI, and these changes did not differ between gender and SES groups. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Kotapati Srinivasa Reddy
2015-12-01
The extant social sciences and management theoretical concepts and empirical literature have mostly been developed in the institutional context of western (developed) economies. In the recent past, a number of researchers have argued that western theories are inadequate for studying the emerging-markets phenomenon and have described the problems relating to data collection, data analysis, and theory development. I also confirm from experience that major problems relate to research data collection, especially primary data (interview and survey methods). With this in mind, I develop a new case study research design, that is, a "Test-Tube" typology, to build theory from emerging-markets behavior as well as to add new knowledge to a mass of disciplines, particularly social sciences, medicine, travel, tourism and hospitality, sports, management and information systems, and engineering. I design a typology that consists of eleven steps: case development, case selection, relatedness and pattern matching, case analysis, cross-case analysis, theoretical constructs, pre-testing and development, adjusting theoretical constructs, theory testing, building theory and testable propositions, and suggesting a strategic swap model. Further, I suggest a set of guidelines on how to measure research quality and how to strengthen research rigor in case study settings.
Theories of conduct disorder: a causal modelling analysis
Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De
2004-01-01
Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –
Modelling non-ignorable missing data mechanisms with item response theory models
Holman, Rebecca; Glas, Cornelis A.W.
2005-01-01
A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled
Nematic elastomers: from a microscopic model to macroscopic elasticity theory.
Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette
2008-05-01
A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with the help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.
Spatial interaction models facility location using game theory
D'Amato, Egidio; Pardalos, Panos
2017-01-01
Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game-theoretical tools, highlighting situations where a location decision is faced by several decision makers, leading to a game-theoretical framework in non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
An anthology of theories and models of design philosophy, approaches and empirical explorations
Blessing, Lucienne
2014-01-01
While investigations into both theories and models have remained a major strand of engineering design research, the current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models, and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: · significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; · current models of design, from a function behavior structure model to an integrated model; · important empirical research findings from studies into design; and · philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...
Cosmological viability of the bimetric theory of gravitation
International Nuclear Information System (INIS)
Krygier, B.; Krempec-Krygier, J.
1983-01-01
The approximate solutions of field equations for flat radiative cosmological models in the second version of bimetric gravitation theory are discussed. They indicate that these cosmological models are ever expanding. The apparent magnitude-redshift relations for flat dust cosmological models for different theories of gravitation are described and compared. One can reject Dirac's additive creation theory and the first version of Rosen's bimetric theory on the basis of this observational test. (author)
The Equal Pay Act as an Experiment to Test Theories of the Labour Market.
Manning, Alan
1996-01-01
The UK Equal Pay Act of 1970 resulted in a large rise in the relative earnings of women in the early 1970s. As this change (unlike most wage changes) was largely exogenous to employers, one can think of this episode as an experiment for testing different theories of the labour market. Hence, study of the effects of the Equal Pay Act should be given considerable weight and is likely to have wider implications about the operation of labour markets. Most models of the labour market used by econo...
Henningsen, David Dryden; Henningsen, Mary Lynn Miller
2010-01-01
Research on error management theory indicates that men tend to overestimate women's sexual interest and women underestimate men's interest in committed relationships (Haselton & Buss, 2000). We test the assumptions of the theory in face-to-face, stranger interactions with 111 man-woman dyads. Support for the theory emerges, but potential boundary…
Edwards, Katie M; Gidycz, Christine A; Murphy, Megan J
2015-10-01
The purpose of the current study was to build on the existing literature to better understand young women's leaving processes in abusive dating relationships using a prospective design. Two social psychological models-the investment model and theory of planned behavior-were tested. According to the investment model, relationship continuation is predicted by commitment, which is a function of investment, satisfaction, and low quality of alternatives. The theory of planned behavior asserts that a specific behavior is predicted by an individual's intention to use a behavior, which is a function of the individual's attitudes toward the behavior, the subjective norms toward the behavior, and the individual's perceived behavioral control over the behavior. College women (N = 169 young women in abusive relationships) completed surveys at two time points, approximately 4 months apart, to assess initially for the presence of intimate partner violence (IPV) in a current relationship and investment model and theory of planned behavior variables; the purpose of the 4-month follow-up session was to determine if women had remained in or terminated their abusive relationship. Path analytic results demonstrated that both the theory of planned behavior and investment models were good fits to the data in prospectively predicting abused women's stay/leave decisions. However, the theory of planned behavior was a better fit to the data than the investment model. Implications for future research and intervention are discussed. © The Author(s) 2014.
An experimental test of the linear no-threshold theory of radiation carcinogenesis
International Nuclear Information System (INIS)
Cohen, B.L.
1990-01-01
There is a substantial body of quantitative information on radiation-induced cancer at high dose, but there are no data at low dose. The usual method for estimating the effects of low-level radiation is to assume a linear no-threshold dependence. If this linear no-threshold assumption were not used, essentially all fears about radiation would disappear. Since these fears are costing tens of billions of dollars, it is most important that the linear no-threshold theory be tested at low dose. An opportunity for testing the linear no-threshold concept at low dose is now available in the form of radon in homes. The purpose of this paper is to attempt to use these data to test the linear no-threshold theory
Health belief model and reasoned action theory in predicting water saving behaviors in yazd, iran.
Morowatisharifabad, Mohammad Ali; Momayyezi, Mahdieh; Ghaneian, Mohammad Taghi
2012-01-01
People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and Reasoned Action Theory. The cross-sectional study used random cluster sampling to recruit 200 heads of households. The survey questionnaire was tested for content validity and reliability. Analysis of the data included descriptive statistics, simple correlation, and hierarchical multiple regression. Simple correlations between water saving behaviors and Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variance in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. In designing interventions aimed at water waste prevention, barriers to water saving behaviors should be addressed first, followed by people's attitudes toward water saving. The Health Belief Model, with the exception of the perceived severity and benefits constructs, is more powerful than Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors.
Health Belief Model and Reasoned Action Theory in Predicting Water Saving Behaviors in Yazd, Iran
Directory of Open Access Journals (Sweden)
Mohammad Taghi Ghaneian
2012-12-01
Background: People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and Reasoned Action Theory. Methods: The cross-sectional study used random cluster sampling to recruit 200 heads of households. The survey questionnaire was tested for content validity and reliability. Analysis of the data included descriptive statistics, simple correlation, and hierarchical multiple regression. Results: Simple correlations between water saving behaviors and Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variance in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. Conclusion: In designing interventions aimed at water waste prevention, barriers to water saving behaviors should be addressed first, followed by people's attitudes toward water saving. The Health Belief Model, with the exception of the perceived severity and benefits constructs, is more powerful than Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors.
2004-2005 Academic Training Programme: Electroweak Theory and the Standard Model
Françoise Benz
2004-01-01
6, 7, 8, 9 and 10 December LECTURE SERIES 6, 7, 8, 9, 10 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 on 6, 7, 8, 10 December, TH Auditorium, bldg. 4 3-006 on 9 December Electroweak Theory and the Standard Model R. BARBIERI / CERN-PH-TH There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development /test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector. Transparencies available at: http://agenda.cern.ch/fullAgenda.php?ida=a042577 ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please discuss with your supervisor and apply electronically directly from the course description pages that can ...
2004-2005 Academic Training Programme: Electroweak Theory and the Standard Model
Françoise Benz
2004-01-01
6, 7, 8, 9 and 10 December LECTURE SERIES 6, 7, 8, 9, 10 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 on 6, 7, 8, 10 December, TH Auditorium, bldg. 4 3-006 on 9 December Electroweak Theory and the Standard Model R. BARBIERI / CERN-PH-TH There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector. ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please discuss with your supervisor and apply electronically directly from the course description pages that you find ...
Spin foam model for pure gauge theory coupled to quantum gravity
International Nuclear Information System (INIS)
Oriti, Daniele; Pfeiffer, Hendryk
2002-01-01
We propose a spin foam model for pure gauge fields coupled to Riemannian quantum gravity in four dimensions. The model is formulated for the triangulation of a four-manifold which is given merely combinatorially. The Riemannian Barrett-Crane model provides the gravity sector of our model and dynamically assigns geometric data to the given combinatorial triangulation. The gauge theory sector is a lattice gauge theory living on the same triangulation and obtains from the gravity sector the geometric information which is required to calculate the Yang-Mills action. The model is designed so that one obtains a continuum approximation of the gauge theory sector at an effective level, similarly to the continuum limit of lattice gauge theory, when the typical length scale of gravity is much smaller than the Yang-Mills scale
A model of PCF in guarded type theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
A Model of PCF in Guarded Type Theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
Lehmann, Rüdiger; Lösler, Michael
2017-12-01
Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the use of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution to the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or one of its alternatives is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
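The AIC-based selection described above can be sketched in a few lines. The model names, log-likelihoods, and parameter counts below are invented purely for illustration and are not taken from the paper:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike information criterion, 2k - 2*ln(L); lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical numbers: a "no deformation" null model with 3 parameters
# versus an alternative deformation pattern with one extra parameter.
models = {
    "null (no deformation)": aic(log_likelihood=-120.0, n_params=3),
    "single point shift": aic(log_likelihood=-112.0, n_params=4),
}
best = min(models, key=models.get)  # model with the smallest AIC wins
```

Unlike a hypothesis test, this comparison needs no decision error rates; the extra parameter is penalized directly through the 2k term.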
Energy Technology Data Exchange (ETDEWEB)
Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)
2016-10-14
The research supported by this grant is aimed at probing the limits of the Standard Model through precision low-energy nuclear physics. The work of the PI (AWL) and additional personnel is to provide theory input needed for a number of potentially high-impact experiments, notably, hadronic parity violation, Dark Matter direct detection and searches for permanent electric dipole moments (EDMs) in nucleons and nuclei. In all these examples, a quantitative understanding of low-energy nuclear physics from the fundamental theory of strong interactions, Quantum Chromodynamics (QCD), is necessary to interpret the experimental results. The main theoretical tools used and developed in this work are the numerical solution of QCD known as lattice QCD (LQCD) and Effective Field Theory (EFT). This grant supported a new research program for the PI, which as such needed to be developed from the ground up. Therefore, the first fiscal year of this grant, 08/01/2014-07/31/2015, was spent predominantly establishing this new research effort. Very good progress has been made, although, at this time, there are not many publications to show for the effort. After one year, the PI accepted a job at Lawrence Berkeley National Laboratory, so this final report covers just a single year of the five years of the grant.
Game Theory and its Relationship with Linear Programming Models ...
African Journals Online (AJOL)
Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problem are closely related subjects since any computing method devised for ... AJOL African Journals Online.
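The relationship between game theory and linear programming noted above has a well-known closed form in the 2x2 case: the mixed-strategy equilibrium that the equivalent linear program would produce can be written down directly. The helper below is a minimal sketch (the function name is mine), valid only when the game has no saddle point:

```python
def solve_2x2_zero_sum(a, b, c, d):
    """Mixed-strategy equilibrium of a 2x2 zero-sum game with row-player
    payoff matrix [[a, b], [c, d]], assuming no saddle point.  By the
    minimax theorem (LP duality), both players' optimal values coincide."""
    denom = a - b - c + d
    p = (d - c) / denom              # P(row player chooses row 1)
    q = (d - b) / denom              # P(column player chooses column 1)
    value = (a * d - b * c) / denom  # game value for the row player
    return p, q, value

# Matching pennies: a fair game whose value is 0.
p, q, value = solve_2x2_zero_sum(1, -1, -1, 1)
```

For larger payoff matrices the same value and strategies are obtained by solving the primal/dual pair of linear programs, which is the connection the article develops.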
Sigma model approach to the heterotic string theory
International Nuclear Information System (INIS)
Sen, A.
1985-09-01
Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs
The evolution of sexes: A specific test of the disruptive selection theory.
da Silva, Jack
2018-01-01
The disruptive selection theory of the evolution of anisogamy posits that the evolution of a larger body or greater organismal complexity selects for a larger zygote, which in turn selects for larger gametes. This may provide the opportunity for one mating type to produce more numerous, small gametes, forcing the other mating type to produce fewer, large gametes. Predictions common to this and related theories have been partially upheld. Here, a prediction specific to the disruptive selection theory is derived from a previously published game-theoretic model that represents the most complete description of the theory. The prediction, that the ratio of macrogamete to microgamete size should be above three for anisogamous species, is supported for the volvocine algae. A fully population genetic implementation of the model, involving mutation, genetic drift, and selection, is used to verify the game-theoretic approach and accurately simulates the evolution of gamete sizes in anisogamous species. This model was extended to include a locus for gamete motility and shows that oogamy should evolve whenever there is costly motility. The classic twofold cost of sex may be derived from the fitness functions of these models, showing that this cost is ultimately due to genetic conflict.
A new free-surface stabilization algorithm for geodynamical modelling: Theory and numerical tests
Andrés-Martínez, Miguel; Morgan, Jason P.; Pérez-Gussinyé, Marta; Rüpke, Lars
2015-09-01
The surface of the solid Earth is effectively stress free in its subaerial portions, and hydrostatic beneath the oceans. Unfortunately, this type of boundary condition is difficult to treat computationally, and for computational convenience, numerical models have often used simpler approximations that do not involve a normal stress-loaded, shear-stress free top surface that is free to move. Viscous flow models with a computational free surface typically confront stability problems when the time step is bigger than the viscous relaxation time. The small time step required for stability has motivated the development of strategies that mitigate the stability problem by making larger (at least ∼10 Kyr) time steps stable and accurate. Here we present a new free-surface stabilization algorithm for finite element codes which solves the stability problem by adding to the Stokes formulation an intrinsic penalization term equivalent to a portion of the future load at the surface nodes. Our algorithm is straightforward to implement and can be used with either Eulerian or Lagrangian grids. It includes α and β parameters to control, respectively, the vertical and the horizontal slope-dependent penalization terms, and uses Uzawa-like iterations to solve the resulting system at a cost comparable to a non-stress-free surface formulation. Four tests were carried out in order to study the accuracy and the stability of the algorithm: (1) a decaying first-order sinusoidal topography test, (2) a decaying high-order sinusoidal topography test, (3) a Rayleigh-Taylor instability test, and (4) a steep-slope test. For these tests, we investigate which α and β parameters give the best results in terms of both accuracy and stability. We also compare the accuracy and the stability of our algorithm with a similar implicit approach recently developed by Kaus et al. (2010). We find that our algorithm is slightly more accurate and stable for steep slopes, and also conclude that, for longer time steps, the optimal
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes' set theoretical predicate...
Planar N = 4 gauge theory and the Hubbard model
International Nuclear Information System (INIS)
Rej, Adam; Serban, Didina; Staudacher, Matthias
2006-01-01
Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model
Testing neoclassical competitive market theory in the field.
List, John A
2002-11-26
This study presents results from a pilot field experiment that tests predictions of competitive market theory. A major advantage of this particular field experimental design is that my laboratory is the marketplace: subjects are engaged in buying, selling, and trading activities whether I run an exchange experiment or am a passive observer. In this sense, I am gathering data in a natural environment while still maintaining the necessary control to execute a clean comparison between treatments. The main results of the study fall into two categories. First, the competitive model predicts reasonably well in some market treatments: the expected price and quantity levels are approximated in many market rounds. Second, the data suggest that market composition is important: buyer and seller experience levels impact not only the distribution of rents but also the overall level of rents captured. An unexpected result in this regard is that average market efficiency is lowest in markets that match experienced buyers and experienced sellers and highest when experienced buyers engage in bargaining with inexperienced sellers. Together, these results suggest that both market experience and market composition play an important role in the equilibrium discovery process.
Models with oscillator terms in noncommutative quantum field theory
International Nuclear Information System (INIS)
Kronberger, E.
2010-01-01
The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we had better start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it contains some unphysical terms linear in the gauge field. Advantageously, we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations showing the expected behavior. The next step is to show renormalizability of the model, where some hints towards this direction will also be given. (author) [de
Psychodynamic theory and counseling in predictive testing for Huntington's disease.
Tassicker, Roslyn J
2005-04-01
This paper revisits psychodynamic theory, which can be applied in predictive testing counseling for Huntington's Disease (HD). Psychodynamic theory has developed from the work of Freud and places importance on early parent-child experiences. The nature of these relationships, or attachments are reflected in adult expectations and relationships. Two significant concepts, identification and fear of abandonment, have been developed and expounded by the psychodynamic theorist, Melanie Klein. The processes of identification and fear of abandonment can become evident in predictive testing counseling and are colored by the client's experience of growing up with a parent affected by Huntington's Disease. In reflecting on family-of-origin experiences, clients can also express implied expectations of the future, and future relationships. Case examples are given to illustrate the dynamic processes of identification and fear of abandonment which may present in the clinical setting. Counselor recognition of these processes can illuminate and inform counseling practice.
Integrable lambda models and Chern-Simons theories
International Nuclear Information System (INIS)
Schmidtt, David M.
2017-01-01
In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.
Directory of Open Access Journals (Sweden)
Dylan Molenaar
2015-08-01
In the psychometric literature, item response theory models have been proposed that explicitly take into account the decision process underlying the responses of subjects to psychometric test items. Application of these models is, however, hampered by the absence of general and flexible software to fit them. In this paper, we present diffIRT, an R package that can be used to fit item response theory models that are based on a diffusion process. We discuss parameter estimation and model fit assessment, show the viability of the package in a simulation study, and illustrate the use of the package with two datasets pertaining to extraversion and mental rotation. In addition, we illustrate how the package can be used to fit the traditional diffusion model (as originally developed in experimental psychology) to data.
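The decision process underlying diffusion-based IRT models can be illustrated with a naive Euler-Maruyama simulation of a two-boundary Wiener process. This is a generic sketch of the mechanism, not code from the diffIRT package, and the parameter values are arbitrary:

```python
import random

def simulate_diffusion_trial(drift, boundary, dt=0.001, sigma=1.0):
    """One Euler-Maruyama trial of a two-boundary Wiener diffusion: the
    accumulator starts midway between -boundary and +boundary and evolves
    until it crosses one of them.  Returns (response, response_time),
    with response 1 for the upper boundary (e.g. the correct answer)."""
    x, t = 0.0, 0.0
    while abs(x) < boundary:
        x += drift * dt + sigma * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
    return (1 if x >= boundary else 0, t)

random.seed(7)  # reproducible demo
trials = [simulate_diffusion_trial(drift=2.0, boundary=1.0) for _ in range(200)]
upper_rate = sum(r for r, _ in trials) / len(trials)  # mostly upper-boundary hits
```

With a strongly positive drift, most trials end at the upper boundary, and both accuracy (which boundary) and response time (when) fall out of the same process, which is the joint modeling the package exploits.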
Renormalization-group theory for the eddy viscosity in subgrid modeling
Zhou, YE; Vahala, George; Hossain, Murshed
1988-01-01
Renormalization-group theory is applied to incompressible three-dimensional Navier-Stokes turbulence so as to eliminate unresolvable small scales. The renormalized Navier-Stokes equation now includes a triple nonlinearity with the eddy viscosity exhibiting a mild cusp behavior, in qualitative agreement with the test-field model results of Kraichnan. For the cusp behavior to arise, not only is the triple nonlinearity necessary but the effects of pressure must be incorporated in the triple term. The renormalized eddy viscosity will not exhibit a cusp behavior if it is assumed that a spectral gap exists between the large and small scales.
Sheldon, Kennon M; Schachtman, Todd R
2007-04-01
Schlenker's triangle model (Schlenker, Britt, Pennington, Murphy, & Doherty, 1994; Schlenker, Pontari, & Christopher, 2001) identifies three excuses people use to avoid taking responsibility after failure: that one had no control in the situation, that the obligation was unclear, and that it was not really one's obligation. Three retrospective studies tested the presumed negative association between excuse making and responsibility taking. The studies also examined the effects of self-determination theory's concept of motivational internalization (Deci & Ryan, 2000) on these variables. A complex but replicable pattern emerged, such that responsibility taking and motivational internalization correlated with adaptive outcomes such as future commitment and positive expectancy, whereas excuse making did not. Of particular interest, perceiving that the person levying the obligation had internalized motivation predicted responsibility taking in all three studies. Implications for the triangle model, as well as for theories of maturity and personality development, are considered.
Reed, Frances M; Fitzgerald, Les; Rae, Melanie
2016-01-01
To highlight philosophical and theoretical considerations for planning a mixed methods research design that can inform a practice model to guide rural district nursing end of life care. Conceptual models of nursing in the community are general and lack guidance for rural district nursing care. A combination of pragmatism and nurse agency theory can provide a framework for ethical considerations in mixed methods research in the private world of rural district end of life care. Reflection on experience gathered in a two-stage qualitative research phase, involving rural district nurses who use advocacy successfully, can inform a quantitative phase for testing and complementing the data. Ongoing data analysis and integration result in generalisable inferences to achieve the research objective. Mixed methods research that creatively combines philosophical and theoretical elements to guide design in the particular ethical situation of community end of life care can be used to explore an emerging field of interest and test the findings for evidence to guide quality nursing practice. Combining philosophy and nursing theory to guide mixed methods research design increases the opportunity for sound research outcomes that can inform a nursing model of care.
Braspenning, N.C.W.M.
2008-01-01
For manufacturers of high-tech multi-disciplinary systems such as semiconductor equipment, the effort required for integration and system testing is ever increasing, while customers demand a shorter time-to-market. This book describes how executable models can replace unavailable component
Earthquake likelihood model testing
Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.
2007-01-01
INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
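Scoring a binned forecast by the likelihood method reduces, under the usual assumption of independent Poisson counts per bin, to summing Poisson log-probabilities. The rates and observed counts below are made up for illustration:

```python
import math

def poisson_log_likelihood(rates, counts):
    """Joint log-likelihood of observed earthquake counts given forecast
    rates in space-magnitude bins, treating bins as independent Poisson
    variables (the standard assumption in RELM-style likelihood tests)."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# Hypothetical comparison: two forecasts assign rates to the same four bins.
observed = [0, 1, 0, 2]
model_a = [0.1, 0.5, 0.2, 1.0]
model_b = [0.5, 0.5, 0.5, 0.5]
better = "A" if poisson_log_likelihood(model_a, observed) > poisson_log_likelihood(model_b, observed) else "B"
```

The pairwise model comparison described in the abstract is a comparison of exactly such joint log-likelihoods on the same future catalog.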
Longitudinal and Integrative Tests of Family Stress Model Effects on Mexican-Origin Adolescents
White, Rebecca M. B.; Liu, Yu; Nair, Rajni L.; Tein, Jenn-Yun
2015-01-01
The family stress model represents a common framework through which to examine the effects of environmental stressors on adolescent adjustment. The model suggests that economic and neighborhood stressors influence youth adjustment via disruptions to parenting. Incorporating integrative developmental theory, we examined the degree to which parents' cultural value orientations mitigated the effects of stressors on parenting disruptions and the degree to which environmental adversity qualified the effect of parenting on adolescent adjustment. We tested the hypothesized Integrative Family Stress Model longitudinally in a sample of mother-youth dyads (N = 749) and father-youth dyads (N = 467) from Mexican-origin families, across three time points spanning early to middle adolescence. Providing the first longitudinal evidence of family stress mediated effects, mothers' perceptions of economic pressure were associated with increases in adolescent externalizing symptoms five years later via intermediate increases in harsh parenting. The remaining findings supported the notion that integrative developmental theory can inform family stress model hypothesis testing that is culturally and contextually relevant for a wide range of diverse families and youth. For example, fathers' perceptions of economic pressure and neighborhood danger had important implications for adolescent internalizing, via reductions in paternal warmth, but only at certain levels of neighborhood adversity. Mothers' familism value orientations mitigated the effects of economic pressure on maternal warmth, protecting their adolescents from experiencing developmental costs associated with environmental stressors. Results are discussed in terms of identifying how integrative developmental theory intersects with the family stress model to set diverse youth on different developmental pathways. PMID:25751100
Stochastic quantization of field theories on the lattice and supersymmetrical models
International Nuclear Information System (INIS)
Aldazabal, Gerardo.
1984-01-01
Several aspects of the stochastic quantization method are considered. Specifically, field theories on the lattice and supersymmetrical models are studied. A non-linear sigma model is studied first, and it is shown that it is possible to obtain evolution equations written directly for invariant quantities. These ideas are generalized to obtain Langevin equations for the Wilson loops of non-abelian lattice gauge theories U(N) and SU(N). In order to write these equations, some different ways of introducing the constraints which the fields must satisfy are discussed. It is natural to have a strong coupling expansion in these equations. The correspondence with quantum field theory is established, and it is noted that at all orders in perturbation theory, the Langevin equations reduce to Schwinger-Dyson equations. From another point of view, stochastic quantization is applied to large N matrix models on the lattice. As a result, a simple and systematic way of building reduced models is found. Regarding stochastic quantization in supersymmetric theories, a simple supersymmetric model is studied. It is shown that it is possible to write an evolution equation for the superfield which leads to quantum field theory results in equilibrium. As the Langevin equation preserves supersymmetry, the property of dimensional reduction known for the quantum model is shown to be valid at all times. (M.E.L.)
Spin foam models of Yang-Mills theory coupled to gravity
International Nuclear Information System (INIS)
Mikovic, A
2003-01-01
We construct a spin foam model of Yang-Mills theory coupled to gravity by using a discretized path integral of the BF theory with polynomial interactions and the Barrett-Crane ansatz. In the Euclidean gravity case, we obtain a vertex amplitude which is determined by a vertex operator acting on a simple spin network function. The Euclidean gravity results can be straightforwardly extended to the Lorentzian case, so we propose a Lorentzian spin foam model of Yang-Mills theory coupled to gravity.
Testing the entrepreneurial intention model on a two-country sample
Liñán, Francisco; Chen, Yi-Wen
2006-01-01
This paper tests the Entrepreneurial Intention Model -which is adapted from the Theory of Planned Behavior- on a sample of 533 individuals from two quite different countries: one of them European (Spain) and the other East Asian (Taiwan). A newly developed Entrepreneurial Intention Questionnaire (EIQ) has been used that tries to overcome some of the limitations of previous instruments. Structural equations techniques were used in the empirical analysis. Results are generally...
Working memory: theories, models, and controversies.
Baddeley, Alan
2012-01-01
I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.
Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions
Directory of Open Access Journals (Sweden)
Camaren Peter
2014-03-01
In this paper, we deploy complexity theory as the foundation for integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.
Lippke, Sonia; Plotnikoff, Ronald C
2009-05-01
Two different theories of health behaviour have been chosen with the aim of theory integration: a continuous theory (protection motivation theory, PMT) and a stage model (transtheoretical model, TTM). This is the first study to test whether the stages of the TTM moderate the interrelation of PMT-variables and the mediation of motivation, as well as PMT-variables' interactions in predicting stage transitions. Hypotheses were tested regarding (1) mean patterns, stage pair-comparisons and nonlinear trends using ANOVAs; (2) prediction-patterns for the different stage groups employing multi-group structural equation modelling (MSEM) and nested model analyses; and (3) stage transitions using binary logistic regression analyses. Adults (N=1,602) were assessed over a 6-month period on their physical activity stages, PMT-variables and subsequent behaviour. (1) Particular mean differences and nonlinear trends in all test variables were found. (2) The PMT adequately fitted the five stage groups. The MSEM revealed that covariances within threat appraisal and coping appraisal were invariant and all other constraints were stage-specific, i.e. stage was a moderator. Except for self-efficacy, motivation fully mediated the relationship between the social-cognitive variables and behaviour. (3) Predicting stage transitions with the PMT-variables underscored the importance of self-efficacy. Stage movement in the preparation stage was more likely only when both threat appraisal and coping appraisal were high. Results emphasize stage-specific differences of the PMT mechanisms, and hence, support the stage construct. The findings may guide further theory building and research integrating different theoretical approaches.
Matrix models and stochastic growth in Donaldson-Thomas theory
Energy Technology Data Exchange (ETDEWEB)
Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)
2012-10-15
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Matrix models and stochastic growth in Donaldson-Thomas theory
International Nuclear Information System (INIS)
Szabo, Richard J.; Tierz, Miguel
2012-01-01
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Soliton excitations in polyacetylene and relativistic field theory models
International Nuclear Information System (INIS)
Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM
1982-01-01
A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin (1/2) and charge (±e). In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological φ⁴ and continuum electron-phonon models, and the structure of the fully quantum versus mean field theories. (orig.)
A survey on the modeling and applications of cellular automata theory
Gong, Yimin
2017-09-01
The Cellular Automata Theory is a discrete model which is now widely used in scientific research and simulations. The model comprises cells that change over time according to a specific rule. This paper provides a survey of the modeling and applications of Cellular Automata Theory, focusing on the program realization of Cellular Automata Theory and the application of Cellular Automata in fields such as road traffic, land use, and cutting machines. Each application is further explained, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.
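The cell-update scheme described above can be sketched in a few lines. This is a minimal illustration, not a model from the survey; the rule number (Rule 30) and lattice width are illustrative choices.

```python
def step(cells, rule=30):
    """Advance one generation of an elementary cellular automaton:
    each cell's next state depends on its left neighbor, itself, and
    its right neighbor (periodic boundary), looked up in the rule's
    8-bit truth table."""
    n = len(cells)
    out = []
    for i in range(n):
        # Encode the 3-cell neighborhood as a number 0..7.
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # Bit `neighborhood` of the rule number gives the next state.
        out.append((rule >> neighborhood) & 1)
    return out

# Start from a single live cell and evolve a few generations.
cells = [0] * 11
cells[5] = 1
for _ in range(5):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```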
Continued development of modeling tools and theory for RF heating
International Nuclear Information System (INIS)
1998-01-01
Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling Radio Frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project.
Chernoff, Herman
1988-01-01
This well-respected introduction to statistics and statistical theory covers data processing, probability and random variables, utility and descriptive statistics, computation of Bayes strategies, models, testing hypotheses, and much more. 1959 edition.
Experimental Study of Dowel Bar Alternatives Based on Similarity Model Test
Directory of Open Access Journals (Sweden)
Chichun Hu
2017-01-01
In this study, a small-scale accelerated loading test based on similarity theory and the Accelerated Pavement Analyzer was developed to evaluate dowel bars with different materials and cross-sections. A jointed concrete specimen consisting of one dowel was designed as the scaled model for the test, and each specimen was subjected to 864 thousand loading cycles. Deflections between jointed slabs were measured with dial indicators, and strains of the dowel bars were monitored with strain gauges. The load transfer efficiency, differential deflection, and dowel-concrete bearing stress for each case were calculated from these measurements. The test results indicated that the effect of the dowel modulus on load transfer efficiency can be characterized based on the similarity model test developed in the study. Moreover, the round steel dowel was found to have similar performance to the larger FRP dowel, and the elliptical dowel can be preferentially considered in practice.
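The deflection-based joint measures mentioned above can be sketched as follows. This assumes the commonly used deflection-ratio definition of load transfer efficiency; the paper's exact formulation and the numerical values below are not taken from the study.

```python
def load_transfer_efficiency(d_unloaded_mm, d_loaded_mm):
    """Deflection-based LTE in percent: ratio of the unloaded-slab
    deflection to the loaded-slab deflection at the joint, times 100.
    A perfectly rigid joint gives 100%."""
    return 100.0 * d_unloaded_mm / d_loaded_mm

def differential_deflection(d_unloaded_mm, d_loaded_mm):
    """Difference between the loaded- and unloaded-slab deflections."""
    return d_loaded_mm - d_unloaded_mm

# Illustrative dial-indicator readings (mm), not measured values.
d_loaded, d_unloaded = 1.00, 0.80
print(f"LTE = {load_transfer_efficiency(d_unloaded, d_loaded):.1f}%")
print(f"differential deflection = {differential_deflection(d_unloaded, d_loaded):.2f} mm")
```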
Introduction to zeolite theory and modelling
Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.
2001-01-01
A review. Some of the recent advances in zeolite theory and modeling are presented. In particular, the current status of computational chemistry in Brønsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the…
Integrable models in 1+1 dimensional quantum field theory
International Nuclear Information System (INIS)
Faddeev, Ludvig.
1982-09-01
The goal of this lecture is to present a unifying view of the exactly soluble models. There are several reasons arguing in favor of the 1+1-dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering, and some 1+1-dimensional models have physical applications in solid-state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. Fundamental Poisson bracket relations (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPRs is given, with promising generalizations to FCRs.
Vibration tests on some models of PEC reactor core elements
International Nuclear Information System (INIS)
Bonacina, G.; Castoldi, A.; Zola, M.; Cecchini, F.; Martelli, A.; Vincenzi, D.
1982-01-01
This paper describes the aims of the experimental tests carried out at ISMES, under an agreement with the Department of Fast Reactors of ENEA, on some models of the elements of the PEC Fast Nuclear Reactor core, as part of the activities for the seismic verification of the PEC core. The seismic verification is briefly described, with particular attention to the problems arising from the shocks among the various elements during an earthquake, as well as the computer code used, the purposes and techniques of the tests, some results, and a first comparison between theory and experimental data.
Integrable lambda models and Chern-Simons theories
Energy Technology Data Exchange (ETDEWEB)
Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)
2017-05-03
In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.
A dynamical theory for the Rishon model
International Nuclear Information System (INIS)
Harari, H.; Seiberg, N.
1980-09-01
We propose a composite model for quarks and leptons based on an exact SU(3)_C × SU(3)_H gauge theory and two fundamental J = 1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C × SU(2)_L × SU(2)_R × U(1)_(B-L) gauge theory emerges at the composite level. The theory is ''natural'', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several ''technicolor'' mechanisms are automatically present. (Author)
Self Modeling: Expanding the Theories of Learning
Dowrick, Peter W.
2012-01-01
Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…
International Nuclear Information System (INIS)
Liu, Yanfeng; Zhou, Xiaojun; Wang, Dengjia; Song, Cong; Liu, Jiaping
2015-01-01
Highlights: • Fractal theory is introduced into the prediction of VOC diffusion coefficients. • The MSFC model of the diffusion coefficient is developed for porous building materials. • The MSFC model contains detailed pore structure parameters. • The accuracy of the MSFC model is verified by independent experiments. - Abstract: Most building materials are porous media, and the internal diffusion coefficients of such materials have an important influence on the emission characteristics of volatile organic compounds (VOCs). The pore structure of porous building materials has a significant impact on the diffusion coefficient. However, its complex structural characteristics complicate model development. The existing prediction models of the diffusion coefficient are flawed and need to be improved. Using scanning electron microscope (SEM) observations and mercury intrusion porosimetry (MIP) tests of typical porous building materials, this study developed a new diffusivity model: the multistage series-connection fractal capillary-bundle (MSFC) model. The model considers the variable-diameter capillaries formed by macropores connected in series as the main mass transfer paths, and the diameter distribution of the capillary bundles obeys a fractal power law in the cross section. In addition, the tortuosity of the macrocapillary segments with different diameters is obtained by fractal theory. Mesopores serve as the connections between the macrocapillary segments rather than as the main mass transfer paths. The theoretical results obtained using the MSFC model yielded a highly accurate prediction of the diffusion coefficients and were in good agreement with the VOC concentration measurements in the environmental test chamber.
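The fractal power law mentioned in the abstract can be sketched in its commonly used cumulative form, where the number of capillaries with diameter at least d scales as (d_max/d)^D_f. This is an illustrative form with made-up parameter values, not the paper's exact MSFC equations.

```python
def capillary_count(d, d_max, D_f):
    """Cumulative number of capillaries with diameter >= d under a
    fractal power-law size distribution: N(>=d) = (d_max / d)**D_f.
    D_f is the pore-area fractal dimension (1 < D_f < 2 in a
    2-D cross section); exactly one capillary has the maximum
    diameter d_max."""
    return (d_max / d) ** D_f

# Illustrative values: maximum pore diameter 10 um, D_f = 1.8.
d_max, D_f = 10.0, 1.8
for d in (10.0, 5.0, 1.0):
    print(f"d >= {d:4.1f} um: N = {capillary_count(d, d_max, D_f):8.1f}")
```

The key property is that the count grows as a power of the inverse diameter, so small capillaries dominate the bundle while a few large ones carry most of the mass transfer.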
Cant, Michael A; Llop, Justine B; Field, Jeremy
2006-06-01
Recent theory suggests that much of the wide variation in individual behavior that exists within cooperative animal societies can be explained by variation in the future direct component of fitness, or the probability of inheritance. Here we develop two models to explore the effect of variation in future fitness on social aggression. The models predict that rates of aggression will be highest toward the front of the queue to inherit and will be higher in larger, more productive groups. A third prediction is that, in seasonal animals, aggression will increase as the time available to inherit the breeding position runs out. We tested these predictions using a model social species, the paper wasp Polistes dominulus. We found that rates of both aggressive "displays" (aimed at individuals of lower rank) and aggressive "tests" (aimed at individuals of higher rank) decreased down the hierarchy, as predicted by our models. The only other significant factor affecting aggression rates was date, with more aggression observed later in the season, also as predicted. Variation in future fitness due to inheritance rank is the hidden factor accounting for much of the variation in aggressiveness among apparently equivalent individuals in this species.
Empirical tests of the Chicago model and the Easterlin hypothesis: a case study of Japan.
Ohbuchi, H
1982-05-01
The objective of this discussion is to test the applicability of the economic theory of fertility, with special reference to postwar Japan, and to find a clue for forecasting the future trend of fertility. The theories examined are the "Chicago model" and the "Easterlin hypothesis." The major conclusion common among the leading economic theories of fertility, which have their origin with Gary S. Becker (1960, 1965) and Richard A. Easterlin (1966), is the positive income effect, i.e., that the relationship between income and fertility is positive despite the evidence that higher income families have fewer children and that fertility has declined with economic development. To bridge the gap between theory and fact is the primary purpose of the economic theory of fertility, and each offers a different interpretation for it. The point of the Chicago model, particularly of the household decision making model of the "new home economics," is the mechanism by which a positive effect of husband's income growth on fertility is offset by a negative price effect caused by the opportunity cost of wife's time. While the opportunity cost of wife's time is independent of the female wage rate for an unemployed wife, it is directly associated with the wage rate for a gainfully employed wife. Thus, the fertility response to female wages occurs only among families with an employed wife. The primary concern of empirical efforts to test the Chicago model has been with the determination of income and price elasticities. An attempt is made to test the relevance of the Chicago model and the Easterlin hypothesis in explaining the fertility movement in postwar Japan. In the case of the Chicago model, the statistical results appeared fairly successful but did not match the theory. The effect on fertility of a rise in women's real wage (and, therefore, in the opportunity cost of mother's time) and of a rise in the labor force participation rate of married women of childbearing age in recent years could not
Testing the Ricardian trade theory
Kukuk, Martin
1990-01-01
The simple Ricardian model explains comparative cost advantage by a relative productivity advantage of the single factor of production. This model is tested in this paper using microdata of the German business survey. In a first approach labour is considered to be the only factor of production, whereas in a second one capital is analysed. The results show that the former is able to explain the pattern of trade whereas the latter has no explanatory power. Therefore, labour productivit...
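The comparative-advantage logic the paper tests can be illustrated with a small worked example: a country exports the good in which its relative labor productivity is highest. The productivity figures below are made-up illustrative numbers, not data from the survey.

```python
def comparative_advantage(prod_home, prod_foreign):
    """Return the good in which 'home' has comparative advantage:
    the one maximizing the home/foreign labor-productivity ratio.
    Inputs map good name -> output per worker."""
    return max(prod_home, key=lambda g: prod_home[g] / prod_foreign[g])

# Illustrative output-per-worker figures for two goods.
home = {"cloth": 4.0, "wine": 2.0}
foreign = {"cloth": 1.0, "wine": 1.0}

# Home is absolutely more productive in both goods, but its relative
# advantage (4x vs 2x) lies in cloth, so the Ricardian prediction is
# that home exports cloth and imports wine.
print(comparative_advantage(home, foreign))
```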
Membrane models and generalized Z2 gauge theories
International Nuclear Information System (INIS)
Lowe, M.J.; Wallace, D.J.
1980-01-01
We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in 1/n expansion. The potential relationships of these models to generalized Z₂ gauge theories are indicated. (orig.)
Landmann, Helen; Hess, Ursula
2018-01-01
Moral foundation theory posits that specific moral transgressions elicit specific moral emotions. To test this claim, participants (N = 195) were asked to rate their emotions in response to moral violation vignettes. We found that compassion and disgust were associated with care and purity respectively as predicted by moral foundation theory.…
Integrability of a family of quantum field theories related to sigma models
Energy Technology Data Exchange (ETDEWEB)
Ridout, David [Australian National Univ., Canberra, ACT (Australia). Dept. of Theoretical Physics; DESY, Hamburg (Germany). Theory Group; Teschner, Joerg [DESY, Hamburg (Germany). Theory Group
2011-03-15
A method is introduced for constructing lattice discretizations of large classes of integrable quantum field theories. The method proceeds in two steps: The quantum algebraic structure underlying the integrability of the model is determined from the algebra of the interaction terms in the light-cone representation. The representation theory of the relevant quantum algebra is then used to construct the basic ingredients of the quantum inverse scattering method, the lattice Lax matrices and R-matrices. This method is illustrated with four examples: the Sinh-Gordon model, the affine sl(3) Toda model, a model called the fermionic sl(2|1) Toda theory, and the N=2 supersymmetric Sine-Gordon model. These models are all related to sigma models in various ways. The N=2 supersymmetric Sine-Gordon model, in particular, describes the Pohlmeyer reduction of string theory on AdS₂ × S², and is dual to a supersymmetric non-linear sigma model with a sausage-shaped target space. (orig.)
Models and theories of prescribing decisions: A review and suggested a new model
Mohaidin, Zurina
2017-01-01
To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via the exploratory approach rather than the theoretical. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient, pharmacist and physician decision to prescribe the drugs. The paper follows an inclusive review approach and applies the previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli-response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and the ‘social power theory’ in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701
Models and theories of prescribing decisions: A review and suggested a new model
Directory of Open Access Journals (Sweden)
Ali Murshid M
2017-06-01
To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via the exploratory approach rather than the theoretical. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient, pharmacist and physician decision to prescribe the drugs. The paper follows an inclusive review approach and applies the previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli-response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and the ‘social power theory’ in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.
Young, Myles D; Plotnikoff, Ronald C; Collins, Clare E; Callister, Robin; Morgan, Philip J
2016-11-01
Physical inactivity is a leading contributor to the burden of disease in men. Social-cognitive theories may improve physical activity (PA) interventions by identifying which variables to target to maximize intervention impact. This study tested the utility of Bandura's social cognitive theory (SCT) to explain men's PA during a 3-month weight loss program. Participants were 204 overweight/obese men (M [SD] age = 46.6 [11.3] years; body mass index = 33.1 [3.5] kg/m²). A longitudinal, latent variable structural equation model tested the associations between SCT constructs (i.e., self-efficacy, outcome expectations, intention, and social support) and self-reported moderate-to-vigorous PA (MVPA) and examined the total PA variance explained by SCT. After controlling for Time 1 cognitions and behavior, the model fit the data well (χ² = 73.9, degrees of freedom = 39, p social support. This study provides some evidence supporting the tenets of SCT when examining PA behavior in overweight and obese men. Future PA and weight loss interventions for men may benefit by targeting self-efficacy and intention, but the utility of targeting social support and outcome expectations requires further examination. © The Author(s) 2015.
Testing the Validity of a Cognitive Behavioral Model for Gambling Behavior.
Raylu, Namrata; Oei, Tian Po S; Loo, Jasmine M Y; Tsai, Jung-Shun
2016-06-01
Currently, cognitive behavioral therapies appear to be among the most studied treatments for gambling problems, and studies show they are effective in treating gambling problems. However, cognitive behavioral models have not been widely tested using statistical means. Thus, the aim of this study was to test the validity of the pathways postulated in the cognitive behavioral theory of gambling behavior using structural equation modeling (AMOS 20). Several questionnaires assessing a range of gambling specific variables (e.g., gambling urges, cognitions and behaviors) and gambling correlates (e.g., psychological states, and coping styles) were distributed to 969 participants from the community. Results showed that negative psychological states (i.e., depression, anxiety and stress) only directly predicted gambling behavior, whereas gambling urges predicted gambling behavior directly as well as indirectly via gambling cognitions. Avoidance coping predicted gambling behavior only indirectly via gambling cognitions. Negative psychological states were significantly related to gambling cognitions as well as avoidance coping. In addition, significant gender differences were also found. The results provided confirmation for the validity of the pathways postulated in the cognitive behavioral theory of gambling behavior. It also highlighted the importance of gender differences in conceptualizing gambling behavior.
Brackenbury, Tim; Zickar, Michael J.; Munson, Benjamin; Storkel, Holly L.
2017-01-01
Purpose: Item response theory (IRT) is a psychometric approach to measurement that uses latent trait abilities (e.g., speech sound production skills) to model performance on individual items that vary by difficulty and discrimination. An IRT analysis was applied to preschoolers' productions of the words on the Goldman-Fristoe Test of…
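The two item properties named here, difficulty and discrimination, combine with latent ability in, for example, the two-parameter logistic (2PL) IRT model. A minimal sketch (the parameter values are illustrative only, not estimates from the study):

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of a correct response
    given latent ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item's difficulty succeeds with
# probability 0.5; discrimination controls how steeply that changes.
print(irt_2pl(0.0, a=1.5, b=0.0))   # -> 0.5
print(irt_2pl(1.0, a=0.5, b=0.0))   # shallow (low-discrimination) item
print(irt_2pl(1.0, a=2.0, b=0.0))   # steep item: higher probability at theta = 1
```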
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
Soto, Fabian A; Zheng, Emily; Fonseca, Johnny; Ashby, F Gregory
2017-01-01
Determining whether perceptual properties are processed independently is an important goal in perceptual science, and tools to test independence should be widely available to experimental researchers. The best analytical tools to test for perceptual independence are provided by General Recognition Theory (GRT), a multidimensional extension of signal detection theory. Unfortunately, there is currently a lack of software implementing GRT analyses that is ready-to-use by experimental psychologists and neuroscientists with little training in computational modeling. This paper presents grtools, an R package developed with the explicit aim of providing experimentalists with the ability to perform full GRT analyses using only a couple of command lines. We describe the software and provide a practical tutorial on how to perform each of the analyses available in grtools. We also provide advice to researchers on best practices for experimental design and interpretation of results when applying GRT and grtools.
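One of the summary statistics used in GRT analyses of identification data is marginal response invariance: the probability of reporting a level on one dimension should not depend on the level of the other dimension. A conceptual sketch on a hypothetical 2×2 confusion matrix; this illustrates the statistic itself, not the grtools API:

```python
# Hypothetical identification-confusion counts for a 2x2 design. Stimuli (rows)
# and responses (columns) are both ordered A1B1, A1B2, A2B1, A2B2.
confusions = [
    [70, 10, 15, 5],   # stimulus A1B1
    [12, 68, 6, 14],   # stimulus A1B2
    [16, 4, 72, 8],    # stimulus A2B1
    [5, 13, 11, 71],   # stimulus A2B2
]

def p_respond_A1(row):
    """P(report level A1 on dimension A), marginalizing over the B response."""
    return (row[0] + row[1]) / sum(row)

# Marginal response invariance on dimension A (level A1) requires the same
# probability whether the stimulus was A1B1 or A1B2.
p_given_A1B1 = p_respond_A1(confusions[0])
p_given_A1B2 = p_respond_A1(confusions[1])
print(p_given_A1B1, p_given_A1B2)  # equal in this toy data, so invariance holds
```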
Theory and Low-Order Modeling of Unsteady Airfoil Flows
Ramesh, Kiran
Unsteady flow phenomena are prevalent in a wide range of problems in nature and engineering. These include, but are not limited to, aerodynamics of insect flight, dynamic stall in rotorcraft and wind turbines, leading-edge vortices in delta wings, micro-air vehicle (MAV) design, gust handling and flow control. The most significant characteristics of unsteady flows are rapid changes in the circulation of the airfoil, apparent-mass effects, flow separation and the leading-edge vortex (LEV) phenomenon. Although experimental techniques and computational fluid dynamics (CFD) methods have enabled the detailed study of unsteady flows and their underlying features, a reliable and inexpensive low-order method for fast prediction and for use in control and design is still required. In this research, a low-order methodology based on physical principles rather than empirical fitting is proposed. The objective of such an approach is to enable insights into unsteady phenomena while developing approaches to model them. The basis of the low-order model developed here is unsteady thin-airfoil theory. A time-stepping approach is used to solve for the vorticity on an airfoil camberline, allowing for large amplitudes and nonplanar wakes. On comparing lift coefficients from this method against data from CFD and experiments for some unsteady test cases, it is seen that the method predicts well so long as LEV formation does not occur and flow over the airfoil is attached. The formation of leading-edge vortices (LEVs) in unsteady flows is initiated by flow separation and the formation of a shear layer at the airfoil's leading edge. This phenomenon has been observed to have both detrimental (dynamic stall in helicopters) and beneficial (high-lift flight in insects) effects. To predict the formation of LEVs in unsteady flows, a Leading Edge Suction Parameter (LESP) is proposed. This parameter is calculated from inviscid theory and is a measure of the suction at the airfoil's leading edge. It
Machado, Armando; Pata, Paulo
2005-02-01
Two theories of timing, scalar expectancy theory (SET) and learning-to-time (LeT), make substantially different assumptions about what animals learn in temporal tasks. In a test of these assumptions, pigeons learned two temporal discriminations. On Type 1 trials, they learned to choose a red key after a 1-sec signal and a green key after a 4-sec signal; on Type 2 trials, they learned to choose a blue key after a 4-sec signal and a yellow key after either an 8-sec signal (Group 8) or a 16-sec signal (Group 16). Then, the birds were exposed to signals 1 sec, 4 sec, and 16 sec in length and given a choice between novel key combinations (red or green vs. blue or yellow). The choice between the green key and the blue key was of particular significance because both keys were associated with the same 4-sec signal. Whereas SET predicted no effect of the test signal duration on choice, LeT predicted that preference for green would increase monotonically with the length of the signal but would do so faster for Group 8 than for Group 16. The results were consistent with LeT, but not with SET.
Testing the entrepreneurial intention model on a two-country sample
Liñán, Francisco
2006-01-01
This paper tests the Entrepreneurial Intention Model, which is adapted from the Theory of Planned Behavior, on a sample of 533 individuals from two quite different countries: one of them European (Spain) and the other South Asian (Taiwan). A newly developed Entrepreneurial Intention Questionnaire (EIQ) has been used, which tries to overcome some of the limitations of previous instruments. Structural equations techniques were used in the empirical analysis. Results are generally satisfactory, ...
Fundamental constants and tests of theory in Rydberg states of hydrogenlike ions.
Jentschura, Ulrich D; Mohr, Peter J; Tan, Joseph N; Wundt, Benedikt J
2008-04-25
A comparison of precision frequency measurements to quantum electrodynamics (QED) predictions for Rydberg states of hydrogenlike ions can yield information on values of fundamental constants and test theory. With the results of a calculation of a key QED contribution reported here, the uncertainty in the theory of the energy levels is reduced to a level where such a comparison can yield an improved value of the Rydberg constant.
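For orientation, the leading-order (nonrelativistic, infinite-nuclear-mass) hydrogenlike energy levels scale as -Z²/n²; the QED and recoil corrections discussed in the paper enter far below this term. A minimal sketch (the n = 27 → 28 transition is an illustrative choice, not one taken from the paper):

```python
# Leading-order hydrogenlike energies: E_n = -Z^2 * Ry / n^2.
RYDBERG_EV = 13.605693122994  # CODATA value of the Rydberg energy in eV

def energy_level(Z, n):
    """Nonrelativistic, infinite-nuclear-mass energy of level n for charge Z."""
    return -Z**2 * RYDBERG_EV / n**2

# Transition energy between two Rydberg states of hydrogen (Z = 1):
dE = energy_level(1, 28) - energy_level(1, 27)
print(f"{dE:.6e} eV")  # milli-eV scale, i.e. microwave/far-infrared frequencies
```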
Testing the tenets of minority stress theory in workplace contexts.
Velez, Brandon L; Moradi, Bonnie; Brewster, Melanie E
2013-10-01
The links of minority stressors (workplace discrimination, expectations of stigma, internalized heterosexism, and identity management strategies) with psychological distress and job satisfaction were examined in a sample of 326 sexual minority employees. Drawing from minority stress theory and the literature on the vocational experiences of sexual minority people, patterns of mediation and moderation were tested. Minority stressors were associated with greater distress and lower job satisfaction. A mediation model was supported in which the links of discrimination and internalized heterosexism with psychological distress were mediated by a concealment-focused identity management strategy (i.e., avoiding), and the links of discrimination, expectations of stigma, and internalized heterosexism with job satisfaction were mediated by a disclosure-focused identity management strategy (i.e., integrating). Tests of moderation indicated that for sexual minority women (but not men), the positive association of discrimination with distress was stronger at higher levels of internalized heterosexism than at lower levels. In addition, lower levels of internalized heterosexism and concealment strategies (i.e., counterfeiting and avoiding) and higher levels of a disclosure strategy (i.e., integrating) were associated with higher job satisfaction in the context of low discrimination, but this buffering effect disappeared as level of discrimination increased. The implications of these findings for minority stress research are discussed, and clinical recommendations are made.
f(R) gravity and chameleon theories
International Nuclear Information System (INIS)
Brax, Philippe; Bruck, Carsten van de; Davis, Anne-Christine; Shaw, Douglas J.
2008-01-01
We analyze f(R) modifications of Einstein's gravity as dark energy models in the light of their connection with chameleon theories. Formulated as scalar-tensor theories, the f(R) theories imply the existence of a strong coupling of the scalar field to matter. This would violate all experimental gravitational tests on deviations from Newton's law. Fortunately, the existence of a matter dependent mass and a thin-shell effect allows one to alleviate these constraints. The thin-shell condition also implies strong restrictions on the cosmological dynamics of the f(R) theories. As a consequence, we find that the equation of state of dark energy is constrained to be extremely close to -1 in the recent past. We also examine the potential effects of f(R) theories in the context of the Eöt-Wash experiments. We show that the requirement of a thin shell for the test bodies is not enough to guarantee a null result on deviations from Newton's law. As long as dark energy accounts for a sizeable fraction of the total energy density of the Universe, the constraints that we deduce also forbid any measurable deviation of the dark energy equation of state from -1. All in all, we find that both cosmological and laboratory tests imply that f(R) models are almost coincident with a ΛCDM model at the background level.
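For reference, f(R) models replace the Einstein-Hilbert Lagrangian by a function of the Ricci scalar; the action can be written as

```latex
S \;=\; \frac{1}{16\pi G}\int d^4x\,\sqrt{-g}\,f(R) \;+\; S_m\!\left[\psi_i,\, g_{\mu\nu}\right]
```

with general relativity (plus a cosmological constant) recovered for f(R) = R - 2Λ. Rewriting the extra scalar degree of freedom as a field coupled to matter is what connects these models to the chameleon analysis above.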
Testing the Market Model – A Case Study of Fondul Proprietatea (FP)
Sorin Claudiu Radu
2014-01-01
The financial theory underlying portfolio analysis was developed by Harry Markowitz, an authentic pioneer of modern portfolio theory, and his interpretation of the portfolio selection model may be found in his research papers "Portfolio Selection" (Markowitz, 1952) and "Portfolio Selection: Efficient Diversification of Investments" (Markowitz, 1960). This paper tests the market model on the Romanian stock market for the case of the Property Fund (FP).
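The market model being tested regresses an asset's return on the market return, R_i = α_i + β_i·R_m + ε_i, with β estimated by ordinary least squares. A minimal sketch on hypothetical return series (the numbers are illustrative only, not Fondul Proprietatea data):

```python
from statistics import mean

# Hypothetical daily returns (%) for the fund and the market index.
fund   = [0.5, -0.2, 1.1, 0.3, -0.7, 0.9]
market = [0.4, -0.3, 0.8, 0.2, -0.5, 0.7]

# Market model: R_fund = alpha + beta * R_market + error.
m_bar, f_bar = mean(market), mean(fund)
beta = (sum((m - m_bar) * (f - f_bar) for m, f in zip(market, fund))
        / sum((m - m_bar) ** 2 for m in market))  # OLS slope = cov / var
alpha = f_bar - beta * m_bar
print(f"beta = {beta:.3f}, alpha = {alpha:.3f}")
```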
Superfield theory and supermatrix model
International Nuclear Information System (INIS)
Park, Jeong-Hyuck
2003-01-01
We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two folds, through the star product formalism and in terms of the supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine tuned commutative limit where the duality can be still maintained. Namely on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of the wave particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively to be consistent with the T-duality. (author)
International Nuclear Information System (INIS)
Randjbar-Daemi, S.
1987-01-01
The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and dilaton field equations. However, for purely bosonic strings, new terms involving the modular parameter of the world sheet are induced by quantum effects; these can be absorbed into a redefinition of the background fields. Some aspects of several regularization schemes, such as dimensional, Pauli-Villars, and the proper-time cutoff, are also discussed in an appendix.
Classical testing particles and (4 + N)-dimensional theories of space-time
International Nuclear Information System (INIS)
Nieto-Garcia, J.A.
1986-01-01
The Lagrangian theory of a classical relativistic spinning test particle (top) developed by Hanson and Regge and by Hojman is briefly reviewed. Special attention is devoted to the constraints imposed on the dynamical variables associated with the system of this theory. The equations for a relativistic top are formulated in a way suitable for use in the study of geometrical properties of the 4 + N-dimensional Kaluza-Klein background. It is shown that the equations of motion of a top in five dimensions reduce to the Hanson-Regge generalization of the Bargmann-Michel-Telegdi equations of motion in four dimensions when suitable conditions on the spin tensor are imposed. The classical bosonic relativistic string theory is discussed and the connection of this theory with the top theory is examined. It is found that the relation between the string and the top leads naturally to the consideration of a 3-dimensional extended system (called terron) which sweeps out a 4-dimensional surface as it evolves in a space-time. By using a square root procedure based on ideas by Teitelboim a theory of a supersymmetric top is developed. The quantization of the new supersymmetric system is discussed. Conclusions and suggestions for further research are given
Quantum field theory and the standard model
Schwartz, Matthew D
2014-01-01
Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...
Kunicki, Zachary J; Schick, Melissa R; Spillane, Nichea S; Harlow, Lisa L
2018-06-01
Those who binge drink are at increased risk for alcohol-related consequences when compared to non-binge drinkers. Research shows individuals may face barriers to reducing their drinking behavior, but few measures exist to assess these barriers. This study created and validated the Barriers to Alcohol Reduction (BAR) scale. Participants were college students (n = 230) who endorsed at least one instance of past-month binge drinking (4+ drinks for women or 5+ drinks for men). Using classical test theory, exploratory structural equation modeling found a two-factor structure of personal/psychosocial barriers and perceived program barriers. The sub-factors and full scale had reasonable internal consistency (i.e., coefficient omega = 0.78 (personal/psychosocial), 0.82 (program barriers), and 0.83 (full measure)). The BAR also showed evidence for convergent validity with the Brief Young Adult Alcohol Consequences Questionnaire (r = 0.39, p < .001). Item Response Theory (IRT) analysis showed the two factors separately met the unidimensionality assumption, and provided further evidence for severity of the items on the two factors. Results suggest that the BAR measure appears reliable and valid for use in an undergraduate student population of binge drinkers. Future studies may want to re-examine this measure in a more diverse sample.
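The study reports coefficient omega, which weights items by their factor loadings; the closely related Cronbach's alpha needs only item and total-score variances and is easy to sketch on hypothetical item responses:

```python
from statistics import pvariance

# Hypothetical item responses (rows = respondents, columns = scale items).
scores = [
    [3, 4, 3, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
    [3, 3, 4, 4],
]

k = len(scores[0])
item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
total_var = pvariance([sum(row) for row in scores])

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance).
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"alpha = {alpha:.2f}")
```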
Evaluation of Northwest University, Kano Post-UTME Test Items Using Item Response Theory
Bichi, Ado Abdu; Hafiz, Hadiza; Bello, Samira Abdullahi
2016-01-01
High-stakes testing is used to provide results that have important consequences. Validity is the cornerstone upon which all measurement systems are built. This study applied item response theory principles to analyse Northwest University Kano Post-UTME Economics test items. The fifty (50) developed economics test items were…
Diagrammatic group theory in quark models
International Nuclear Information System (INIS)
Canning, G.P.
1977-05-01
A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.) [de
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
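The sensitivity to initial conditions that the article borrows from chaos theory is easiest to see in the logistic map, a standard one-line chaotic system (the starting values here are arbitrary):

```python
# The logistic map x -> r*x*(1 - x) is a minimal chaotic system: at r = 4,
# trajectories from nearly identical starting states diverge quickly, the
# "sensitive dependence" premise the article applies to public perception.
def trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.2000)
b = trajectory(0.2001)  # perturbed starting point
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.4f}, largest later gap: {max(gaps):.4f}")
```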
Discrete state moduli of string theory from c=1 matrix model
Dhar, A; Wadia, S R; Dhar, Avinash; Mandal, Gautam; Wadia, Spenta R
1995-01-01
We propose a new formulation of the space-time interpretation of the c=1 matrix model. Our formulation uses the well-known leg-pole factor that relates the matrix model amplitudes to those of the 2-dimensional string theory, but includes fluctuations around the Fermi vacuum on both sides of the inverted harmonic oscillator potential of the double-scaled model, even when the fluctuations are small and confined entirely within the asymptotes in the phase plane. We argue that including fluctuations on both sides of the potential is essential for a consistent interpretation of the leg-pole transformed theory as a theory of space-time gravity. We reproduce the known results for the string theory tree level scattering amplitudes for flat space and linear dilaton background as a special case. We show that the generic case corresponds to more general space-time backgrounds. In particular, we identify the parameter corresponding to background metric perturbation in string theory (black hole mass) in terms of the ...
Nonperturbative type IIB model building in the F-theory framework
Energy Technology Data Exchange (ETDEWEB)
Jurke, Benjamin Helmut Friedrich
2011-02-28
-realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)
Cirafici, M.; Sinkovics, A.; Szabo, R.J.
2009-01-01
We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques
Recent experiments testing an opponent-process theory of acquired motivation.
Solomon, R L
1980-01-01
There are acquired motives of the addiction type which seem to be non-associative in nature. They all seem to involve affective phenomena caused by reinforcers, unconditioned stimuli or innate releasers. When such stimuli are repeatedly presented, at least three affective phenomena occur: (1) affective contrast effects, (2) affective habituation (tolerance), and (3) affective withdrawal syndromes. These phenomena can be precipitated either by pleasant or unpleasant events (positive or negative reinforcers). Whenever we see these three phenomena, we also see the development of an addictive cycle, a new motivational system. These phenomena are explained by an opponent-process theory of motivation which holds that there are affect control systems which oppose large departures from affective equilibrium. The control systems are strengthened by use and weakened by disuse. Current observations and experiments testing the theory are described for: (1) the growth of social attachment (imprinting) in ducklings; and (2) the growth of adjunctive behaviors. The findings so far support the theory.
The monster sporadic group and a theory underlying superstring models
International Nuclear Information System (INIS)
Chapline, G.
1996-09-01
The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13-dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton, this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs
Effective potential in Lorentz-breaking field theory models
Energy Technology Data Exchange (ETDEWEB)
Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)
2017-12-15
We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)
Effective potential in Lorentz-breaking field theory models
International Nuclear Information System (INIS)
Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.
2017-01-01
We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)
Noncommutative gauge theory and symmetry breaking in matrix models
International Nuclear Information System (INIS)
Grosse, Harald; Steinacker, Harold; Lizzi, Fedele
2010-01-01
We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c × SU(2)_L × U(1)_Q [resp. SU(3)_c × U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.
Off-critical statistical models: factorized scattering theories and bootstrap program
International Nuclear Information System (INIS)
Mussardo, G.
1992-01-01
We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M_{2,2n+3}, the non-unitary model M_{3,5} and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be exploited by the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called conformal space truncated approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form-factor approach
International Nuclear Information System (INIS)
Callen, J.D.; Dory, R.A.; Aghevli, R.
1977-01-01
The progress during the past year is organized by group efforts and divided into five major areas. The basic tokamak areas and the sections in which their work is summarized are: magnetohydrodynamic (MHD) theory, kinetic theory, and transport simulation. The ELMO Bumpy Torus (EBT) theory work has its own research projects on MHD theory, kinetic theory, and transport simulation. In the plasma engineering area, relevant research work is further developed and synthesized into models that are used in the design of advanced fusion systems: The Next Step (TNS), the demonstration fusion reactor (Demo), the EBT ignition test, etc. Specific plasma engineering projects on providing the TNS physics basis and the development of the EBT reactor study are discussed. The computing support activities during the past year are summarized.
The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.
Loving, Cathleen
The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…
Reinisch, Bianca; Krüger, Dirk
2018-02-01
In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.
The Self-Perception Theory vs. a Dynamic Learning Model
Swank, Otto H.
2006-01-01
Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learning model.
Optimal velocity difference model for a car-following theory
International Nuclear Information System (INIS)
Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.
2011-01-01
In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model (FVDM). The linear stability condition of the new model is obtained by using the linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of negative velocity that occurs at small sensitivity coefficient λ in the FVDM by adjusting the coefficient of the optimal velocity difference, so that collisions disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow are explored. → The starting and braking processes are simulated. → The optimal velocity difference term avoids the disadvantage of negative velocity.
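The car-following dynamics summarized in this abstract can be illustrated numerically. The sketch below integrates the underlying full velocity difference model, dv_n/dt = κ[V(Δx_n) − v_n] + λ(v_{n+1} − v_n), on a ring road with a Bando-type optimal velocity function; the functional form and every parameter value here are illustrative assumptions, not the calibration used in the Letter.

```python
import numpy as np

def optimal_velocity(dx, v_max=2.0, c1=0.13, c2=1.57, l=5.0):
    # Bando-type optimal velocity function (illustrative parameter values)
    return v_max * (np.tanh(c1 * (dx - l) - c2) + np.tanh(c1 * l + c2)) / 2.0

def simulate_fvdm(n_cars=10, road_len=200.0, kappa=0.41, lam=0.5,
                  dt=0.1, steps=2000):
    """Euler integration of the full velocity difference model on a ring:
       dv_n/dt = kappa * (V(dx_n) - v_n) + lam * (v_{n+1} - v_n)."""
    x = np.linspace(0.0, road_len, n_cars, endpoint=False)
    v = np.full(n_cars, 0.5)
    for _ in range(steps):
        dx = np.roll(x, -1) - x          # headway to the car ahead
        dx[-1] += road_len               # wrap around the ring
        dv = np.roll(v, -1) - v          # velocity difference
        a = kappa * (optimal_velocity(dx) - v) + lam * dv
        v = v + a * dt
        x = x + v * dt
    return x, v
```

With identical initial spacings the fleet relaxes to the uniform flow v = V(Δx), which is the base state whose linear stability such models analyze.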
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction: The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods: We review classical test theory and item response theory approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which a hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion: Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.
Bender, Miriam
2018-01-01
Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomenon. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that they do the work of defining the boundary conditions-the domain(s)-of a theory. Further analysis has shown the ways models can be constructed and function independent of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing-and health/care-knowledge and practice. © 2017 John Wiley & Sons Ltd.
Method to determine the optimal constitutive model from spherical indentation tests
Directory of Open Access Journals (Sweden)
Tairui Zhang
2018-03-01
The limitation of current indentation theories was investigated and a method to determine the optimal constitutive model through spherical indentation tests was proposed. Two constitutive models, the Power-law and the Linear-law, were used in Finite Element (FE) calculations, and then a set of indentation governing equations was established for each model. The load-depth data from the normal indentation depth was used to fit the best parameters in each constitutive model, while the data from the further loading part was compared with those from FE calculations, and the model that better predicted the further deformation was considered the optimal one. Moreover, a Young's modulus calculation model which took the previous plastic deformation and the phenomenon of pile-up (or sink-in) into consideration was also proposed to revise the original Sneddon-Pharr-Oliver model. The indentation results on six materials, 304, 321, SA508, SA533, 15CrMoR, and Fv520B, were compared with tensile ones, which validated the reliability of the revised modulus calculation model and the optimal constitutive model determination method in this study. Keywords: Optimal constitutive model, Spherical indentation test, Finite Element calculations, Young's modulus
Lenses on Reading An Introduction to Theories and Models
Tracey, Diane H
2012-01-01
This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition
Testing gravity with EG: mapping theory onto observations
Leonard, C. Danielle; Ferreira, Pedro G.; Heymans, Catherine
2015-12-01
We present a complete derivation of the observationally motivated definition of the modified gravity statistic EG. Using this expression, we investigate how variations to theory and survey parameters may introduce uncertainty in the general relativistic prediction of EG. We forecast errors on EG for measurements using two combinations of upcoming surveys, and find that theoretical uncertainties may dominate for a futuristic measurement. Finally, we compute predictions of EG under modifications to general relativity in the quasistatic regime, and comment on the pros and cons of using EG to test gravity with future surveys.
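For reference, the general relativistic prediction discussed above is commonly written E_G(z) = Ω_m,0/f(z), with the growth rate approximated by f(z) ≈ Ω_m(z)^0.55 in GR. The sketch below evaluates this under standard flat ΛCDM assumptions; the fiducial Ω_m,0 = 0.3 and the γ = 0.55 growth index are illustrative textbook choices, not the survey-specific forecast of the paper.

```python
def omega_m(z, om0=0.3):
    # Matter density parameter in flat LambdaCDM at redshift z
    a3 = (1.0 + z) ** 3
    return om0 * a3 / (om0 * a3 + 1.0 - om0)

def growth_rate(z, om0=0.3, gamma=0.55):
    # Standard approximation f(z) ~ Omega_m(z)^gamma, gamma ~ 0.55 in GR
    return omega_m(z, om0) ** gamma

def eg_gr(z, om0=0.3):
    # General relativistic prediction E_G(z) = Omega_m0 / f(z)
    return om0 / growth_rate(z, om0)
```

At high redshift f → 1 and the prediction tends to Ω_m,0 itself; at low redshift E_G rises above it, which is the scale-independent signature the statistic tests.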
Precision tests and fine tuning in twin Higgs models
Contino, Roberto; Greco, Davide; Mahbubani, Rakhi; Rattazzi, Riccardo; Torre, Riccardo
2017-11-01
We analyze the parametric structure of twin Higgs (TH) theories and assess the gain in fine tuning which they enable compared to extensions of the standard model with colored top partners. Estimates show that, at least in the simplest realizations of the TH idea, the separation between the mass of new colored particles and the electroweak scale is controlled by the coupling strength of the underlying UV theory, and that a parametric gain is achieved only for strongly-coupled dynamics. Motivated by this consideration we focus on one of these simple realizations, namely composite TH theories, and study how well such constructions can reproduce electroweak precision data. The most important effect of the twin states is found to be the infrared contribution to the Higgs quartic coupling, while direct corrections to electroweak observables are subleading and negligible. We perform a careful fit to the electroweak data including the leading-logarithmic corrections to the Higgs quartic up to three loops. Our analysis shows that agreement with electroweak precision tests can be achieved with only a moderate amount of tuning, in the range 5%-10%, in theories where colored states have mass of order 3-5 TeV and are thus out of reach of the LHC. For these levels of tuning, larger masses are excluded by a perturbativity bound, which makes these theories possibly discoverable, hence falsifiable, at a future 100 TeV collider.
Induction of depressed mood: a test of opponent-process theory.
Ranieri, D J; Zeiss, A M
1984-12-01
Solomon's (1980) opponent-process theory of acquired motivation has been used to explain many phenomena in which affective or hedonic contrasts appear to exist, but has not been applied to the induction of depressed mood. The purpose of this study, therefore, was to determine whether opponent-process theory can be applied to this area. Velten's (1968) mood-induction procedure was used and subjects were assigned either to a depression-induction condition or to one of two control groups. Self-report measures of depressed mood were taken before, during, and at several points after the mood induction. Results were not totally consistent with a rigorous set of criteria for supporting an opponent-process interpretation. This suggests that the opponent-process model may not be applicable to induced depressed mood. Possible weaknesses in the experimental design, along with implications for opponent-process theory, are discussed.
Corvid re-caching without 'theory of mind': a model.
van der Vaart, Elske; Verbrugge, Rineke; Hemelrijk, Charlotte K
2012-01-01
Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.
A study of the logical model of capital market complexity theories
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Analyzes the shortcomings of the classic capital market theories based on EMH and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptable dynamic system, with complexity science as the method for researching the operation law of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper anticipates the direction in which complexity science will develop in capital market research.
Collective learning modeling based on the kinetic theory of active particles
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
Non-integrable quantum field theories as perturbations of certain integrable models
International Nuclear Information System (INIS)
Delfino, G.; Simonetti, P.
1996-03-01
We approach the study of non-integrable models of two-dimensional quantum field theory as perturbations of the integrable ones. By exploiting the knowledge of the exact S-matrix and Form Factors of the integrable field theories we obtain the first order corrections to the mass ratios, the vacuum energy density and the S-matrix of the non-integrable theories. As interesting applications of the formalism, we study the scaling region of the Ising model in an external magnetic field at T ∼ T_c and the scaling region around the minimal model M_{2,7}. For these models, a remarkable agreement is observed between the theoretical predictions and the data extracted by a numerical diagonalization of their Hamiltonian. (author). 41 refs, 9 figs, 1 tab
Classical nucleation theory in the phase-field crystal model.
Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas
2018-04-01
A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
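The CNT expressions against which such simulations are compared are compact enough to state directly: for a spherical nucleus with interfacial energy γ and bulk driving force Δg_v per volume, the critical radius is r* = 2γ/Δg_v, the barrier is ΔG* = 16πγ³/(3Δg_v²), and the steady-state rate is J = J₀ exp(−ΔG*/k_BT). A hedged sketch in reduced units follows; the prefactor and all inputs are placeholders, not PFC-derived quantities.

```python
import math

def cnt_barrier(gamma, dg_v):
    """Classical nucleation theory for a spherical nucleus.
    gamma: interfacial energy per unit area; dg_v: bulk free-energy
    gain per unit volume (both positive, consistent reduced units).
    Returns (critical radius r*, barrier height dG*)."""
    r_star = 2.0 * gamma / dg_v
    dG_star = 16.0 * math.pi * gamma ** 3 / (3.0 * dg_v ** 2)
    return r_star, dG_star

def cnt_rate(gamma, dg_v, kT, prefactor=1.0):
    # Steady-state nucleation rate J = J0 * exp(-dG*/kT)
    _, dG_star = cnt_barrier(gamma, dg_v)
    return prefactor * math.exp(-dG_star / kT)
```

Because the rate depends exponentially on ΔG*, small changes in the driving force shift nucleation rates by orders of magnitude, which is why incubation times are a sensitive test of the theory.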
Kassavou, Aikaterini; Turner, Andrew; Hamborg, Thomas; French, David P
2014-07-01
Little is known about the processes and factors that account for maintenance, with several theories existing that have not been subject to many empirical tests. The aim of this study was to test how well theoretical constructs derived from the Health Action Process Approach, Rothman's theory of maintenance, and Verplanken's approach to habitual behavior predicted maintenance of attendance at walking groups. 114 participants, who had already attended walking groups in the community for at least 3 months, completed a questionnaire assessing theoretical constructs regarding maintenance. An objective assessment of attendance over the subsequent 3 months was obtained. Multilevel modeling was used to predict maintenance, controlling for clustering within walking groups. Recovery self-efficacy predicted maintenance, even after accounting for clustering. Satisfaction with social outcomes, satisfaction with health outcomes, and overall satisfaction predicted maintenance, but only satisfaction with health outcomes significantly predicted maintenance after accounting for clustering. Self-reported habitual behavior did not predict maintenance despite mean previous attendance being 20.7 months. Recovery self-efficacy and satisfaction with the health outcomes of walking group attendance appeared to be important for objectively measured maintenance, whereas self-reported habit appeared not to be important for maintenance at walking groups. The findings suggest that there is a need for intervention studies to boost recovery self-efficacy and satisfaction with outcomes of walking group attendance, and to assess the impact on maintenance.
Cohn, Amy M.; Hagman, Brett T.; Graff, Fiona S.; Noel, Nora E.
2011-01-01
Objective: The present study examined the latent continuum of alcohol-related negative consequences among first-year college women using methods from item response theory and classical test theory. Method: Participants (N = 315) were college women in their freshman year who reported consuming any alcohol in the past 90 days and who completed assessments of alcohol consumption and alcohol-related negative consequences using the Rutgers Alcohol Problem Index. Results: Item response theory analyses showed poor model fit for five items identified in the Rutgers Alcohol Problem Index. Two-parameter item response theory logistic models were applied to the remaining 18 items to examine estimates of item difficulty (i.e., severity) and discrimination parameters. The item difficulty parameters ranged from 0.591 to 2.031, and the discrimination parameters ranged from 0.321 to 2.371. Classical test theory analyses indicated that the omission of the five misfit items did not significantly alter the psychometric properties of the construct. Conclusions: Findings suggest that those consequences that had greater severity and discrimination parameters may be used as screening items to identify female problem drinkers at risk for an alcohol use disorder. PMID:22051212
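The two-parameter logistic model referred to above has the standard form P(θ) = 1/(1 + exp[−a(θ − b)]), where b is the item difficulty (severity) and a the discrimination; the item information a²P(1 − P) peaks at θ = b, which is what makes high-severity, high-discrimination items useful as screeners. A short sketch with illustrative parameter values (not the paper's estimates):

```python
import math

def p_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of endorsing an
    item with discrimination a and difficulty/severity b at trait
    level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    # Fisher information of a 2PL item: a^2 * P * (1 - P), maximal at theta = b
    p = p_2pl(theta, a, b)
    return a ** 2 * p * (1.0 - p)
```

At θ = b the endorsement probability is exactly 0.5, and items with larger a concentrate more information near their severity level.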
Malekian, Negin; Habibi, Jafar; Zangooei, Mohammad Hossein; Aghakhani, Hojjat
2016-11-01
There are many cells with various phenotypic behaviors in cancer interacting with each other. For example, an apoptotic cell may induce apoptosis in adjacent cells. A living cell can also protect cells from undergoing apoptosis and necrosis. These survival and death signals are propagated through interaction pathways between adjacent cells called gap junctions. The function of these signals depends on the cellular context of the cell receiving them. For instance, a receiver cell experiencing a low level of oxygen may interpret a received survival signal as an apoptosis signal. In this study, we examine the effect of these signals on tumor growth. We build an evolutionary game theory component in order to model the signal propagation through gap junctions. The game payoffs are defined as a function of cellular context. Then, the game theory component is integrated into an agent-based model of tumor growth. After that, the integrated model is applied to ductal carcinoma in situ, a type of early stage breast cancer. Different scenarios are explored to observe the impact of the gap junction communication and parameters of the game theory component on cancer progression. We compare these scenarios by using the Wilcoxon signed-rank test. The Wilcoxon signed-rank test shows a significant difference between the tumor growth of the model before and after considering the gap junction communication. It also shows that the tumor growth depends significantly on the oxygen threshold at which survival signals turn into apoptosis signals. In this study, the gap junction communication is modeled by using evolutionary game theory to illustrate its role at early stage cancers such as ductal carcinoma in situ. This work indicates that the gap junction communication and the oxygen threshold at which survival signals turn into apoptosis signals can notably affect cancer progression. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
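The Wilcoxon signed-rank test used to compare the paired simulation scenarios above can be sketched from scratch. This simplified version ranks the absolute paired differences, sums the ranks of positive differences, and uses the large-sample normal approximation; it omits the zero- and tie-corrections of production implementations (e.g. scipy.stats.wilcoxon), so it is a didactic sketch rather than a drop-in replacement.

```python
import math

def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank test for paired samples x and y
    (normal approximation; no tie/zero corrections).
    Returns (W+, two-sided p-value)."""
    diffs = [a - b for a, b in zip(x, y) if a != b]  # drop zero differences
    n = len(diffs)
    # rank the absolute differences (average ranks for ties omitted for brevity)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    mean = n * (n + 1) / 4.0
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_plus - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return w_plus, p
```

Applied to paired per-run tumor sizes from two scenarios, a small p-value indicates a systematic shift between the scenarios without assuming normality of the growth data.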
Soliton excitations in a class of nonlinear field theory models
International Nuclear Information System (INIS)
Makhan'kov, V.G.; Fedyanin, V.K.
1985-01-01
Results are described for a class of nonlinear field-theory models defined by a Lagrangian. The class includes both models with a stable vacuum (ε = 1) and models with a condensate (ε = −1, i.e., spontaneously broken symmetry). Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied, and PLS form factors are calculated. A statistical mechanics of solitons is constructed and their dynamic structure factors are calculated.
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Competition for light between phytoplankton species : Experimental tests of mechanistic theory
Huisman, J.; Jonker, R.R.; Zonneveld, C.; Weissing, F.J.
1999-01-01
According to recent competition theory, the population dynamics of phytoplankton species in monoculture can be used to make a priori predictions of the dynamics and outcome of competition for light. The species with lowest "critical light intensity" should be the superior light competitor. To test…
Robust and distributed hypothesis testing
Gül, Gökhan
2017-01-01
This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...
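The sequential probability ratio test mentioned above is easy to state concretely for Bernoulli observations: accumulate the log-likelihood ratio sample by sample and stop when it crosses Wald's thresholds log(β/(1−α)) or log((1−β)/α). The sketch below covers only this simple-vs-simple Bernoulli special case, not the robust formulations the book develops.

```python
import math

def sprt(samples, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for Bernoulli data,
    H0: p = p0 vs H1: p = p1, with target error rates alpha and beta.
    Returns (decision, number of samples consumed)."""
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # per-sample log-likelihood ratio log(P1(x)/P0(x))
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return "H0", n
        if llr >= upper:
            return "H1", n
    return "undecided", len(samples)
```

The appeal of the SPRT is that, for given error rates, it needs far fewer samples on average than any fixed-sample-size test of the same hypotheses.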
Mutual diffusion coefficient models for polymer-solvent systems based on the Chapman-Enskog theory
Directory of Open Access Journals (Sweden)
R. A. Reis
2004-12-01
There are numerous examples of the importance of small-molecule migration in polymeric materials, such as the drying of polymeric packaging, controlled drug delivery, formation of films, and membrane separation. The Chapman-Enskog kinetic theory of hard-sphere fluids with the Weeks-Chandler-Andersen effective hard-sphere diameter (Enskog-WCA) has been the most fruitful in diffusion studies of simple fluids and mixtures. In this work, the ability of the Enskog-WCA model to describe the temperature and concentration dependence of the mutual diffusion coefficient, D, for a polystyrene-toluene system was evaluated. Using experimental diffusion data, two polymer model approaches and three mixing rules for the effective hard-sphere diameter were tested. Some of the procedures tested resulted in models capable of correlating the experimental data well for the system studied for a solvent mass fraction greater than 0.3.
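The first-order Chapman-Enskog result underlying such approaches reads, for a dilute binary hard-sphere gas, D_AB = (3/16)·sqrt(2πk_BT/μ)/(nπσ_AB²), with reduced mass μ and effective collision diameter σ_AB. The sketch below sets the collision integral Ω_D = 1 (pure hard spheres) and illustrates only this dilute-gas limit; it is not the dense-fluid Enskog-WCA correlation evaluated in the paper.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hard_sphere_diffusivity(T, n, sigma_ab, m_a, m_b):
    """First-order Chapman-Enskog binary diffusion coefficient for a
    dilute hard-sphere gas (collision integral Omega_D = 1):
        D_AB = (3/16) * sqrt(2*pi*k_B*T/mu) / (n * pi * sigma_ab**2)
    T: temperature [K]; n: number density [1/m^3];
    sigma_ab: effective hard-sphere collision diameter [m];
    m_a, m_b: molecular masses [kg]."""
    mu = m_a * m_b / (m_a + m_b)  # reduced mass of the colliding pair
    return (3.0 / 16.0) * math.sqrt(2.0 * math.pi * K_B * T / mu) / (
        n * math.pi * sigma_ab ** 2)
```

The formula captures the characteristic D ∝ T^(1/2)/n scaling; effective-diameter prescriptions such as WCA enter through the choice of sigma_ab.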
Directory of Open Access Journals (Sweden)
Ina Schieferdecker
2012-02-01
Full Text Available Security testing aims at validating software system requirements related to security properties such as confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, and for automated test generation. Model-based security testing (MBST) is a relatively new field, dedicated especially to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a research challenge and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey of MBST techniques and the related models, as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.
Testing the renormalisation group theory of cooperative transitions at the lambda point of helium
Lipa, J. A.; Li, Q.; Chui, T. C. P.; Marek, D.
1988-01-01
The status of high resolution tests of the renormalization group theory of cooperative phase transitions performed near the lambda point of helium is described. The prospects for performing improved tests in space are discussed.
Magnetic flux tube models in superstring theory
Russo, Jorge G
1996-01-01
Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the `Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0=\sqrt{2\alpha'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms of the action. For R < R_0 (q > R/2\alpha') there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR=2n, n integer). At the special points qR=2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...
The theory of planned behavior (TPB) has received its fair share of criticism lately, including calls for it to retire. We contributed to improving the theory by testing extensions such as the model of goal-directed behavior (MGDB, which adds desire and anticipated positive and negative emotions) ap...
Dual Processing Model for Medical Decision-Making: An Extension to Diagnostic Testing.
Tsalatsanis, Athanasios; Hozo, Iztok; Kumar, Ambuj; Djulbegovic, Benjamin
2015-01-01
Dual Processing Theories (DPT) assume that human cognition is governed by two distinct types of processes, typically referred to as type 1 (intuitive) and type 2 (deliberative). Based on DPT we have derived a Dual Processing Model (DPM) to describe and explain therapeutic medical decision-making. The DPM indicates that doctors decide to treat when treatment benefits outweigh its harms, which occurs when the probability of the disease is greater than the so-called "threshold probability" at which treatment benefits are equal to treatment harms. Here we extend our work to include a wider class of decision problems that involve diagnostic testing. We illustrate the applicability of the proposed model in a typical clinical scenario considering the management of a patient with prostate cancer. To that end, we calculate and compare two types of decision thresholds: one that adheres to expected utility theory (EUT) and a second derived from the DPM. Our results showed that decisions to administer a diagnostic test could be better explained using the DPM threshold. This is because such decisions depend on objective evidence of test/treatment benefits and harms as well as on type 1 cognition of benefits and harms, which is not considered under EUT. Given that type 1 processes are unique to each decision-maker, the DPM threshold will vary among individuals. We also showed that when type 1 processes exclusively dominate decisions, ordering a diagnostic test does not affect the decision; the decision is based on the assessment of the benefits and harms of treatment. These findings could explain variations in the treatment and diagnostic patterns documented in today's clinical practice.
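The expected-utility baseline against which the DPM threshold is compared can be sketched as follows. This implements only the classical EUT treatment threshold (treat when the probability of disease exceeds harm/(benefit + harm)), not the DPM adjustment for type 1 cognition; the function names and numbers are illustrative, not from the paper:

```python
def treatment_threshold(benefit, harm):
    """Classical expected-utility treatment threshold: the disease
    probability at which expected treatment benefit equals expected
    treatment harm. Treat when p(disease) exceeds this value."""
    return harm / (benefit + harm)

def decide(p_disease, benefit, harm):
    """EUT decision rule: compare disease probability to the threshold."""
    if p_disease > treatment_threshold(benefit, harm):
        return "treat"
    return "withhold"

# Illustrative usage: benefit three times the harm gives a threshold
# of 1/(3+1) = 0.25, so a 30% disease probability warrants treatment.
print(decide(0.30, benefit=3.0, harm=1.0))  # → treat
```

Under the DPM, this threshold would additionally shift with the decision-maker's intuitive (type 1) weighting of benefits and harms, which is why it varies among individuals.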
Mean-field theory and solitonic matter
International Nuclear Information System (INIS)
Cohen, T.D.
1989-01-01
Finite-density solitonic matter is considered in the context of quantum field theory. Mean-field theory, which provides a reasonable description of single-soliton properties, gives rise to a crystalline description. A heuristic description of solitonic matter is given which shows that the low-density limit of solitonic matter (the limit which is presumably relevant for nuclear matter) does not commute with the mean-field theory limit and gives rise to a Fermi-gas description of the system. It is shown, on the basis of a formal expansion of simple soliton models in terms of the coupling constant, why one expects mean-field theory to fail at low densities and why the corrections to mean-field theory are nonperturbative. This heuristic description is tested against an exactly solvable 1+1 dimensional model (the sine-Gordon model) and found to give the correct behavior. The relevance of these results to the program of doing nuclear physics based on soliton models is discussed. (orig.)