Sample records for theory IRT models

  1. Adaptive mastery testing using a multidimensional IRT model and Bayesian sequential decision theory

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Vos, Hendrik J.


    This paper focuses on a version of sequential mastery testing (i.e., classifying students as a master/nonmaster or continuing testing and administering another item or testlet) in which response behavior is modeled by a multidimensional item response theory (IRT) model. First, a general theoretical

  2. MCID determination using IRT models


    Rouquette, Alexandra; Blanchin, Myriam; Sébille, Véronique; Guillemin, Francis; Côté, Sylvana; Falissard, Bruno; Hardouin, Jean-Benoit


    International audience; OBJECTIVES: Determining the minimal clinically important difference (MCID) of questionnaires on an interval scale, the trait level (TL) scale, using item response theory (IRT) models could overcome its association with baseline severity. The aim of this study was to compare the sensitivity (Se), specificity (Sp), and predictive values (PVs) of the MCID determined on the score scale (MCID-Sc) or the TL scale (MCID-TL). STUDY DESIGN AND SETTING: The MCID-Sc and MCID-TL o...

  3. Model Selection Methods for Mixture Dichotomous IRT Models (United States)

    Li, Feiming; Cohen, Allan S.; Kim, Seock-Ho; Cho, Sun-Joo


    This study examines model selection indices for use with dichotomous mixture item response theory (IRT) models. Five indices are considered: Akaike's information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), the pseudo-Bayes factor (PsBF), and posterior predictive model checks (PPMC). The five…

  4. A Multilevel Mixture IRT Model with an Application to DIF (United States)

    Cho, Sun-Joo; Cohen, Allan S.


    Mixture item response theory models have been suggested as a potentially useful methodology for identifying latent groups formed along secondary, possibly nuisance dimensions. In this article, we describe a multilevel mixture item response theory (IRT) model (MMixIRTM) that allows for the possibility that this nuisance dimensionality may function…

  5. Bayesian Estimation of the Logistic Positive Exponent IRT Model (United States)

    Bolfarine, Heleno; Bazan, Jorge Luis


    A Bayesian inference approach using Markov Chain Monte Carlo (MCMC) is developed for the logistic positive exponent (LPE) model proposed by Samejima and for a new skewed Logistic Item Response Theory (IRT) model, named Reflection LPE model. Both models lead to asymmetric item characteristic curves (ICC) and can be appropriate because a symmetric…
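The asymmetry the abstract mentions can be made concrete. Under the common parametrization of Samejima's LPE model (an assumption here; the abstract gives no formulas), the item characteristic curve is a two-parameter logistic curve raised to a positive exponent, and the exponent controls the skew:

```python
import math

def logistic(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def lpe_icc(theta, a, b, xi):
    """Logistic positive exponent ICC: a 2PL curve raised to the power xi.
    xi = 1 recovers the symmetric 2PL; xi != 1 makes the curve asymmetric."""
    return logistic(a * (theta - b)) ** xi

# At theta = b the symmetric 2PL gives 0.5; raising to xi = 2 shifts that point.
print(lpe_icc(0.0, a=1.0, b=0.0, xi=1.0))  # 0.5
print(lpe_icc(0.0, a=1.0, b=0.0, xi=2.0))  # 0.25
```

Because the curve is no longer symmetric about theta = b, difficulty and the point of steepest slope separate, which is what makes the LPE family useful for skewed response processes.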

  6. Nonparametric Polytomous IRT Models for Invariant Item Ordering, with Results for Parametric Models. (United States)

    Sijtsma, Klaas; Hemker, Bas T.


    The absence of the invariant item ordering (IIO) property in two nonparametric polytomous item response theory (IRT) models is discussed, and two nonparametric models are discussed that imply an IIO. Only two parametric polytomous IRT models are found to imply an IIO. A method is proposed to investigate whether an IIO is implied with empirical…

  7. Analysis of Rater Agreement by Rasch and IRT Models

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm


    This chapter first discusses the use of an item response theory (IRT) model for the analysis of agreement between raters in a situation where all raters have supplied dichotomous ratings of the same cases in a sample. Next, the chapter describes two approaches to the quantification of the rater...... be attributed to the considerable rater variation....

  8. Loglinear multidimensional IRT models for polytomously scored items

    NARCIS (Netherlands)

    Kelderman, Henk


    A loglinear item response theory (IRT) model is proposed that relates polytomously scored item responses to a multidimensional latent space. Each item may have a different response function where each item response may be explained by one or more latent traits. Item response functions may follow a

  9. Comparison of Four IRT Models When Analyzing Two Tests for Inductive Reasoning. (United States)

    de Koning, Els; Sijtsma, Klaas; Hamers, Jo H. M.


    Discusses the use of the nonparametric item response theory (IRT) Mokken models of monotone homogeneity and double monotonicity and the parametric Rasch and Verhelst models for the analysis of binary test data. Concludes that the simultaneous use of several IRT models for practical data analysis provides more insight into the structure of tests…

  10. Item Response Theory with Covariates (IRT-C): Assessing item recovery and differential item functioning for the three-parameter logistic model

    NARCIS (Netherlands)

    Tay, L.; Huang, Q.; Vermunt, J.K.


    In large-scale testing, the use of multigroup approaches is limited for assessing differential item functioning (DIF) across multiple variables as DIF is examined for each variable separately. In contrast, the item response theory with covariate (IRT-C) procedure can be used to examine DIF across

  11. Assessing the item response theory with covariate (IRT-C) procedure for ascertaining differential item functioning

    NARCIS (Netherlands)

    Tay, L.; Vermunt, J.K.; Wang, C.


    We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform

  12. Stochastic Ordering Using the Latent Trait and the Sum Score in Polytomous IRT Models. (United States)

    Hemker, Bas T.; Sijtsma, Klaas; Molenaar, Ivo W.; Junker, Brian W.


    Stochastic ordering properties are investigated for a broad class of item response theory (IRT) models for which the monotone likelihood ratio does not hold. A taxonomy is given for nonparametric and parametric models for polytomous models based on the hierarchical relationship between the models. (SLD)

  13. Stochastic EM for estimating the parameters of a multilevel IRT model

    NARCIS (Netherlands)

    Fox, Gerardus J.A.


    An item response theory (IRT) model is used as a measurement error model for the dependent variable of a multilevel model where tests or questionnaires consisting of separate items are used to perform a measurement error analysis. The advantage of using latent scores as dependent variables of a

  14. Polytomous IRT models and monotone likelihood ratio of the total score

    NARCIS (Netherlands)

    Hemker, BT; Sijtsma, Klaas; Molenaar, Ivo W; Junker, BW


    In a broad class of item response theory (IRT) models for dichotomous items the unweighted total score has monotone likelihood ratio (MLR) in the latent trait theta. In this study, it is shown that for polytomous items MLR holds for the partial credit model and a trivial generalization of this
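The MLR property under discussion can be stated compactly (standard notation assumed, with unweighted total score X₊ = Σⱼ Xⱼ):

```latex
% Monotone likelihood ratio of the total score X_+ in the latent trait theta:
% for all theta_1 < theta_2, the ratio
\[
  \frac{P(X_+ = s \mid \theta_2)}{P(X_+ = s \mid \theta_1)}
\]
% is nondecreasing in s. MLR in turn implies stochastic ordering of the
% latent trait by the total score:
\[
  P(\theta > t \mid X_+ = s) \ \text{is nondecreasing in } s \ \text{for every } t .
\]
```

This is why MLR matters in practice: it licenses ordering persons on theta by their observed total scores.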

  15. An EM Approach to Parameter Estimation for the Zinnes and Griggs Paired Comparison IRT Model. (United States)

    Stark, Stephen; Drasgow, Fritz


    Describes item response and information functions for the Zinnes and Griggs paired comparison item response theory (IRT) model (1974) and presents procedures for estimating stimulus and person parameters. Monte Carlo simulations show that at least 400 ratings are required to obtain reasonably accurate estimates of the stimulus parameters and their…

  16. Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model (United States)

    Lamsal, Sunil


    Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include the marginal maximum likelihood estimation, the fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and the Metropolis-Hastings Robbin-Monro estimation. With each…
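For reference, the unidimensional 3PL response function that these estimation methods target can be written in a few lines (the standard formulation, not taken from this abstract):

```python
import math

def p_3pl(theta, a, b, c):
    """Three-parameter logistic model: a guessing floor c plus a scaled 2PL term.
    a = discrimination, b = difficulty, c = pseudo-guessing lower asymptote."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# With c = 0.2, even a very low-ability examinee keeps a 20% success chance.
print(round(p_3pl(-10.0, a=1.2, b=0.0, c=0.2), 3))  # 0.2
print(round(p_3pl(0.0, a=1.2, b=0.0, c=0.2), 3))    # 0.6, i.e. c + (1 - c)/2
```

The lower asymptote c is the source of most of the estimation difficulty the abstract alludes to, since it is weakly identified from data on easy items.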

  17. Extended Rasch Modeling: The eRm Package for the Application of IRT Models in R

    Directory of Open Access Journals (Sweden)

    Patrick Mair


    Item response theory (IRT) models are increasingly becoming established in social science research, particularly in the analysis of performance or attitudinal data in psychology, education, medicine, marketing and other fields where testing is relevant. We propose the R package eRm (extended Rasch modeling) for computing Rasch models and several extensions. A main characteristic of some IRT models, the Rasch model being the most prominent, concerns the separation of two kinds of parameters: one describes qualities of the subject under investigation, and the other relates to qualities of the situation under which the response of a subject is observed. Using conditional maximum likelihood (CML) estimation, both types of parameters may be estimated independently from each other. IRT models are well suited to cope with dichotomous and polytomous responses, where the response categories may be unordered as well as ordered. The incorporation of linear structures allows for modeling the effects of covariates and enables the analysis of repeated categorical measurements. The eRm package fits the following models: the Rasch model, the rating scale model (RSM), and the partial credit model (PCM), as well as linear reparameterizations through covariate structures like the linear logistic test model (LLTM), the linear rating scale model (LRSM), and the linear partial credit model (LPCM). We use a unitary, efficient CML approach to estimate the item parameters and their standard errors. Graphical and numeric tools for assessing goodness-of-fit are provided.
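The parameter separation behind CML rests on conditioning: given a person's total score, the Rasch probability of a response pattern no longer involves the person parameter. A minimal sketch of that fact (in Python rather than R, with a brute-force elementary symmetric function, purely for illustration):

```python
from itertools import combinations
from math import exp, prod

def esf(eps, r):
    """Elementary symmetric function of order r of the item easiness values."""
    return sum(prod(eps[i] for i in idx) for idx in combinations(range(len(eps)), r))

def conditional_pattern_prob(pattern, betas):
    """P(response pattern | total score) under the Rasch model.
    The person parameter cancels, so this depends on the item difficulties
    betas alone -- the basis of CML estimation."""
    eps = [exp(-b) for b in betas]        # easiness = exp(-difficulty)
    r = sum(pattern)
    num = prod(e for e, x in zip(eps, pattern) if x == 1)
    return num / esf(eps, r)

# Two items, one correct: solving the easier item is the likelier pattern.
betas = [-1.0, 1.0]                        # item 1 easier than item 2
print(conditional_pattern_prob([1, 0], betas))  # ≈ 0.881
```

For a real analysis one would of course call eRm's `RM()` in R; the point here is only that the conditional pattern probabilities sum to one within each score group without any person parameter appearing.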

  18. A generalized longitudinal mixture IRT model for measuring differential growth in learning environments. (United States)

    Kadengye, Damazo T; Ceulemans, Eva; Van den Noortgate, Wim


    This article describes a generalized longitudinal mixture item response theory (IRT) model that allows for detecting latent group differences in item response data obtained from electronic learning (e-learning) environments or other learning environments that result in large numbers of items. The described model can be viewed as a combination of a longitudinal Rasch model, a mixture Rasch model, and a random-item IRT model, and it includes some features of the explanatory IRT modeling framework. The model assumes the possible presence of latent classes in item response patterns, due to initial person-level differences before learning takes place, to latent class-specific learning trajectories, or to a combination of both. Moreover, it allows for differential item functioning over the classes. A Bayesian model estimation procedure is described, and the results of a simulation study are presented that indicate that the parameters are recovered well, particularly for conditions with large item sample sizes. The model is also illustrated with an empirical sample data set from a Web-based e-learning environment.

  19. Assessing fit of alternative unidimensional polytomous IRT models using posterior predictive model checking. (United States)

    Li, Tongyun; Xie, Chao; Jiao, Hong


    This article explored the application of the posterior predictive model checking (PPMC) method in assessing fit for unidimensional polytomous item response theory (IRT) models, specifically the divide-by-total models (e.g., the generalized partial credit model). Previous research has primarily focused on using PPMC in model checking for unidimensional and multidimensional IRT models for dichotomous data, and has paid little attention to polytomous models. A Monte Carlo simulation was conducted to investigate the performance of PPMC in detecting different sources of misfit for the partial credit model family. Results showed that the PPMC method, in combination with appropriate discrepancy measures, had adequate power in detecting different sources of misfit for the partial credit model family. Global odds ratio and item total correlation exhibited specific patterns in detecting the absence of the slope parameter, whereas Yen's Q1 was found to be promising in the detection of misfit caused by the constant category intersection parameter constraint across items.
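The PPMC logic itself is simple to sketch end to end: draw parameters, simulate replicated data under the model, and compare a discrepancy measure for replicated versus observed data. Everything below is an illustrative toy (a dichotomous Rasch model, jittered true values standing in for real MCMC draws, and total-score variance as the discrepancy), not the simulation design of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def rasch_prob(theta, b):
    """Rasch success probabilities for persons (rows) by items (columns)."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

# Toy "observed" data set.
n_persons, n_items = 200, 5
true_theta = rng.normal(size=n_persons)
true_b = np.linspace(-1, 1, n_items)
observed = (rng.random((n_persons, n_items)) < rasch_prob(true_theta, true_b)).astype(int)

def discrepancy(data):
    """Variance of the total score -- a simple global discrepancy measure."""
    return data.sum(axis=1).var()

# PPMC loop: one replicated data set per (stand-in) posterior draw.
exceed, n_draws = 0, 100
for _ in range(n_draws):
    theta_d = true_theta + rng.normal(scale=0.2, size=n_persons)  # stand-in draw
    b_d = true_b + rng.normal(scale=0.1, size=n_items)
    rep = (rng.random((n_persons, n_items)) < rasch_prob(theta_d, b_d)).astype(int)
    exceed += discrepancy(rep) >= discrepancy(observed)

ppp = exceed / n_draws  # posterior predictive p-value; values near 0 or 1 flag misfit
print(0.0 <= ppp <= 1.0)  # True
```

The study's contribution lies in which discrepancy measures (global odds ratio, item-total correlation, Yen's Q1) have power against which sources of misfit; the loop structure stays the same.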

  20. A Comparison of Three IRT Approaches to Examinee Ability Change Modeling in a Single-Group Anchor Test Design (United States)

    Paek, Insu; Park, Hyun-Jeong; Cai, Li; Chi, Eunlim


    Typically a longitudinal growth modeling based on item response theory (IRT) requires repeated measures data from a single group with the same test design. If operational or item exposure problems are present, the same test may not be employed to collect data for longitudinal analyses and tests at multiple time points are constructed with unique…

  1. A GPU-Based Gibbs Sampler for a Unidimensional IRT Model. (United States)

    Sheng, Yanyan; Welling, William S; Zhu, Michelle M


    Item response theory (IRT) is a popular approach used for addressing large-scale statistical problems in psychometrics as well as in other fields. The fully Bayesian approach for estimating IRT models is usually memory- and computationally expensive due to the large number of iterations. This limits the use of the procedure in many applications. In an effort to overcome this constraint, previous studies focused on utilizing the message passing interface (MPI) in a distributed-memory Linux cluster to achieve certain speedups. However, given the high data dependencies in a single Markov chain for IRT models, the communication overhead rapidly grows as the number of cluster nodes increases. This makes it difficult to further improve the performance under such a parallel framework. This study aims to tackle the problem using many-core graphics processing units (GPUs), which are practical, cost-effective, and convenient in actual applications. The performance comparisons among serial CPU, MPI, and compute unified device architecture (CUDA) programs demonstrate that the CUDA GPU approach has many advantages over the CPU-based approach and is therefore preferred.
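A serial CPU baseline of the kind of Gibbs sampler being accelerated can be sketched with Albert (1992)-style data augmentation. This is a simplified stand-in (a one-parameter normal ogive model with assumed standard-normal priors, not the paper's exact model): truncated-normal latent responses are drawn given the parameters, then person and item parameters are drawn from their normal full conditionals.

```python
import numpy as np
from scipy.stats import norm, truncnorm

rng = np.random.default_rng(1)

# Simulate data from a 1-parameter normal ogive model: P(y=1) = Phi(theta - b).
N, J = 100, 4
theta_true = rng.normal(size=N)
b_true = np.array([-0.5, 0.0, 0.5, 1.0])
y = (rng.random((N, J)) < norm.cdf(theta_true[:, None] - b_true[None, :])).astype(int)

theta, b = np.zeros(N), np.zeros(J)
for it in range(200):  # short chain, for illustration only
    # 1) Augment: Z ~ N(theta - b, 1), truncated to (0, inf) if y=1, (-inf, 0) if y=0.
    #    truncnorm takes bounds on the standardized scale, i.e. (bound - loc)/scale.
    mu = theta[:, None] - b[None, :]
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    Z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # 2) theta_i | Z, b with a N(0,1) prior: conjugate normal update.
    theta = rng.normal((Z + b[None, :]).sum(axis=1) / (J + 1), np.sqrt(1 / (J + 1)))
    # 3) b_j | Z, theta with a N(0,1) prior: conjugate normal update.
    b = rng.normal((theta[:, None] - Z).sum(axis=0) / (N + 1), np.sqrt(1 / (N + 1)))

print(np.all(np.isfinite(b)))  # True
```

Every step is embarrassingly parallel across persons and items within an iteration, but iterations are strictly sequential; that within-iteration parallelism is exactly what a GPU exploits and what makes multi-node MPI (which must communicate every iteration) pay a growing overhead.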

  2. An IRT model with a parameter-driven process for change

    NARCIS (Netherlands)

    Rijmen, F.; de Boeck, P.; van der Maas, H.L.J.


    An IRT model with a parameter-driven process for change is proposed. Quantitative differences between persons are taken into account by a continuous latent variable, as in common IRT models. In addition, qualitative interindividual differences and autodependencies are accounted for by assuming

  3. A Review of PROC IRT in SAS (United States)

    Choi, Jinnie


    This article reviews PROC IRT, which was added to Statistical Analysis Software in 2014. We provide an introductory overview of a free version of SAS, describe what PROC IRT offers for item response theory (IRT) analysis and how one can use PROC IRT, and discuss how other SAS macros and procedures may complement the IRT functionality of PROC IRT.

  4. Evaluation of the Buss-Perry Aggression Questionnaire with item response theory (IRT)

    Directory of Open Access Journals (Sweden)

    Dinić Bojana


    The aim of this research was to examine the psychometric properties of the Buss-Perry Aggression Questionnaire (AQ) in a Serbian sample, using the IRT model for graded responses. The AQ contains four subscales: Physical aggression, Verbal aggression, Hostility, and Anger. The sample included 1272 participants of both genders, aged 18 to 68 years, with an average age of 31.39 years (SD = 12.63). Results of the IRT analysis suggested that the subscales provided the most information in the range of above-average scores, that is, for participants with higher levels of aggressiveness. The exception was the Hostility subscale, which was informative over a wider range of the trait. On the other hand, this subscale contains two items that violate the assumption of homogeneity. Implications for the measurement of aggressiveness are discussed.
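The graded response model used in this study differences cumulative logistic curves into category probabilities. A minimal sketch under the usual parametrization (an assumption, since the abstract gives no formulas):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: cumulative 'score k or higher' logistic
    curves, differenced into per-category probabilities.
    thresholds must be increasing; a is the item discrimination."""
    cum = [1.0]
    cum += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
    cum += [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

# A 4-category item at average trait level: probabilities over categories.
probs = grm_category_probs(theta=0.0, a=2.0, thresholds=[-1.0, 0.0, 1.0])
print(round(sum(probs), 6))  # 1.0
```

The "information in the range of above-average scores" finding corresponds to threshold parameters sitting mostly above the trait mean, so the cumulative curves are steepest (most informative) at high theta.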

  5. The Motivational Value Systems Questionnaire (MVSQ): Psychometric Analysis Using a Forced Choice Thurstonian IRT Model

    Directory of Open Access Journals (Sweden)

    Josef Merk


    This study presents a new measure of value systems, the Motivational Value Systems Questionnaire (MVSQ), which is based on a theory of value systems by psychologist Clare W. Graves. The purpose of the instrument is to help people identify their personal hierarchies of value systems and thus become more aware of what motivates and demotivates them in work-related contexts. The MVSQ is a forced-choice (FC) measure, making it quicker to complete and more difficult to intentionally distort, but also more difficult to assess psychometrically due to the ipsativity of FC data compared to rating scales. To overcome the limitations of ipsative data, a Thurstonian IRT (TIRT) model was fitted to the questionnaire data, based on a broad sample of N = 1,217 professionals and students. Comparison of normative (IRT) scale scores and ipsative scores suggested that MVSQ IRT scores are largely freed from restrictions due to ipsativity and thus allow interindividual comparison of scale scores. Empirical reliability was estimated using a sample-based simulation approach, which showed acceptable to good estimates and, on average, slightly higher test-retest reliabilities. Further, validation studies provided evidence of both construct validity and criterion-related validity. Scale score correlations and associations of scores with both age and gender were largely in line with theoretically and empirically based expectations, and the results of a multitrait-multimethod analysis support convergent and discriminant construct validity. Criterion validity was assessed by examining the relation of value system preferences to departmental affiliation, which revealed significant relations in line with prior hypothesizing. These findings demonstrate the good psychometric properties of the MVSQ and support its application in the assessment of value systems in work-related contexts.

  6. A MATLAB Package for Markov Chain Monte Carlo with a Multi-Unidimensional IRT Model

    Directory of Open Access Journals (Sweden)

    Yanyan Sheng


    Unidimensional item response theory (IRT) models are useful when each item is designed to measure some facet of a unified latent trait. In practical applications, items are not necessarily measuring the same underlying trait, and hence the more general multi-unidimensional model should be considered. This paper provides the requisite information and description of software that implements the Gibbs sampler for such models with two item parameters and a normal ogive form. The software developed is written in the MATLAB package IRTmu2no. The package is flexible enough to allow a user the choice to simulate binary response data with multiple dimensions, set the number of total or burn-in iterations, specify starting values or prior distributions for model parameters, check convergence of the Markov chain, as well as obtain Bayesian fit statistics. Illustrative examples are provided to demonstrate and validate the use of the software package.

  7. The Utility of IRT in Small-Sample Testing Applications. (United States)

    Sireci, Stephen G.

    The utility of modified item response theory (IRT) models in small sample testing applications was studied. The modified IRT models were modifications of the one- and two-parameter logistic models. One-, two-, and three-parameter models were also studied. Test data were from 4 years of a national certification examination for persons desiring…

  8. A person fit test for IRT models for polytomous items

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Dagohoy, A.V.


    A person fit test based on the Lagrange multiplier test is presented for three item response theory models for polytomous items: the generalized partial credit model, the sequential model, and the graded response model. The test can also be used in the framework of multidimensional ability

  9. IRT Item Parameter Recovery with Marginal Maximum Likelihood Estimation Using Loglinear Smoothing Models (United States)

    Casabianca, Jodi M.; Lewis, Charles


    Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…

  10. An Introduction to Item Response Theory and Rasch Models for Speech-Language Pathologists (United States)

    Baylor, Carolyn; Hula, William; Donovan, Neila J.; Doyle, Patrick J.; Kendall, Diane; Yorkston, Kathryn


    Purpose: To present a primarily conceptual introduction to item response theory (IRT) and Rasch models for speech-language pathologists (SLPs). Method: This tutorial introduces SLPs to basic concepts and terminology related to IRT as well as the most common IRT models. The article then continues with an overview of how instruments are developed…

  11. Re-evaluating a vision-related quality of life questionnaire with item response theory (IRT) and differential item functioning (DIF) analyses.

    NARCIS (Netherlands)

    Nispen, R.M.A. van; Knol, D.L.; Langelaan, M.; Rens, G.H.M.B. van


    Background: For the Low Vision Quality Of Life questionnaire (LVQOL) it is unknown whether the psychometric properties are satisfactory when an item response theory (IRT) perspective is considered. This study evaluates some essential psychometric properties of the LVQOL questionnaire in an IRT

  12. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    NARCIS (Netherlands)

    M.G. de Jong (Martijn); J-B.E.M. Steenkamp (Jan-Benedict)


    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups

  13. Examining the Effectiveness of Test Accommodation Using DIF and a Mixture IRT Model (United States)

    Cho, Hyun-Jeong; Lee, Jaehoon; Kingston, Neal


    This study examined the validity of test accommodation in third-eighth graders using differential item functioning (DIF) and mixture IRT models. Two data sets were used for these analyses. With the first data set (N = 51,591) we examined whether item type (i.e., story, explanation, straightforward) or item features were associated with item…

  14. Detecting Intervention Effects Using a Multilevel Latent Transition Analysis with a Mixture IRT Model (United States)

    Cho, Sun-Joo; Cohen, Allan S.; Bottge, Brian


    A multilevel latent transition analysis (LTA) with a mixture IRT measurement model (MixIRTM) is described for investigating the effectiveness of an intervention. The addition of a MixIRTM to the multilevel LTA permits consideration of both potential heterogeneity in students' response to instructional intervention as well as a methodology for…

  15. Mathematical Ability and Socio-Economic Background: IRT Modeling to Estimate Genotype by Environment Interaction. (United States)

    Schwabe, Inga; Boomsma, Dorret I; van den Berg, Stéphanie M


    Genotype by environment interaction in behavioral traits may be assessed by estimating the proportion of variance that is explained by genetic and environmental influences conditional on a measured moderating variable, such as a known environmental exposure. Behavioral traits of interest are often measured by questionnaires and analyzed as sum scores on the items. However, statistical results on genotype by environment interaction based on sum scores can be biased due to the properties of a scale. This article presents a method that makes it possible to analyze the actually observed (phenotypic) item data rather than a sum score by simultaneously estimating the genetic model and an item response theory (IRT) model. In the proposed model, the estimation of genotype by environment interaction is based on an alternative parametrization that is uniquely identified and therefore to be preferred over standard parametrizations. A simulation study shows good performance of our method compared to analyzing sum scores in terms of bias. Next, we analyzed data of 2,110 12-year-old Dutch twin pairs on mathematical ability. Genetic models were evaluated and genetic and environmental variance components estimated as a function of a family's socio-economic status (SES). Results suggested that common environmental influences are less important in creating individual differences in mathematical ability in families with a high SES than in creating individual differences in mathematical ability in twin pairs with a low or average SES.

  16. MCMC estimation and some fit analysis of multidimensional IRT models

    NARCIS (Netherlands)

    Beguin, Anton; Glas, Cornelis A.W.


    A Bayesian procedure to estimate the three-parameter normal ogive model and a generalization of the procedure to a model with multidimensional ability parameters are presented. The procedure is a generalization of a procedure by Albert (1992) for estimating the two-parameter normal ogive model. The

  17. An introduction to item response theory and Rasch models for speech-language pathologists. (United States)

    Baylor, Carolyn; Hula, William; Donovan, Neila J; Doyle, Patrick J; Kendall, Diane; Yorkston, Kathryn


    To present a primarily conceptual introduction to item response theory (IRT) and Rasch models for speech-language pathologists (SLPs). This tutorial introduces SLPs to basic concepts and terminology related to IRT as well as the most common IRT models. The article then continues with an overview of how instruments are developed using IRT and some basic principles of adaptive testing. IRT is a set of statistical methods that are increasingly used for developing instruments in speech-language pathology. While IRT is not new, its application in speech-language pathology to date has been relatively limited in scope. Several new IRT-based instruments are currently emerging. IRT differs from traditional methods for test development, typically referred to as classical test theory (CTT), in several theoretical and practical ways. Administration, scoring, and interpretation of IRT instruments are different from methods used for most traditional CTT instruments. SLPs will need to understand the basic concepts of IRT instruments to use these tools in their clinical and research work. This article provides an introduction to IRT concepts drawing on examples from speech-language pathology.

  18. Evaluation of the Hospital Anxiety and Depression Scale (HADS) in screening stroke patients for symptoms: Item Response Theory (IRT) analysis. (United States)

    Ayis, Salma A; Ayerbe, Luis; Ashworth, Mark; DA Wolfe, Charles


    Variations have been reported in the number of underlying constructs and the choice of thresholds that determine caseness of anxiety and/or depression using the Hospital Anxiety and Depression Scale (HADS). This study examined the properties of each item of the HADS as perceived by stroke patients, and assessed the information these items convey about anxiety and depression between 3 months and 5 years after stroke. The study included 1443 stroke patients from the South London Stroke Register (SLSR). The dimensionality of the HADS was examined using factor analysis methods, and item properties up to 5 years after stroke were tested using Item Response Theory (IRT) methods, including graded response models (GRMs). The presence of two dimensions of the HADS (anxiety and depression) for stroke patients was confirmed. Items that allowed accurate inference about the severity of anxiety and depression, and offered good discrimination of caseness, were identified as "I can laugh and see the funny side of things" (Q4) and "I get sudden feelings of panic" (Q13), with discriminations of 2.44 (se = 0.26) and 3.34 (se = 0.35), respectively. Items that shared properties, and hence yielded redundant information, were: "I get a sort of frightened feeling as if something awful is about to happen" (Q3), "I get a sort of frightened feeling like butterflies in my stomach" (Q6), and "Worrying thoughts go through my mind" (Q9). Item properties were maintained over time. Approximately 20% of patients were lost to follow-up. A more concise selection of items based on their properties would provide a precise approach for screening patients and for an optimal allocation of patients into clinical trials.

  19. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    ... when using BILOG-MG V3.0. Five items fitted 2-parameter models in IRTPRO. It was recommended that the use of more than one IRT software programme offers more useful information for the choice of model that fit the data. KEYWORDS: Item Level, Diagnostics, Statistics, Model - Data Fit, Item Response Theory (IRT).

  20. Re-evaluating a vision-related quality of life questionnaire with item response theory (IRT) and differential item functioning (DIF) analyses. (United States)

    van Nispen, Ruth M A; Knol, Dirk L; Langelaan, Maaike; van Rens, Ger H M B


    For the Low Vision Quality Of Life questionnaire (LVQOL) it is unknown whether the psychometric properties are satisfactory when an item response theory (IRT) perspective is considered. This study evaluates some essential psychometric properties of the LVQOL questionnaire in an IRT model, and investigates differential item functioning (DIF). Cross-sectional data were used from an observational study among visually-impaired patients (n = 296). Calibration was performed for every dimension of the LVQOL in the graded response model. Item goodness-of-fit was assessed with the S-X(2)-test. DIF was assessed on relevant background variables (i.e. age, gender, visual acuity, eye condition, rehabilitation type and administration type) with likelihood-ratio tests for DIF. The magnitude of DIF was interpreted by assessing the largest difference in expected scores between subgroups. Measurement precision was assessed by presenting test information curves; reliability with the index of subject separation. All items of the LVQOL dimensions fitted the model. There was significant DIF on several items. For two items the maximum difference between expected scores exceeded one point, and DIF was found on multiple relevant background variables. Item 1 'Vision in general' from the "Adjustment" dimension and item 24 'Using tools' from the "Reading and fine work" dimension were removed. Test information was highest for the "Reading and fine work" dimension. Indices for subject separation ranged from 0.83 to 0.94. The items of the LVQOL showed satisfactory item fit to the graded response model; however, two items were removed because of DIF. The adapted LVQOL with 21 items is DIF-free and therefore seems highly appropriate for use in heterogeneous populations of visually impaired patients.

  1. Re-evaluating a vision-related quality of life questionnaire with item response theory (IRT) and differential item functioning (DIF) analyses

    Directory of Open Access Journals (Sweden)

    Knol Dirk L


    Background: For the Low Vision Quality Of Life questionnaire (LVQOL) it is unknown whether the psychometric properties are satisfactory when an item response theory (IRT) perspective is considered. This study evaluates some essential psychometric properties of the LVQOL questionnaire in an IRT model, and investigates differential item functioning (DIF). Methods: Cross-sectional data were used from an observational study among visually-impaired patients (n = 296). Calibration was performed for every dimension of the LVQOL in the graded response model. Item goodness-of-fit was assessed with the S-X2 test. DIF was assessed on relevant background variables (i.e. age, gender, visual acuity, eye condition, rehabilitation type and administration type) with likelihood-ratio tests for DIF. The magnitude of DIF was interpreted by assessing the largest difference in expected scores between subgroups. Measurement precision was assessed by presenting test information curves; reliability with the index of subject separation. Results: All items of the LVQOL dimensions fitted the model. There was significant DIF on several items. For two items the maximum difference between expected scores exceeded one point, and DIF was found on multiple relevant background variables. Item 1 'Vision in general' from the "Adjustment" dimension and item 24 'Using tools' from the "Reading and fine work" dimension were removed. Test information was highest for the "Reading and fine work" dimension. Indices for subject separation ranged from 0.83 to 0.94. Conclusions: The items of the LVQOL showed satisfactory item fit to the graded response model; however, two items were removed because of DIF. The adapted LVQOL with 21 items is DIF-free and therefore seems highly appropriate for use in heterogeneous populations of visually impaired patients.

  2. Pretest-Posttest-Posttest Multilevel IRT Modeling of Competence Growth of Students in Higher Education in Germany

    NARCIS (Netherlands)

    Schmidt, Susanne; Zlatkin-Troitschanskaia, Olga; Fox, Gerardus J.A.


    Longitudinal research in higher education faces several challenges. Appropriate methods of analyzing competence growth of students are needed to deal with those challenges and thereby obtain valid results. In this article, a pretest-posttest-posttest multivariate multilevel IRT model for repeated

  3. An empirical comparison of Item Response Theory and Classical Test Theory

    Directory of Open Access Journals (Sweden)

    Špela Progar


    Full Text Available Based on nonlinear models between the measured latent variable and the item response, item response theory (IRT) enables independent estimation of item and person parameters and local estimation of measurement error. These properties of IRT are also the main theoretical advantages of IRT over classical test theory (CTT). Empirical evidence, however, often failed to discover consistent differences between IRT and CTT parameters and between invariance measures of CTT and IRT parameter estimates. In this empirical study a real data set from the Third International Mathematics and Science Study (TIMSS 1995) was used to address the following questions: (1) How comparable are CTT and IRT based item and person parameters? (2) How invariant are CTT and IRT based item parameters across different participant groups? (3) How invariant are CTT and IRT based item and person parameters across different item sets? The findings indicate that the CTT and the IRT item/person parameters are very comparable, that the CTT and the IRT item parameters show similar invariance properties when estimated across different groups of participants, that the IRT person parameters are more invariant across different item sets, and that the CTT item parameters are at least as invariant across different item sets as the IRT item parameters. The results furthermore demonstrate that, with regard to the invariance property, IRT item/person parameters are in general empirically superior to CTT parameters, but only if the appropriate IRT model is used for modelling the data.

  4. Moderated Multiple Regression, Spurious Interaction Effects, and IRT (United States)

    Kang, Sun-Mee; Waller, Niels G.


    Two Monte Carlo studies were conducted to explore the Type I error rates in moderated multiple regression (MMR) of observed scores and estimated latent trait scores from a two-parameter logistic item response theory (IRT) model. The results of both studies showed that MMR Type I error rates were substantially higher than the nominal alpha levels…
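    The two-parameter logistic model referred to here gives the probability of a correct response as a logistic function of the latent trait. A minimal simulation sketch with illustrative parameter values (not the study's actual Monte Carlo design):

    ```python
    import numpy as np

    def p_correct_2pl(theta, a, b):
        """Probability of a correct response under the two-parameter
        logistic (2PL) model: discrimination a, difficulty b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    rng = np.random.default_rng(0)
    theta = rng.normal(size=1000)                  # simulated latent traits
    responses = rng.random(1000) < p_correct_2pl(theta, a=1.2, b=0.0)
    print(responses.mean())  # close to 0.5 when b = 0 and theta ~ N(0, 1)
    ```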

  5. IRT-Estimated Reliability for Tests Containing Mixed Item Formats (United States)

    Shu, Lianghua; Schwarz, Richard D.


    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's a, Feldt-Raju, stratified a, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…

  6. Investigating the Impact of Item Parameter Drift for Item Response Theory Models with Mixture Distributions

    Directory of Open Access Journals (Sweden)

    Yoon Soo Park


    Full Text Available This study investigates the impact of item parameter drift (IPD) on parameter and ability estimation when the underlying measurement model fits a mixture distribution, thereby violating the item invariance property of unidimensional item response theory (IRT) models. An empirical study was conducted to demonstrate the occurrence of both IPD and an underlying mixture distribution using real-world data. Twenty-one trended anchor items from the 1999, 2003, and 2007 administrations of the Trends in International Mathematics and Science Study (TIMSS) were analyzed using unidimensional and mixture IRT models. TIMSS treats trended anchor items as invariant over testing administrations and uses pre-calibrated item parameters based on unidimensional IRT. However, empirical results showed evidence of two latent subgroups with IPD. Results showed changes in the distribution of examinee ability between latent classes over the three administrations. A simulation study was conducted to examine the impact of IPD on the estimation of ability and item parameters when data have underlying mixture distributions. Simulations used data generated from a mixture IRT model and estimated using unidimensional IRT. Results showed that data reflecting IPD under a mixture IRT model led to IPD in the unidimensional IRT model. Changes in the distribution of examinee ability also affected item parameters. Moreover, drift with respect to item discrimination and the distribution of examinee ability affected estimates of examinee ability. These findings demonstrate the need for caution and for evaluating IPD within a mixture IRT framework to understand its effect on item parameters and examinee ability.
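    Response data with the kind of mixture structure described, where one latent class shows item parameter drift, can be sketched as follows. The class proportion, drift magnitude, and item parameters below are invented for illustration and are not taken from the TIMSS analysis:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n, k = 2000, 10
    b = rng.normal(size=k)                    # baseline item difficulties
    b_class = np.stack([b, b.copy()])
    b_class[1, 0] += 0.8                      # hypothetical drift on item 0 in class 2

    cls = rng.random(n) < 0.4                 # 40% of examinees in class 2
    theta = rng.normal(loc=np.where(cls, -0.5, 0.0))   # class-specific ability means
    # Rasch-type response probabilities with class-specific difficulties
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b_class[cls.astype(int)])))
    data = (rng.random((n, k)) < p).astype(int)
    print(data.shape)
    ```

    Fitting a unidimensional IRT model to `data` while ignoring `cls` reproduces the confounding the abstract warns about.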

  7. Estimation of a Ramsay-Curve Item Response Theory Model by the Metropolis-Hastings Robbins-Monro Algorithm (United States)

    Monroe, Scott; Cai, Li


    In Ramsay curve item response theory (RC-IRT) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's EM algorithm, which yields maximum marginal likelihood estimates. This method, however, does not produce the…

  8. Model theory

    CERN Document Server

    Chang, CC


    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  9. Knowledge of Solution Strategies and IRT Modeling of Items for Transitive Reasoning. (United States)

    Sijtsma, Klaas; Verweij, Anton C.


    Presents componential item response theory as a model-oriented approach to studying processes and strategies underlying the incorrect/correct responses to cognitive test tasks. Results from 417 elementary school students show that combining knowledge of solution strategies with Item Response Theory modeling produced a useful unidimensional scale…

  10. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models (United States)

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa


    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  11. IRTPRO 2.1 for Windows (Item Response Theory for Patient-Reported Outcomes) (United States)

    Paek, Insu; Han, Kyung T.


    This article reviews a new item response theory (IRT) model estimation program, IRTPRO 2.1, for Windows that is capable of unidimensional and multidimensional IRT model estimation for existing and user-specified constrained IRT models for dichotomously and polytomously scored item response data. (Contains 1 figure and 2 notes.)

  12. Latent Variable Modelling and Item Response Theory Analyses in Marketing Research

    Directory of Open Access Journals (Sweden)

    Brzezińska Justyna


    Full Text Available Item Response Theory (IRT) is a modern statistical method using latent variables designed to model the interaction between a subject's ability and the item-level stimuli (difficulty, guessing). Item responses are treated as the outcome (dependent) variables, and the examinee's ability and the items' characteristics are the latent predictor (independent) variables. IRT models the relationship between a respondent's trait (ability, attitude) and the pattern of item responses. Thus, the estimates of individual latent traits can differ even for two individuals with the same total score. IRT scores can yield additional benefits, and these are discussed in detail. In this paper, the theory and an application using R packages designed for IRT modelling are presented.
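    The point that two examinees with the same total score can receive different trait estimates follows from weighting responses by item characteristics. A toy grid-search maximum-likelihood sketch, with assumed 2PL item parameters chosen only for illustration:

    ```python
    import numpy as np

    a = np.array([0.5, 1.0, 2.0])             # assumed discriminations
    b = np.array([-1.0, 0.0, 1.0])            # assumed difficulties

    def theta_mle(x, grid=np.linspace(-4, 4, 8001)):
        """Grid-search maximum-likelihood trait estimate for pattern x."""
        p = 1.0 / (1.0 + np.exp(-a * (grid[:, None] - b)))
        loglik = (x * np.log(p) + (1 - x) * np.log(1 - p)).sum(axis=1)
        return grid[np.argmax(loglik)]

    # Two patterns with the same total score (2 correct) get different estimates,
    # because the second pattern answers the most discriminating item correctly:
    print(theta_mle(np.array([1, 1, 0])), theta_mle(np.array([0, 1, 1])))
    ```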

  13. Model theory

    CERN Document Server

    Hodges, Wilfrid


    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  14. On the Relationships between Jeffreys Modal and Weighted Likelihood Estimation of Ability under Logistic IRT Models (United States)

    Magis, David; Raiche, Gilles


    This paper focuses on two estimators of ability with logistic item response theory models: the Bayesian modal (BM) estimator and the weighted likelihood (WL) estimator. For the BM estimator, Jeffreys' prior distribution is considered, and the corresponding estimator is referred to as the Jeffreys modal (JM) estimator. It is established that under…

  15. A Method for Investigating the Intersection of Item Response Functions in Mokken's Nonparametric IRT Model. (United States)

    Sijtsma, Klaas; Meijer, Rob R.


    A method is proposed for investigating the intersection of item response functions in the nonparametric item-response-theory model of R. J. Mokken (1971). Results from a Monte Carlo study support the proposed use of the transposed data matrix H^T as an extension to Mokken's approach. (SLD)

  16. Estimation of a Ramsay-Curve Item Response Theory Model by the Metropolis-Hastings Robbins-Monro Algorithm. CRESST Report 834 (United States)

    Monroe, Scott; Cai, Li


    In Ramsay curve item response theory (RC-IRT, Woods & Thissen, 2006) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's (1981) EM algorithm, which yields maximum marginal likelihood estimates. This method, however,…

  17. A modular approach for item response theory modeling with the R package flirt. (United States)

    Jeon, Minjeong; Rijmen, Frank


    The new R package flirt is introduced for flexible item response theory (IRT) modeling of psychological, educational, and behavior assessment data. flirt integrates a generalized linear and nonlinear mixed modeling framework with graphical model theory. The graphical model framework allows for efficient maximum likelihood estimation. The key feature of flirt is its modular approach to facilitate convenient and flexible model specifications. Researchers can construct customized IRT models by simply selecting various modeling modules, such as parametric forms, number of dimensions, item and person covariates, person groups, link functions, etc. In this paper, we describe major features of flirt and provide examples to illustrate how flirt works in practice.

  18. Multi-level IRT with measurement error in the predictor variables

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.


    A two-level regression model is imposed on the ability parameters in an item response theory (IRT) model. The advantage of using latent rather than observed scores as dependent variables of a multilevel model is that this offers the possibility of separating the influence of item difficulty and

  19. Nonignorable data in IRT models: Polytomous responses and response propensity models with covariates

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Pimentel, Jonald; Lamers, S.M.A.


    Missing data usually present special problems for statistical analyses, especially when the data are not missing at random, that is, when the ignorability principle defined by Rubin (1976) does not hold. Recently, a substantial number of articles have been published on model-based procedures to

  20. The psychometric properties of the "Reading the Mind in the Eyes" Test: an item response theory (IRT) analysis. (United States)

    Preti, Antonio; Vellante, Marcello; Petretto, Donatella R


    The "Reading the Mind in the Eyes" Test (hereafter: Eyes Test) is considered an advanced task of the Theory of Mind aimed at assessing the performance of the participant in perspective-takingthat is, the ability to sense or understand other people's cognitive and emotional states. In this study, the item response theory analysis was applied to the adult version of the Eyes Test. The Italian version of the Eyes Test was administered to 200 undergraduate students of both genders (males = 46%). Modified parallel analysis (MPA) was used to test unidimensionality. Marginal maximum likelihood estimation was used to fit the 1-, 2-, and 3-parameter logistic (PL) model to the data. Differential Item Functioning (DIF) due to gender was explored with five independent methods. MPA provided evidence in favour of unidimensionality. The Rasch model (1-PL) was superior to the other two models in explaining participants' responses to the Eyes Test. There was no robust evidence of gender-related DIF in the Eyes Test, although some differences may exist for some items as a reflection of real differences by group. The study results support a one-factor model of the Eyes Test. Performance on the Eyes Test is defined by the participant's ability in perspective-taking. Researchers should cease using arbitrarily selected subscores in assessing the performance of participants to the Eyes Test. Lack of gender-related DIF favours the use of the Eyes Test in the investigation of gender differences concerning empathy and social cognition.


    Directory of Open Access Journals (Sweden)

    Alexander K. Volkov


    Full Text Available Modern approaches to assessing the efficiency of aviation security screeners have been analyzed and certain drawbacks identified. The main drawback is the difficulty of implementing ICAO recommendations that require shadow x-ray image complexity factors to be taken into account when preparing for, and evaluating, the detection of prohibited items by aviation security screeners. X-ray image based factors are the specific properties of the x-ray image that influence the screener's ability to detect prohibited items. The most important complexity factors are: geometric characteristics of a prohibited item; view difficulty of prohibited items; superposition of prohibited items by other objects in the bag; bag content complexity; and the color similarity of prohibited and ordinary items in the luggage. A one-dimensional two-parameter IRT model and a related criterion of aviation security screener qualification are suggested. Within the suggested model, the probabilistic detection characteristics of aviation security screeners are treated as functions of parameters such as the difference between the level of qualification and the level of x-ray image complexity, and also between the screeners' responsibility and the structure of their professional knowledge. On the basis of this model two characteristic functions can be considered: first, a characteristic function of qualification level, which describes the x-ray image interpretation competency of the aviation security screener across complexity levels; second, a characteristic function of x-ray image complexity, which describes the range of interpretation competency of screeners with various training levels for an x-ray image of a given complexity. The suggested complex criterion for assessing the level of aviation security screener qualification allows to evaluate his or
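    The detection probability described, driven by the gap between screener qualification and image complexity, can be sketched as a simple logistic (2PL-style) function. The function name and parameter values below are illustrative assumptions, not the authors' calibrated model:

    ```python
    import math

    def p_detect(qualification, complexity, a=1.0):
        """Detection probability as a logistic function of the gap between
        screener qualification and x-ray image complexity (illustrative)."""
        return 1.0 / (1.0 + math.exp(-a * (qualification - complexity)))

    print(p_detect(1.5, 0.5))   # skilled screener, simple image: high probability
    print(p_detect(0.0, 1.5))   # harder image: lower probability
    ```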

  2. Detecting DIF in Polytomous Items Using MACS, IRT and Ordinal Logistic Regression (United States)

    Elosua, Paula; Wells, Craig


    The purpose of the present study was to compare the Type I error rate and power of two model-based procedures, the mean and covariance structure model (MACS) and the item response theory (IRT), and an observed-score based procedure, ordinal logistic regression, for detecting differential item functioning (DIF) in polytomous items. A simulation…

  3. Fitting a Thurstonian IRT model to forced-choice data using Mplus. (United States)

    Brown, Anna; Maydeu-Olivares, Alberto


    To counter response distortions associated with the use of rating scales (a.k.a. Likert scales), items can be presented in a comparative fashion, so that respondents are asked to rank the items within blocks (forced-choice format). However, classical scoring procedures for these forced-choice designs lead to ipsative data, which presents psychometric challenges that are well described in the literature. Recently, Brown and Maydeu-Olivares (Educational and Psychological Measurement 71: 460-502, 2011a) introduced a model based on Thurstone's law of comparative judgment, which overcomes the problems of ipsative data. Here, we provide a step-by-step tutorial for coding forced-choice responses, specifying a Thurstonian item response theory model that is appropriate for the design used, assessing the model's fit, and scoring individuals on psychological attributes. Estimation and scoring is performed using Mplus, and a very straightforward Excel macro is provided that writes full Mplus input files for any forced-choice design. Armed with these tools, using a forced-choice design is now as easy as using ratings.
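    The first step of the tutorial, recoding within-block rankings into binary outcomes, can be shown in outline. The paper itself works in Mplus with an Excel macro; this hypothetical Python sketch codes each pair of block items as 1 when the first item of the pair is ranked higher:

    ```python
    from itertools import combinations

    def code_block(ranking):
        """Recode a within-block ranking into binary pairwise-comparison
        outcomes, Thurstonian IRT style (1 = first item of pair preferred)."""
        position = {item: r for r, item in enumerate(ranking)}  # 0 = top rank
        items = sorted(ranking)
        return {(i, j): int(position[i] < position[j])
                for i, j in combinations(items, 2)}

    # A respondent ranks the block's items B > A > C:
    print(code_block(["B", "A", "C"]))
    # {('A', 'B'): 0, ('A', 'C'): 1, ('B', 'C'): 1}
    ```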

  4. Using IRT Trait Estimates versus Summated Scores in Predicting Outcomes (United States)

    Xu, Ting; Stone, Clement A.


    It has been argued that item response theory trait estimates should be used in analyses rather than number right (NR) or summated scale (SS) scores. Thissen and Orlando postulated that IRT scaling tends to produce trait estimates that are linearly related to the underlying trait being measured. Therefore, IRT trait estimates can be more useful…

  5. Testing Measurement Invariance: A Comparison of Multiple-Group Categorical CFA and IRT (United States)

    Kim, Eun Sook; Yoon, Myeongsun


    This study investigated two major approaches in testing measurement invariance for ordinal measures: multiple-group categorical confirmatory factor analysis (MCCFA) and item response theory (IRT). Unlike the ordinary linear factor analysis, MCCFA can appropriately model the ordered-categorical measures with a threshold structure. A simulation…

  6. A Comparative Study of Test Data Dimensionality Assessment Procedures Under Nonparametric IRT Models (United States)

    van Abswoude, Alexandra A. H.; van der Ark, L. Andries; Sijtsma, Klaas


    In this article, an overview of nonparametric item response theory methods for determining the dimensionality of item response data is provided. Four methods were considered: MSP, DETECT, HCA/CCPROX, and DIMTEST. First, the methods were compared theoretically. Second, a simulation study was done to compare the effectiveness of MSP, DETECT, and…

  7. Reliability estimation for single dichotomous items based on Mokken's IRT model

    NARCIS (Netherlands)

    Meijer, R R; Sijtsma, K; Molenaar, Ivo W


    Item reliability is of special interest for Mokken's nonparametric item response theory, and is useful for the evaluation of item quality in nonparametric test construction research. It is also of interest for nonparametric person-fit analysis. Three methods for the estimation of the reliability of

  8. Reliability estimation for single dichotomous items based on Mokken's IRT model

    NARCIS (Netherlands)

    Meijer, R.R.; Sijtsma, Klaas; Molenaar, Ivo W.


    Item reliability is of special interest for Mokken’s nonparametric item response theory, and is useful for the evaluation of item quality in nonparametric test construction research. It is also of interest for nonparametric person-fit analysis. Three methods for the estimation of the reliability of

  9. An Investigation of Invariance Properties of One, Two and Three Parameter Logistic Item Response Theory Models

    Directory of Open Access Journals (Sweden)

    O.A. Awopeju


    Full Text Available The study investigated the invariance properties of one-, two- and three-parameter logistic item response theory models. It examined the best fit among the one-parameter logistic (1PL), two-parameter logistic (2PL) and three-parameter logistic (3PL) IRT models for the SSCE 2008 in Mathematics. It also investigated the degree of invariance of the IRT model based item difficulty parameter estimates in the SSCE in Mathematics across different samples of examinees, and examined the degree of invariance of the IRT model based item discrimination estimates in the SSCE in Mathematics across different samples of examinees. In order to achieve the set objectives, 6000 students (3000 males and 3000 females) were drawn from the population of 35262 who wrote the 2008 paper 1 Senior Secondary Certificate Examination (SSCE) in Mathematics organized by the National Examination Council (NECO). The item difficulty and item discrimination parameter estimates from CTT and IRT were tested for invariance using BILOG-MG 3, and correlation analysis was performed using SPSS version 20. The research findings were that the two-parameter model's IRT item difficulty and discrimination parameter estimates exhibited the invariance property consistently across different samples, and that the 2-parameter model was suitable for all samples of examinees, unlike the one-parameter and 3-parameter models.

  10. Misplacement of Concepts and Administrative Theory. (United States)

    Ramos, Alberto Guerreiro


    Administrative theory will become characterless and crippled if it continues to indulge in the practice of unqualified borrowing from other disciplines, theories, models, and concepts alien to its specific task. (Author/IRT)

  11. Bayesian modeling of measurement error in predictor variables using item response theory

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.


    This paper focuses on handling measurement error in predictor variables using item response theory (IRT). Measurement error is of great importance in the assessment of theoretical constructs, such as intelligence or the school climate. Measurement error is modeled by treating the predictors as unobserved

  12. Comparing countdown- and IRT-based approaches to computerized adaptive personality testing. (United States)

    Rudick, Monica M; Yam, Wern How; Simms, Leonard J


    Computerized adaptive testing (CAT) is an emerging technology in the personality assessment literature given the greater efficiency it affords compared with traditional methods. However, few studies have directly compared the efficiency and validity of 2 competing methods for personality CAT: (a) methods based on item response theory (IRT-CAT) versus (b) methods based on the countdown method (CM-CAT). To that end, we conducted real-data simulations with previously collected responses (N = 8,690) to the Schedule for Nonadaptive and Adaptive Personality (SNAP). Three CAT algorithms (IRT-CAT, IRT-CAT with 5-item minimum, CM-CAT) were evaluated for item savings, classification accuracy, and convergent/discriminant validity. All CATs yielded lower classification accuracy and validity than traditional testing but required 18%-86% fewer items. Ultimately, the IRT-CAT, with minimum 5-item requirement, struck the most ideal balance of highest item savings, and generally fewer costs to validity and accuracy. These results confirm findings regarding item savings trends from previous CAT studies. In addition, this study provides a model for how the validity and precision of CATs may be compared across personality assessments.
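    A core difference between the CAT approaches compared here is the item-selection rule: IRT-CAT typically administers the not-yet-asked item with maximum Fisher information at the current trait estimate. A minimal sketch of that rule with invented 2PL item parameters (not the SNAP item bank):

    ```python
    import numpy as np

    def item_information(theta, a, b):
        """Fisher information of 2PL items at trait level theta."""
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        return a**2 * p * (1 - p)

    def next_item(theta_hat, a, b, administered):
        """Pick the unadministered item with maximum information."""
        info = item_information(theta_hat, a, b)
        info[list(administered)] = -np.inf     # exclude items already given
        return int(np.argmax(info))

    a = np.array([1.0, 1.5, 2.0, 0.8])
    b = np.array([-1.0, 0.0, 0.5, 2.0])
    print(next_item(0.0, a, b, administered={1}))  # most informative remaining item
    ```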

  13. The Best of Both Worlds: Factor Analysis of Dichotomous Data Using Item Response Theory and Structural Equation Modeling (United States)

    Glockner-Rist, Angelika; Hoijtink, Herbert


    Both structural equation modeling (SEM) and item response theory (IRT) can be used for factor analysis of dichotomous item responses. In this case, the measurement models of both approaches are formally equivalent. They were refined within and across different disciplines, and make complementary contributions to central measurement problems…

  14. Methodology Review: Nonparametric IRT Approaches to the Analysis of Dichotomous Item Scores. (United States)

    Sijtsma, Klaas


    Reviews developments in nonparametric item-response theory (NIRT), from its historic origins in item-response theory (IRT) and scale analysis to new theoretical results for practical test construction. Discusses theoretical results from NIRT often relevant to IRT. Contains 134 references. (SLD)

  15. Numerical Differentiation Methods for Computing Error Covariance Matrices in Item Response Theory Modeling: An Evaluation and a New Proposal (United States)

    Tian, Wei; Cai, Li; Thissen, David; Xin, Tao


    In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of Supplemented EM algorithm for…

  16. Model theory and applications

    CERN Document Server

    Belegradek, OV


    This volume is a collection of papers on model theory and its applications. The longest paper, "Model Theory of Unitriangular Groups" by O. V. Belegradek, forms a subtle general theory behind Mal‴tsev's famous correspondence between rings and groups. This is the first published paper on the topic. Given the present model-theoretic interest in algebraic groups, Belegradek's work is of particular interest to logicians and algebraists. The rest of the collection consists of papers on various questions of model theory, mainly on stability theory. Contributors are leading Russian researchers in the

  17. The Relationship between CTT and IRT Approaches in Analyzing Item Characteristics (United States)

    Abedalaziz, Nabeel; Leng, Chin Hai


    Most of the tests and inventories used by counseling psychologists have been developed using CTT; IRT derives from what is called latent trait theory. A number of important differences exist between CTT- versus IRT-based approaches to both test development and evaluation, as well as the process of scoring the response profiles of individual…

  18. Examining Differential Item Functioning: IRT-Based Detection in the Framework of Confirmatory Factor Analysis (United States)

    Dimitrov, Dimiter M.


    This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.

  19. Measuring individual significant change on the Beck Depression Inventory-II through IRT-based statistics

    NARCIS (Netherlands)

    Brouwer, Danny; Meijer, Rob; Zevalkink, J.


    Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistical significant individual

  20. Model theory and modules

    CERN Document Server

    Prest, M


    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  1. Desenvolvimento de uma escala para medir o potencial empreendedor utilizando a Teoria da Resposta ao Item (TRI) Development of a scale to measure the entrepreneurial potential using the Item Response Theory (IRT)

    Directory of Open Access Journals (Sweden)

    Luciano Ricardo Rath Alves


    referenced in theories of the entrepreneur's personality. The samples include 664 undergraduate and graduate students of Brazilian universities, and 100 entrepreneurs from the state of Alagoas. A two-parameter logistic IRT model was used. The parameter estimates were obtained from a sample of 764 people who responded to an instrument containing 103 items. The information and the standard error curves and the qualitative interpretation of the scale levels allowed us to determine the most appropriate range for the instrument use. The results showed that the scale is most adequate to evaluate individuals with low to moderately high entrepreneurial potential. Therefore, it is suggested that new items are incorporated into the instrument to measure and interpret even higher levels. The Item Response Theory allows the calibration of new items to measure entrepreneurs with high entrepreneurial potential using previously obtained data.

  2. A New Item Response Theory Model for Open-Ended Online Homework with Multiple Allowed Attempts

    CERN Document Server

    Gönülateş, Emre


    Item Response Theory (IRT) was originally developed in traditional exam settings, and it has been shown that the model does not readily transfer to formative assessment in the form of online homework. We investigate if this is mostly due to learner traits that do not become apparent in exam settings, namely random guessing due to lack of diligence or dedication, and copying work from other students or resources. Both of these traits mask the true ability of the learner, which is the only trait considered in most mainstream unidimensional IRT models. We find that the introduction of these traits indeed allows us to better assess the true ability of the learners, as well as to better gauge the quality of assessment items. Correspondence of the model traits to self-reported behavior is investigated and confirmed. We find that of these two traits, copying answers has a larger influence on initial homework attempts than random guessing.

  3. A Longitudinal Item Response Theory Model to Characterize Cognition Over Time in Elderly Subjects (United States)

    Bornkamp, Björn; Krahnke, Tillmann; Mielke, Johanna; Monsch, Andreas; Quarg, Peter


    For drug development in neurodegenerative diseases such as Alzheimer's disease, it is important to understand which cognitive domains carry the most information on the earliest signs of cognitive decline, and which subject characteristics are associated with a faster decline. A longitudinal Item Response Theory (IRT) model was developed for the Basel Study on the Elderly, in which the Consortium to Establish a Registry for Alzheimer's Disease – Neuropsychological Assessment Battery (with additions) and the California Verbal Learning Test were measured on 1,750 elderly subjects for up to 13.9 years. The model jointly captured the multifaceted nature of cognition and its longitudinal trajectory. The word list learning and delayed recall tasks carried the most information. Greater age at baseline, fewer years of education, and positive APOEɛ4 carrier status were associated with a faster cognitive decline. Longitudinal IRT modeling is a powerful approach for progressive diseases with multifaceted endpoints. PMID:28643388

  4. Extended Mixed-Effects Item Response Models with the MH-RM Algorithm (United States)

    Chalmers, R. Philip


    A mixed-effects item response theory (IRT) model is presented as a logical extension of the generalized linear mixed-effects modeling approach to formulating explanatory IRT models. Fixed and random coefficients in the extended model are estimated using a Metropolis-Hastings Robbins-Monro (MH-RM) stochastic imputation algorithm to accommodate for…

  5. A Comparison of General Diagnostic Models (GDM) and Bayesian Networks Using a Middle School Mathematics Test (United States)

    Wu, Haiyan


    General diagnostic models (GDMs) and Bayesian networks are mathematical frameworks that cover a wide variety of psychometric models. Both extend latent class models, and while GDMs also extend item response theory (IRT) models, Bayesian networks can be parameterized using discretized IRT. The purpose of this study is to examine similarities and…

  6. A signal detection-item response theory model for evaluating neuropsychological measures. (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G


    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the

  7. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data (United States)

    Maydeu-Olivares, Albert


    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  8. Invariance Properties for General Diagnostic Classification Models (United States)

    Bradshaw, Laine P.; Madison, Matthew J.


    In item response theory (IRT), the invariance property states that item parameter estimates are independent of the examinee sample, and examinee ability estimates are independent of the test items. While this property has long been established and understood by the measurement community for IRT models, the same cannot be said for diagnostic…

  9. Improving the quality and applicability of the Dutch scales of the communication profile for the hearing impaired using item response theory

    NARCIS (Netherlands)

    Mokkink, L.B.; Knol, D.L.; van Nispen, R.M.A.; Kramer, S.E.


    Purpose: The aim of this study was to improve the quality and applicability of the 6 Dutch scales of the Communication Profile for the Hearing Impaired (CPHI; Demorest & Erdman, 1986, 1987, 1988) using item response theory (IRT). IRT modeling can produce precise, valid, and relatively brief

  10. An item response theory analysis of Harter’s self-perception profile for children or why strong clinical scales should be distrusted

    NARCIS (Netherlands)

    Egberink, I.J.L.; Meijer, R.R.


    The authors investigated the psychometric properties of the subscales of the Self-Perception Profile for Children with item response theory (IRT) models using a sample of 611 children. Results from a nonparametric Mokken analysis and a parametric IRT approach for boys (n = 268) and girls (n = 343)

  11. Applying Item Response Theory to the Development of a Screening Adaptation of the Goldman-Fristoe Test of Articulation-Second Edition (United States)

    Brackenbury, Tim; Zickar, Michael J.; Munson, Benjamin; Storkel, Holly L.


    Purpose: Item response theory (IRT) is a psychometric approach to measurement that uses latent trait abilities (e.g., speech sound production skills) to model performance on individual items that vary by difficulty and discrimination. An IRT analysis was applied to preschoolers' productions of the words on the Goldman-Fristoe Test of…

  12. An Item Response Theory Analysis of Harter's Self-Perception Profile for Children or Why Strong Clinical Scales Should Be Distrusted (United States)

    Egberink, Iris J. L.; Meijer, Rob R.


    The authors investigated the psychometric properties of the subscales of the Self-Perception Profile for Children with item response theory (IRT) models using a sample of 611 children. Results from a nonparametric Mokken analysis and a parametric IRT approach for boys (n = 268) and girls (n = 343) were compared. The authors found that most scales…

  13. An item response theory analysis of Harter's Self-Perception Profile for Children or why strong clinical scales should be distrusted

    NARCIS (Netherlands)

    Egberink, Iris J. L.; Meijer, Rob R.

    The authors investigated the psychometric properties of the subscales of the Self-Perception Profile for Children with item response theory (IRT) models using a sample of 611 children. Results from a nonparametric Mokken analysis and a parametric IRT approach for boys (n = 268) and girls (n = 343)

  14. An Introduction to Network Psychometrics: Relating Ising Network Models to Item Response Theory Models. (United States)

    Marsman, M; Borsboom, D; Kruis, J; Epskamp, S; van Bork, R; Waldorp, L J; Maas, H L J van der; Maris, G


    In recent years, network models have been proposed as an alternative representation of psychometric constructs such as depression. In such models, the covariance between observables (e.g., symptoms like depressed mood, feelings of worthlessness, and guilt) is explained in terms of a pattern of causal interactions between these observables, which contrasts with classical interpretations in which the observables are conceptualized as the effects of a reflective latent variable. However, few investigations have been directed at the question of how these different models relate to each other. To shed light on this issue, the current paper explores the relation between one of the most important network models-the Ising model from physics-and one of the most important latent variable models-the Item Response Theory (IRT) model from psychometrics. The Ising model describes the interaction between states of particles that are connected in a network, whereas the IRT model describes the probability distribution associated with item responses in a psychometric test as a function of a latent variable. Despite the divergent backgrounds of the models, we show a broad equivalence between them and also illustrate several opportunities that arise from this connection.
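
    The marginal structure this connection rests on can be seen in a small simulation. The sketch below (with made-up item parameters) shows that under a unidimensional 2PL IRT model, integrating out the latent trait induces positive pairwise associations among all items: exactly the covariance pattern that a network (Ising) model would instead attribute to direct interactions between symptoms.

```python
import numpy as np

rng = np.random.default_rng(0)

def irt_2pl(theta, a, b):
    """2PL item response function: P(X_j = 1 | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Simulate responses to 5 items from a standard-normal latent trait.
a = np.array([1.0, 1.5, 0.8, 2.0, 1.2])    # discriminations (hypothetical)
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # difficulties (hypothetical)
theta = rng.standard_normal(10_000)
p = irt_2pl(theta[:, None], a, b)
x = (rng.random(p.shape) < p).astype(int)

# Marginalizing over theta induces positive association between every item
# pair -- the dependence pattern an Ising network model explains through
# direct item-item interactions instead of a common latent cause.
cov = np.cov(x, rowvar=False)
print(np.all(cov[np.triu_indices(5, k=1)] > 0))
```

    Conditioning on theta would make these covariances vanish, which is the local-independence assumption of the IRT model.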

  15. Theory and modeling group (United States)

    Holman, Gordon D.


    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  16. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson


    Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.
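
    The core of marginal maximum likelihood is integrating the latent trait out of the likelihood, typically with Gauss-Hermite quadrature. The following is a minimal Python sketch (not the paper's R code) for the simpler Rasch model, with simulated data and hypothetical difficulty values:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Gauss-Hermite nodes/weights, rescaled for a N(0, 1) trait distribution.
nodes, weights = np.polynomial.hermite.hermgauss(21)
nodes = nodes * np.sqrt(2.0)
weights = weights / np.sqrt(np.pi)

def neg_marginal_loglik(b, data):
    """Negative marginal log-likelihood of Rasch difficulties b."""
    # p[q, j] = P(correct | theta = node q, difficulty b_j)
    p = expit(nodes[:, None] - b[None, :])
    # lik[i, q] = P(observed response pattern i | theta = node q)
    lik = np.prod(np.where(data[:, None, :] == 1, p, 1 - p), axis=2)
    # Integrate theta out via the quadrature weights.
    return -np.sum(np.log(lik @ weights))

# Simulate responses from known (hypothetical) difficulties.
rng = np.random.default_rng(1)
b_true = np.array([-1.0, 0.0, 1.0])
theta = rng.standard_normal(2000)
data = (rng.random((2000, 3)) < expit(theta[:, None] - b_true)).astype(int)

res = minimize(neg_marginal_loglik, np.zeros(3), args=(data,), method="BFGS")
print(np.round(res.x, 2))
```

    The recovered difficulties should land close to b_true; the generalized partial credit model discussed in the paper replaces the Bernoulli term with a polytomous category-response function but keeps the same quadrature structure.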

  17. Linking Parameters Estimated with the Generalized Graded Unfolding Model: A Comparison of the Accuracy of Characteristic Curve Methods (United States)

    Anderson Koenig, Judith; Roberts, James S.


    Methods for linking item response theory (IRT) parameters are developed for attitude questionnaire responses calibrated with the generalized graded unfolding model (GGUM). One class of IRT linking methods derives the linking coefficients by comparing characteristic curves, and three of these methods---test characteristic curve (TCC), item…

  18. Rasch Model Parameter Estimation in the Presence of a Nonnormal Latent Trait Using a Nonparametric Bayesian Approach (United States)

    Finch, Holmes; Edwards, Julianne M.


    Standard approaches for estimating item response theory (IRT) model parameters generally work under the assumption that the latent trait being measured by a set of items follows the normal distribution. Estimation of IRT parameters in the presence of nonnormal latent traits has been shown to generate biased person and item parameter estimates. A…

  19. Model Theory for Process Algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.


    We present a first-order extension of the algebraic theory about processes known as ACP and its main models. Useful predicates on processes, such as deadlock freedom and determinism, can be added to this theory through first-order definitional extensions. Model theory is used to analyse the




  1. An introduction to item response theory for patient-reported outcome measurement. (United States)

    Nguyen, Tam H; Han, Hae-Ra; Kim, Miyong T; Chan, Kitty S


    The growing emphasis on patient-centered care has accelerated the demand for high-quality data from patient-reported outcome (PRO) measures. Traditionally, the development and validation of these measures has been guided by classical test theory. However, item response theory (IRT), an alternate measurement framework, offers promise for addressing practical measurement problems found in health-related research that have been difficult to solve through classical methods. This paper introduces foundational concepts in IRT, as well as commonly used models and their assumptions. Existing data on a combined sample (n = 636) of Korean American and Vietnamese American adults who responded to the High Blood Pressure Health Literacy Scale and the Patient Health Questionnaire-9 are used to exemplify typical applications of IRT. These examples illustrate how IRT can be used to improve the development, refinement, and evaluation of PRO measures. Greater use of methods based on this framework can increase the accuracy and efficiency with which PROs are measured.

  2. Deception and false belief in paranoia: modelling theory of mind stories. (United States)

    Shryane, Nick M; Corcoran, Rhiannon; Rowse, Georgina; Moore, Rosanne; Cummins, Sinead; Blackwood, Nigel; Howard, Robert; Bentall, Richard P


    This study used Item Response Theory (IRT) to model the psychometric properties of a Theory of Mind (ToM) stories task. The study also aimed to determine whether the ability to understand states of false belief in others and the ability to understand another's intention to deceive are separable skills, and to establish which is more sensitive to the presence of paranoia. A large and diverse clinical and nonclinical sample differing in levels of depression and paranoid ideation performed a ToM stories task measuring false belief and deception at first and second order. A three-factor IRT model was found to best fit the data, consisting of first- and second-order deception factors and a single false-belief factor. The first-order deception and false-belief factors had good measurement properties at low trait levels, appropriate for samples with reduced ToM ability. First-order deception and false beliefs were both sensitive to paranoid ideation with IQ predicting performance on false belief items. Separable abilities were found to underlie performance on verbal ToM tasks. However, paranoia was associated with impaired performance on both false belief and deception understanding with clear impairment at the simplest level of mental state attribution.

  3. Item response theory and the measurement of psychiatric constructs: some empirical and conceptual issues and challenges. (United States)

    Reise, S P; Rodriguez, A


    Item response theory (IRT) measurement models are now commonly used in educational, psychological, and health-outcomes measurement, but their impact in the evaluation of measures of psychiatric constructs remains limited. Herein we present two, somewhat contradictory, theses. The first is that, when skillfully applied, IRT has much to offer psychiatric measurement in terms of scale development, psychometric analysis, and scoring. The second argument, however, is that psychiatric measurement presents some unique challenges to the application of IRT - challenges that may not be easily addressed by application of conventional IRT models and methods. These challenges include, but are not limited to, the modeling of conceptually narrow constructs and their associated limited item pools, and unipolar constructs where the expected latent trait distribution is highly skewed.

  4. Combining item response theory and diagnostic classification models: a psychometric model for scaling ability and diagnosing misconceptions. (United States)

    Bradshaw, Laine; Templin, Jonathan


    Traditional testing procedures typically utilize unidimensional item response theory (IRT) models to provide a single, continuous estimate of a student's overall ability. Advances in psychometrics have focused on measuring multiple dimensions of ability to provide more detailed feedback for students, teachers, and other stakeholders. Diagnostic classification models (DCMs) provide multidimensional feedback by using categorical latent variables that represent distinct skills underlying a test that students may or may not have mastered. The Scaling Individuals and Classifying Misconceptions (SICM) model is presented as a combination of a unidimensional IRT model and a DCM where the categorical latent variables represent misconceptions instead of skills. In addition to an estimate of ability along a latent continuum, the SICM model provides multidimensional, diagnostic feedback in the form of statistical estimates of probabilities that students have certain misconceptions. Through an empirical data analysis, we show how this additional feedback can be used by stakeholders to tailor instruction for students' needs. We also provide results from a simulation study that demonstrate that the SICM MCMC estimation algorithm yields reasonably accurate estimates under large-scale testing conditions.

  5. plink: An R Package for Linking Mixed-Format Tests Using IRT-Based Methods

    Directory of Open Access Journals (Sweden)

    Jonathan P. Weeks


    The R package plink has been developed to facilitate the linking of mixed-format tests for multiple groups under a common item design using unidimensional and multidimensional IRT-based methods. This paper presents the capabilities of the package in the context of the unidimensional methods. The package supports nine unidimensional item response models (the Rasch model, 1PL, 2PL, 3PL, graded response model, partial credit and generalized partial credit model, nominal response model, and multiple-choice model) and four separate calibration linking methods (mean/sigma, mean/mean, Haebara, and Stocking-Lord). It also includes functions for importing item and/or ability parameters from common IRT software, conducting IRT true-score and observed-score equating, and plotting item response curves and parameter comparison plots.
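
    Of the four linking methods, mean/sigma is the simplest: the linking coefficients come directly from the means and standard deviations of the common items' difficulty estimates on the two scales. A hedged numpy sketch (the difficulty values are made-up; plink's own implementation has more options):

```python
import numpy as np

def mean_sigma(b_new, b_base):
    """Mean/sigma linking: coefficients (A, B) placing the new form's
    parameters onto the base form's scale via common-item difficulties."""
    A = np.std(b_base, ddof=1) / np.std(b_new, ddof=1)
    B = np.mean(b_base) - A * np.mean(b_new)
    return A, B

# Common-item difficulty estimates on the two scales (hypothetical):
b_new = np.array([-1.2, -0.3, 0.4, 1.1])
b_base = np.array([-0.9, 0.0, 0.7, 1.4])
A, B = mean_sigma(b_new, b_base)

# The transformation is then: b -> A*b + B, a -> a/A, theta -> A*theta + B.
print(round(A, 3), round(B, 3))
```

    Characteristic-curve methods such as Haebara and Stocking-Lord instead choose A and B to minimize discrepancies between item or test characteristic curves, which uses the discrimination estimates as well.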

  6. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd


    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  7. A Computational Theory of Modelling (United States)

    Rossberg, Axel G.


    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some ``basic theory'', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, s.b.) optimal algorithm which generates data that describe the model's state or evolution complying with a ``reduced theory''. Theories are represented by classes of (in a similar sense, s.b.) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  8. Using a Linear Regression Method to Detect Outliers in IRT Common Item Equating (United States)

    He, Yong; Cui, Zhongmin; Fang, Yu; Chen, Hanwei


    Common test items play an important role in equating alternate test forms under the common item nonequivalent groups design. When the item response theory (IRT) method is applied in equating, inconsistent item parameter estimates among common items can lead to large bias in equated scores. It is prudent to evaluate inconsistency in parameter…

  9. A Comparison of IRT Proficiency Estimation Methods under Adaptive Multistage Testing (United States)

    Kim, Sooyeon; Moses, Tim; Yoo, Hanwook


    This inquiry is an investigation of item response theory (IRT) proficiency estimators' accuracy under multistage testing (MST). We chose a two-stage MST design that includes four modules (one at Stage 1, three at Stage 2) and three difficulty paths (low, middle, high). We assembled various two-stage MST panels (i.e., forms) by manipulating two…

  10. Mokken scale analysis : Between the Guttman scale and parametric item response theory

    NARCIS (Netherlands)

    van Schuur, Wijbrandt H.


    This article introduces a model of ordinal unidimensional measurement known as Mokken scale analysis. Mokken scaling is based on principles of Item Response Theory (IRT) that originated in the Guttman scale. I compare the Mokken model with both Classical Test Theory (reliability or factor analysis)

  11. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J


    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  12. The suitability of the South Oaks Gambling Screen-Revised for Adolescents (SOGS-RA) as a screening tool: IRT-based evidence. (United States)

    Chiesi, Francesca; Donati, Maria Anna; Galli, Silvia; Primi, Caterina


    The South Oaks Gambling Screen-Revised for Adolescents (SOGS-RA) is one of the most widely used measures of adolescent gambling. We aimed to provide evidence of its suitability as a screening tool applying item response theory (IRT). The scale was administered to 981 adolescents (64% males; mean age = 16.57 years, SD = 1.63 years) attending high school. Analyses were carried out with a sample of 871 respondents, that is, adolescents who have gambled at least once during the previous year. Once the prerequisite of unidimensionality was confirmed through confirmatory factor analysis, unidimensional IRT analyses were performed. The 2-parameter logistic model was used in order to estimate item parameters (severity and discrimination) and the test information function. Results showed that item severity ranged from medium to high, and most of the items showed large discrimination parameters, indicating that the scale accurately measures medium to high levels of problem gambling. These regions of the trait were associated with the greatest amount of information, indicating that the SOGS-RA provides a reliable measure for identifying both problem gamblers and adolescents at risk of developing maladaptive behaviors deriving from gambling. The IRT-based evidence supports the suitability of the SOGS-RA as a screening tool in adolescent populations.
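
    The link between item severity and where a scale measures well comes from the 2PL item information function, which peaks at the item's severity parameter. A small illustrative sketch (the parameter values are hypothetical, not the SOGS-RA estimates):

```python
import numpy as np

def item_info_2pl(theta, a, b):
    """Fisher information of a 2PL item: a^2 * P * (1 - P)."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1 - p)

theta = np.linspace(-4, 4, 801)
# Two items with equal discrimination but different severities:
low_sev = item_info_2pl(theta, a=1.8, b=-0.5)
high_sev = item_info_2pl(theta, a=1.8, b=1.5)

# Each item is most informative at theta == b, so a scale whose items have
# medium-to-high severities measures most precisely at medium-to-high
# trait levels -- the property reported for the SOGS-RA.
print(theta[np.argmax(low_sev)], theta[np.argmax(high_sev)])
```

    Larger discriminations sharpen and raise these peaks (information scales with a squared), which is why highly discriminating items dominate the test information function.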

  13. Bayesian inference in an item response theory model with a generalized student t link function (United States)

    Azevedo, Caio L. N.; Migon, Helio S.


    In this paper we introduce a new item response theory (IRT) model with a generalized Student t-link function with unknown degrees of freedom (df), named generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two parameter logit and probit models, since the degrees of freedom (df) play a similar role to the discrimination parameter. However, the behavior of the curves of the GtL is different from those of the two parameter models and the usual Student t link, since in GtL the curve obtained from different df's can cross the probit curves in more than one latent trait level. The GtL model has properties similar to those of generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within Gibbs sampling algorithm. We examine sensitivity to the prior chosen for the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two parameter models.
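
    The crossing behavior comes from the heavier tails of the Student-t CDF relative to the normal CDF. The sketch below is illustrative only (a plain t link with fixed df, not the paper's generalized link): with df = 3 the t-link curve lies above probit in the lower tail and below it in the upper tail, so the two response curves must cross.

```python
import numpy as np
from scipy.stats import norm, t

# Item response functions P(correct | theta) = F(theta - b) for b = 0,
# comparing a probit link with a Student-t link (df = 3).
theta = np.linspace(-4, 4, 9)
b = 0.0
p_probit = norm.cdf(theta - b)
p_t3 = t.cdf(theta - b, df=3)

# Heavier t tails: more probability mass far from b, flatter curve there.
print(p_t3[0] > p_probit[0], p_t3[-1] < p_probit[-1])
```

    This tail behavior is what lets the df parameter mimic a discrimination parameter: small df flattens the curve much as small a does in a 2PL model.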

  14. IRT studies of many groups: The alignment method

    Directory of Open Access Journals (Sweden)

    Bengt Muthen


    Asparouhov and Muthen (forthcoming) presented a new method for multiple-group confirmatory factor analysis (CFA), referred to as the alignment method. The alignment method can be used to estimate group-specific factor means and variances without requiring exact measurement invariance. A strength of the method is the ability to conveniently estimate models for many groups, such as with comparisons of countries. This paper focuses on IRT applications of the alignment method. An empirical investigation is made of binary knowledge items administered in two separate surveys of a set of countries. A Monte Carlo study is presented that shows how the quality of the alignment can be assessed.

  15. Extending item response theory to online homework

    Directory of Open Access Journals (Sweden)

    Gerd Kortemeyer


    Item response theory (IRT) becomes an increasingly important tool when analyzing “big data” gathered from online educational venues. However, the mechanism was originally developed in traditional exam settings, and several of its assumptions are infringed upon when deployed in the online realm. For a large-enrollment physics course for scientists and engineers, the study compares outcomes from IRT analyses of exam and homework data, and then proceeds to investigate the effects of each confounding factor introduced in the online realm. It is found that IRT yields the correct trends for learner ability and meaningful item parameters, yet overall agreement with exam data is moderate. It is also found that learner ability and item discrimination are robust over a wide range with respect to model assumptions and introduced noise. Item difficulty is also robust, but over a narrower range.

  16. Characterization of Cubic Graphs G with irt(G) = IRt(G) = 2

    Directory of Open Access Journals (Sweden)

    Eslahchi Changiz


    A subset S of vertices in a graph G is called a total irredundant set if, for each vertex v in G, v or one of its neighbors has no neighbor in S − {v}. The total irredundance number, irt(G), is the minimum cardinality of a maximal total irredundant set of G, while the upper total irredundance number, IRt(G), is the maximum cardinality of such a set. In this paper we characterize all cubic graphs G with irt(G) = IRt(G) = 2.

  17. Stochastic Climate Theory and Modelling

    CERN Document Server

    Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio


    Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...

  18. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST) (United States)

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David


    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via an MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  19. Revision of the ICIDH Severity of Disabilities Scale by data linking and item response theory

    NARCIS (Netherlands)

    Buuren, S. van; Hopman-Rock, M.


    The Severity of Disabilities Scale (SDS) of the ICIDH reflects the degree to which an individual's ability to perform a certain activity is restricted. This paper describes the application of two models from item response theory (IRT), the graded response model and the partial credit model, in order

  20. Comparing five depression measures in depressed Chinese patients using item response theory: an examination of item properties, measurement precision and score comparability. (United States)

    Zhao, Yue; Chan, Wai; Lo, Barbara Chuen Yee


    Item response theory (IRT) has been increasingly applied to patient-reported outcome (PRO) measures. The purpose of this study is to apply IRT to examine item properties (discrimination and severity of depressive symptoms), measurement precision and score comparability across five depression measures, which is the first study of its kind in the Chinese context. A clinical sample of 207 Hong Kong Chinese outpatients was recruited. Data analyses were performed including classical item analysis, IRT concurrent calibration and IRT true score equating. The IRT assumptions of unidimensionality and local independence were tested respectively using confirmatory factor analysis and chi-square statistics. The IRT linking assumptions of construct similarity, equity and subgroup invariance were also tested. The graded response model was applied to concurrently calibrate all five depression measures in a single IRT run, resulting in the item parameter estimates of these measures being placed onto a single common metric. IRT true score equating was implemented to perform the outcome score linking and construct score concordances so as to link scores from one measure to corresponding scores on another measure for direct comparability. Findings suggested that (a) symptoms of depressed mood, suicidality and feeling of worthlessness served as the strongest discriminating indicators, and symptoms concerning suicidality, changes in appetite, depressed mood, feeling of worthlessness and psychomotor agitation or retardation reflected high levels of severity in the clinical sample. (b) The five depression measures contributed to various degrees of measurement precision at varied levels of depression. (c) After outcome score linking was performed across the five measures, the cut-off scores led to either consistent or discrepant diagnoses for depression. The study provides additional evidence regarding the psychometric properties and clinical utility of the five depression measures.

  1. Using the GLIMMIX Procedure in SAS 9.3 to Fit a Standard Dichotomous Rasch and Hierarchical 1-PL IRT Model (United States)

    Black, Ryan A.; Butler, Stephen F.


    Although Rasch models have been shown to be a sound methodological approach to develop and validate measures of psychological constructs for more than 50 years, they remain underutilized in psychology and other social sciences. Until recently, one reason for this underutilization was the lack of syntactically simple procedures to fit Rasch and…

  2. Psychometric properties and adaptation of the ASRS in a Spanish sample of patients with substance use disorders: Application of two IRT Rasch models. (United States)

    Sanchez-Garcia, Manuel; Fernandez-Calderon, Fermin; Carmona-Marquez, Jose; Chico-Garcia, Marilo; Velez-Moreno, Antonio; Perez-Gomez, Lorena


    The Adult ADHD Self-Report Scale (ASRS; Kessler et al., 2005) is one of the most extensively used scales to detect attention-deficit hyperactivity disorder (ADHD) in adults. The aim of this work is to analyze the psychometric properties of the 18 ASRS items in people with substance use disorders (SUDs). Furthermore, we aimed to (a) confirm or, if necessary, modify the dichotomization criteria of the items proposed by the authors, and (b) identify the most informative items for a screening version or, when applicable, confirm the use of the 6 items that comprise the initially proposed short version. The ASRS was completed by 170 patients with SUD, aged 16 to 78 years, at the Provincial Unit for Drug Dependence of Huelva, Spain. Two Rasch models, the dichotomous Rasch model and the Rating Scale Model (RSM) for polytomous items, were used in the psychometric analysis. The ASRS items fitted the RSM adequately, but the locations of the items along the underlying construct led us to propose new criteria of dichotomization. After analyzing the information function of the dichotomized items, we identified 6 items that should comprise a new screening scale. Our dichotomization proposal differs from the original one and takes into account the different weights of the items. The selected screening version showed better metric properties than the other analyzed versions. Future research should test our proposal using external criteria and obtain evidence for other populations, cultures, and patient profiles. (c) 2015 APA, all rights reserved.
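
    The information-based item selection described above can be sketched as follows: under the dichotomous Rasch model, an item is most informative where its difficulty sits closest to the trait level of interest. All item names and parameter values here are hypothetical, not the paper's estimates.

```python
import math

def rasch_prob(theta, b):
    # Dichotomous Rasch model: endorsement probability depends only on
    # the gap between trait level theta and item difficulty b
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def rasch_info(theta, b):
    # Fisher information of a dichotomous item: P(1 - P)
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

# Hypothetical dichotomized-item difficulties
difficulties = {"item_a": -0.8, "item_b": 0.3, "item_c": 0.9, "item_d": 2.0}
screening_cutoff = 0.5  # trait region where the screener must discriminate
ranked = sorted(difficulties,
                key=lambda i: rasch_info(screening_cutoff, difficulties[i]),
                reverse=True)
# Items whose difficulty lies nearest the cutoff carry the most information
```

Selecting the top-ranked items at the decision region is the usual rationale for building a short screening form from a longer calibrated scale.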

  3. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef


    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition has been substantially enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
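
    For a crisp game, the best-known one-point solution concept is the Shapley value, which averages each player's marginal contribution over all orders in which the grand coalition can form. The three-player "glove game" below is a standard textbook illustration, not an example taken from the book.

```python
import math
from itertools import permutations

def shapley_values(players, v):
    """Shapley value of a crisp cooperative game.
    v maps a frozenset coalition to its worth; v(empty set) must be 0."""
    values = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            # marginal contribution of p when joining this coalition
            values[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    n_orders = math.factorial(len(players))
    return {p: val / n_orders for p, val in values.items()}

# Glove game: a coalition earns 1 only if it pairs player 1 with player 2 or 3
def glove(S):
    return 1.0 if 1 in S and (2 in S or 3 in S) else 0.0

phi = shapley_values([1, 2, 3], glove)  # player 1 gets 2/3, the others 1/6 each
```

The exponential enumeration over orders is fine for small games; fuzzy and multi-choice extensions replace the 0/1 participation in `glove` with graded participation levels.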

  4. Testing item response theory invariance of the standardized Quality-of-life Disease Impact Scale (QDIS(®)) in acute coronary syndrome patients: differential functioning of items and test. (United States)

    Deng, Nina; Anatchkova, Milena D; Waring, Molly E; Han, Kyung T; Ware, John E


    The Quality-of-life (QOL) Disease Impact Scale (QDIS(®)) standardizes the content and scoring of QOL impact attributed to different diseases using item response theory (IRT). This study examined the IRT invariance of the QDIS-standardized IRT parameters in an independent sample. The differential functioning of items and test (DFIT) of a static short-form (QDIS-7) was examined across two independent sources: patients hospitalized for acute coronary syndrome (ACS) in the TRACE-CORE study (N = 1,544) and chronically ill US adults in the QDIS standardization sample. "ACS-specific" IRT item parameters were calibrated and linearly transformed to compare to "standardized" IRT item parameters. Differences in IRT model-expected item, scale and theta scores were examined. The DFIT results were also compared in a standard logistic regression differential item functioning analysis. Item parameters estimated in the ACS sample showed lower discrimination parameters than the standardized discrimination parameters, but only small differences were found for threshold parameters. In DFIT, results on the non-compensatory differential item functioning index (range 0.005-0.074) were all below the threshold of 0.096. Item differences were further canceled out at the scale level. IRT-based theta scores for ACS patients using standardized and ACS-specific item parameters were highly correlated (r = 0.995, root-mean-square difference = 0.09). Using standardized item parameters, ACS patients scored one-half standard deviation higher (indicating greater QOL impact) compared to chronically ill adults in the standardization sample. The study showed sufficient IRT invariance to warrant the use of standardized IRT scoring of QDIS-7 for studies comparing the QOL impact attributed to acute coronary disease and other chronic conditions.
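
    The noncompensatory DIF index used above is, in essence, an average squared gap between the expected item scores implied by two item-parameter sets. Below is a minimal unweighted sketch (DFIT proper averages over the focal group's estimated trait distribution); all parameter values are hypothetical, not the study's estimates.

```python
import math

def p2pl(theta, a, b):
    # Two-parameter logistic item response function
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def ncdif(a_ref, b_ref, a_foc, b_foc, thetas):
    """Unweighted sketch of the noncompensatory DIF index: the average squared
    difference between expected item scores under two item-parameter sets."""
    gaps = [(p2pl(t, a_foc, b_foc) - p2pl(t, a_ref, b_ref)) ** 2 for t in thetas]
    return sum(gaps) / len(gaps)

# Hypothetical standardized vs. sample-specific parameters for one item
grid = [i / 10.0 for i in range(-30, 31)]
index = ncdif(a_ref=1.8, b_ref=0.0, a_foc=1.5, b_foc=0.1, thetas=grid)
# Values below 0.096 were treated as negligible in the study's DFIT analysis
```

At the scale level, the per-item signed gaps are summed before squaring, which is why item-level differences can cancel out, as the abstract reports.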

  5. Assessment of psychology students' attitudes via the partial credit model of IRT

    Directory of Open Access Journals (Sweden)

    Claudette Maria Medeiros Vendramini


    The aim of this work was to assess psychology students' attitudes toward statistics through the partial credit model of IRT, and to identify the associations among the students' attitudes, academic performance, and self-perceived performance in statistics. A nonrandom sample of 361 psychology students, aged 18 to 65 years, 81% women and 53% from evening classes, answered an identification questionnaire and an attitudes-toward-statistics scale. The scale is a four-point Likert scale composed of 20 items expressing feelings toward statistics, ten positive and ten negative, plus one complementary item that assesses the student's self-perception of his or her own performance in statistics. The scale proved reliable and valid for measuring attitudes. The participants' attitudes were slightly more negative than positive, and positive, significant correlations were found among attitude, academic performance, and self-perceived performance.

  6. Modeling Diagnostic Assessments with Bayesian Networks (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego


    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  7. Variance decomposition using an IRT measurement model

    NARCIS (Netherlands)

    van den Berg, Stéphanie Martine; van den Berg, Stephanie M.; Glas, Cornelis A.W.; Boomsma, Dorret I.


    Large scale research projects in behaviour genetics and genetic epidemiology are often based on questionnaire or interview data. Typically, a number of items is presented to a number of subjects, the subjects’ sum scores on the items are computed, and the variance of sum scores is decomposed into a

  8. Phase transition in IrTe2 induced by spin-orbit coupling (United States)

    Koley, S.


    IrTe2 has attracted renewed interest as a system showing competition between a putative density-wave transition near 270 K and superconductivity induced by doping with high-atomic-number elements. The high atomic numbers of Te and Ir support strong spin-orbit coupling in this system. Using dynamical mean field theory (DMFT) with an LDA band structure, I introduce Rashba spin-orbit coupling to interpret the anomalous resistivity and the related transition. While the DMFT results for the Ir-5d band show no considerable changes other than an orbital-selective pseudogap 'pinned' to the Fermi level, the Te-p band shows a van Hove singularity at the Fermi level except at low temperature. Finally, I discuss the implications of these results for the theoretical understanding of ordering in IrTe2.
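
    For reference, a Rashba spin-orbit term of the conventional form (the coupling strength used in the calculation is not quoted in the abstract) reads

```latex
H_{\mathrm{R}} \;=\; \alpha_{\mathrm{R}}\,\left(\boldsymbol{\sigma} \times \mathbf{k}\right)\cdot\hat{\mathbf{z}}
            \;=\; \alpha_{\mathrm{R}}\,\left(\sigma_x k_y - \sigma_y k_x\right),
```

where the sigmas are Pauli matrices, k is the crystal momentum, and alpha_R is the Rashba coupling constant.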

  9. A new IRT-based standard setting method: application to eCat-listening. (United States)

    García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David


    Criterion-referenced interpretations of tests are highly valuable but usually involve the difficult task of establishing cut scores. In contrast with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English Listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretation.

  10. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)


    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. To this end, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
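
    The Navarro-Frenk-White profile entering the halo model is, in standard notation,

```latex
\rho_{\mathrm{NFW}}(r) \;=\; \frac{\rho_s}{\left(r/r_s\right)\left(1 + r/r_s\right)^2},
```

where rho_s is the characteristic density and r_s the scale radius; the concentration c = r_vir/r_s is the quantity rescaled by the chameleon mass-concentration relation mentioned above.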

  11. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.


    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
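
    One standard recipe for the sampling problem described above is to build the covariance matrix of the process on a grid, factor it, and color white noise with the factor. A self-contained sketch using an exponential (Ornstein-Uhlenbeck) covariance, which is a common textbook choice rather than anything specific to this report:

```python
import math
import random

def cholesky(A):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def sample_ou_path(times, corr_length, rng):
    """Draw one sample path of a zero-mean, unit-variance stationary Gaussian
    process with exponential (Ornstein-Uhlenbeck) covariance on a time grid."""
    n = len(times)
    C = [[math.exp(-abs(times[i] - times[j]) / corr_length) for j in range(n)]
         for i in range(n)]
    L = cholesky(C)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]  # independent white noise
    # Coloring the noise: x = L z has covariance L L^T = C
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

grid = [0.25 * k for k in range(12)]
path = sample_ou_path(grid, corr_length=0.5, rng=random.Random(0))
```

Such generated paths would then serve as random inputs or boundary conditions for a deterministic simulation code, as the report describes; for large grids, spectral or Karhunen-Loeve expansions replace the dense factorization.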

  12. Economic Modelling in Institutional Economic Theory

    National Research Council Canada - National Science Library

    Wadim Strielkowski; Evgeny Popov


    Our paper centers on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory...

  13. Comparison of examination grades using item response theory : a case study

    NARCIS (Netherlands)

    Korobko, O.B.


    In item response theory (IRT), mathematical models are applied to analyze data from tests and questionnaires used to measure abilities, proficiency, personality traits and attitudes. This thesis is concerned with comparison of subjects, students and schools based on average examination grades using

  14. A Regional and Local Item Response Theory Based Test Item Bank System. (United States)

    Hathaway, Walter; And Others

    This report describes the development, operation, maintenance, and future prospects of the item banks pioneered by the Portland (Oregon) School District. At the time of this report, there were 3,500 mathematics, 2,200 reading, and 2,300 language usage items calibrated under the fixed parameter model of item response theory (IRT) for Grades 3-8.…

  15. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz


    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  16. Cross-informant and cross-national equivalence using item-response theory (IRT) linking: A case study using the behavioral assessment for children of African heritage in the United States and Jamaica. (United States)

    Lambert, Michael Canute; Ferguson, Gail M; Rowan, George T


    Cross-national study of adolescents' psychological adjustment requires measures that permit reliable and valid assessment across informants and nations, but such measures are virtually nonexistent. Item-response-theory-based linking is a promising yet underutilized methodological procedure that permits more accurate assessment across informants and nations. To demonstrate this procedure, the Resilience Scale of the Behavioral Assessment for Children of African Heritage (Lambert et al., 2005) was administered to 250 African American and 294 Jamaican nonreferred adolescents and their caregivers. Multiple items without significant differential item functioning emerged, allowing scale linking across informants and nations. Calibrating item parameters via item response theory linking can permit cross-informant cross-national assessment of youth. (c) 2016 APA, all rights reserved.

  17. Evaluating model assumptions in item response theory

    NARCIS (Netherlands)

    Tijmstra, J.


    This dissertation deals with the evaluation of model assumptions in the context of item response theory. Item response theory, also known as modern test theory, provides a statistical framework for the measurement of psychological constructs that cannot be observed directly, such as intelligence or

  18. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose


    During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  19. Quantum field theory competitive models

    CERN Document Server

    Tolksdorf, Jürgen; Zeidler, Eberhard


    For more than 70 years, quantum field theory (QFT) has been a driving force in the development of theoretical physics. Equally fascinating is the fruitful impact which QFT has had in rather remote areas of mathematics. The present book features some of the different approaches, physical viewpoints and techniques used to make the notion of quantum field theory more precise. For example, the present book contains a discussion including general considerations, stochastic methods, deformation theory and the holographic AdS/CFT correspondence. It also contains a discussion of more recent developments like the use of category theory and topos theoretic methods to describe QFT. The present volume emerged from the 3rd 'Blaubeuren Workshop: Recent Developments in Quantum Field Theory', held in July 2007 at the Max Planck Institute of Mathematics in the Sciences in Leipzig/Germany. All of the contributions are committed to the idea of this workshop series: 'To bring together outstanding experts working in...

  20. Scale development with small samples: a new application of longitudinal item response theory. (United States)

    Houts, Carrie R; Morlock, Robert; Blum, Steven I; Edwards, Michael C; Wirth, R J


    Measurement development in hard-to-reach populations can pose methodological challenges. Item response theory (IRT) is a useful statistical tool, but often requires large samples. We describe the use of longitudinal IRT models as a pragmatic approach to instrument development when large samples are not feasible. The statistical foundations and practical benefits of longitudinal IRT models are briefly described. Results from a simulation study are reported to demonstrate the model's ability to recover the generating measurement structure and parameters using a range of sample sizes, number of items, and number of time points. An example using early-phase clinical trial data in a rare condition demonstrates these methods in practice. Simulation study results demonstrate that the longitudinal IRT model's ability to recover the generating parameters rests largely on the interaction between sample size and the number of time points. Overall, the model performs well even in small samples provided a sufficient number of time points are available. The clinical trial data example demonstrates that by using conditional, longitudinal IRT models researchers can obtain stable estimates of psychometric characteristics from samples typically considered too small for rigorous psychometric modeling. Capitalizing on repeated measurements, it is possible to estimate psychometric characteristics for an assessment even when sample size is small. This allows researchers to optimize study designs and have increased confidence in subsequent comparisons using scores obtained from such models. While there are limitations and caveats to consider when using these models, longitudinal IRT modeling may be especially beneficial when developing measures for rare conditions and diseases in difficult-to-reach populations.
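
    The sample-size-by-time-points tradeoff can be illustrated with a deliberately crude toy: holding ability fixed over time, each added time point multiplies the number of responses available for estimating an item's difficulty. Real longitudinal IRT models ability change and within-person correlation; everything below, including all values, is hypothetical.

```python
import math
import random

def estimate_difficulty(n_persons, n_timepoints, true_b, seed=1):
    """Toy sketch: simulate Rasch responses to one item over repeated
    measurements and back out a crude difficulty estimate from the
    (smoothed) proportion of endorsements."""
    rng = random.Random(seed)
    correct, total = 0, 0
    for _ in range(n_persons):
        theta = rng.gauss(0.0, 1.0)          # person ability, held fixed over time
        for _ in range(n_timepoints):
            p = 1.0 / (1.0 + math.exp(-(theta - true_b)))
            correct += rng.random() < p      # simulated item response
            total += 1
    p_hat = (correct + 0.5) / (total + 1.0)  # smoothed proportion correct
    return -math.log(p_hat / (1.0 - p_hat))  # logit-scale difficulty estimate

# Same small sample, more time points -> many more responses per item
b_small = estimate_difficulty(n_persons=30, n_timepoints=1, true_b=0.0)
b_long = estimate_difficulty(n_persons=30, n_timepoints=8, true_b=0.0)
```

The marginal estimator here is biased by the ability distribution, which is precisely why a proper longitudinal IRT model, rather than proportion inversion, is used in practice; the toy only conveys how repeated measurements multiply the information per item.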

  1. Item response theory for measurement validity. (United States)

    Yang, Frances M; Kao, Solon T


    Item response theory (IRT) is an important method of assessing the validity of measurement scales that is underutilized in the field of psychiatry. IRT describes the relationship between a latent trait (e.g., the construct that the scale proposes to assess), the properties of the items in the scale, and respondents' answers to the individual items. This paper introduces the basic premise, assumptions, and methods of IRT. To help explain these concepts we generate a hypothetical scale using three items from a modified, binary (yes/no) response version of the Center for Epidemiological Studies-Depression scale that was administered to 19,399 respondents. We first conducted a factor analysis to confirm the unidimensionality of the three items and then proceeded with Mplus software to construct the 2-Parameter Logistic (2-PL) IRT model of the data, a method which allows for estimates of both item discrimination and item difficulty. The utility of this information both for clinical purposes and for scale construction purposes is discussed.

  2. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel


    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  3. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan


    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available, a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...

  4. Domain Theory, Its Models and Concepts

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt


    The Domain Theory can support design work and form elements of designers' mindsets and thereby their practice. It is a model-based theory, which means it is composed of concepts and models that explain certain design phenomena. Many similar theories are described in the literature, with differences in the set of concepts but presumably all valid. The Domain Theory does not aim to create normative methods but rather a collection of concepts related to design phenomena and industrial applications, especially for the DFX areas (not reported here) and for product modelling. The theory therefore contains a rich ontology of interrelated concepts. The Domain Theory cannot be falsified or proven, but its value may be seen as spanning from its range and productivity, as described in the article.

  5. Bayesian modeling of measurement error in predictor variables

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.


    It is shown that measurement error in predictor variables can be modeled using item response theory (IRT). The predictor variables, that may be defined at any level of an hierarchical regression model, are treated as latent variables. The normal ogive model is used to describe the relation between

  6. Item response theory - A first approach (United States)

    Nunes, Sandra; Oliveira, Teresa; Oliveira, Amílcar


    The Item Response Theory (IRT) has become one of the most popular scoring frameworks for measurement data, frequently used in computerized adaptive testing, cognitively diagnostic assessment and test equating. According to Andrade et al. (2000), IRT can be defined as a set of mathematical models (Item Response Models - IRM) constructed to represent the probability of an individual giving the right answer to an item of a particular test. The number of Item Response Models available for measurement analysis has increased considerably in the last fifteen years due to increasing computer power and due to a demand for accuracy and more meaningful inferences grounded in complex data. The developments in modeling with Item Response Theory were related to developments in estimation theory, most remarkably Bayesian estimation with Markov chain Monte Carlo algorithms (Patz & Junker, 1999). The popularity of Item Response Theory has also implied numerous overviews in books and journals, and many connections between IRT and other statistical estimation procedures, such as factor analysis and structural equation modeling, have been made repeatedly (van der Linden & Hambleton, 1997). As stated before, Item Response Theory covers a variety of measurement models, ranging from basic one-dimensional models for dichotomously and polytomously scored items and their multidimensional analogues to models that incorporate information about cognitive sub-processes which influence the overall item response process. The aim of this work is to introduce the main concepts associated with one-dimensional models of Item Response Theory, to specify the logistic models with one, two and three parameters, to discuss some properties of these models and to present the main estimation procedures.
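
    As a concrete instance of the estimation procedures mentioned, the maximum-likelihood ability estimate for a single examinee under the two-parameter logistic model can be found by Newton-Raphson, since the 2-PL log-likelihood is concave in theta. Item parameters and responses below are made up for illustration.

```python
import math

def p2pl(theta, a, b):
    # Two-parameter logistic item response function
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mle_theta(responses, items, iters=25):
    """Newton-Raphson maximum-likelihood ability estimate for one examinee
    under the 2-PL model (sketch; assumes a mixed right/wrong pattern so
    that a finite maximum exists)."""
    theta = 0.0
    for _ in range(iters):
        g, h = 0.0, 0.0
        for u, (a, b) in zip(responses, items):
            p = p2pl(theta, a, b)
            g += a * (u - p)              # score (first derivative of log-lik)
            h -= a * a * p * (1.0 - p)    # second derivative (always negative)
        theta -= g / h                    # Newton step
    return theta

items = [(1.0, -1.0), (1.2, 0.0), (0.9, 1.0), (1.5, 0.5)]  # illustrative (a, b)
theta_hat = mle_theta([1, 1, 0, 0], items)
```

For the logistic 2-PL, the observed and expected information coincide, so this iteration is also Fisher scoring; all-correct or all-incorrect patterns have no finite MLE and need a Bayesian (EAP/MAP) estimate instead.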

  7. [Modern testing theory and its application in the field of health measurement]. (United States)

    Wu, Da-rong


    This paper briefly introduces item response theory (IRT) as a typical representation of modern testing theory (MTT), and systematically reviews the processes and contents of the application of IRT in the area of health measurement, including, for example, item bank development, scale revision and computerized adaptive testing. The author presents the potential benefits and the notable problems during health measuring by IRT. Then, the author asserts the need for thorough assessment of feasibility when using the IRT in patient-reported outcome research. Further research based on IRT and computerized adaptive testing in health measurement will be carried out in the field of medical care including traditional Chinese medicine and integrative medicine.

  8. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J


    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  9. A Test Characteristic Curve Linking Method for the Testlet Model (United States)

    Li, Yanmei; Bolt, Daniel M.; Fu, Jianbin


    When tests are made up of testlets, a testlet-based item response theory (IRT) model may be used to account for local dependence among items from a common testlet. This study presents a new test characteristic curve method to link calibrations based on the Bradlow, Wainer, and Wang (1999) testlet model. Procedures for calculating the test…

  10. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco


    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  11. A Theory-Based Computer Tutorial Model. (United States)

    Dixon, Robert C.; Clapp, Elizabeth J.

    Because of the need for models to illustrate some possible answers to practical courseware development questions, a specific, three-section model incorporating the Corrective Feedback Paradigm (PCP) is advanced for applying theory to courseware. The model is reconstructed feature-by-feature against a framework of a hypothetical, one-to-one,…

  12. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti


    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute a first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  13. A Comparison between Linear IRT Observed-Score Equating and Levine Observed-Score Equating under the Generalized Kernel Equating Framework (United States)

    Chen, Haiwen


    In this article, linear item response theory (IRT) observed-score equating is compared under a generalized kernel equating framework with Levine observed-score equating for nonequivalent groups with anchor test design. Interestingly, these two equating methods are closely related despite being based on different methodologies. Specifically, when…

  14. Reducing Test Form Overlap of the GRE Subject Test in Mathematics Using IRT Triple-Part Equating. GRE Board Professional Report No. 86-14P. (United States)

    McKinley, Robert L.; Schaeffer, Gary A.

    A study was conducted to evaluate the feasibility of using item response theory (IRT) equating to reduce test form overlap of the Graduate Record Examinations (GRE) Subject Test in Mathematics. Monte Carlo methods were employed to compare double-part equating with 20-item common item blocks to triple-part equating with 10-item common item blocks.…

  15. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski


    Full Text Available Our paper is centered on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of this institutional modelling that were postulated by the classics of institutional theory and have found their way into the basics of institutional economics. We propose scientific ideas concerning new, innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalising our results and maximising the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  16. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.


    The randomized response (RR) technique is often used to obtain answers to sensitive questions, because direct questioning leads to biased results. A new method is developed to measure latent variables using the RR technique. Within the RR technique, the probability of the true response is modeled by
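
How an RR design enters an IRT measurement model can be sketched for a forced-response design: the observed "yes" probability mixes an IRT-modeled true-response probability with the randomizer's forced answers. A hedged illustration (the 2PL choice and the design constants c and d are assumptions for this sketch, not Fox's exact specification):

```python
import math

def p_true_positive(theta, a, b):
    # 2PL model for the latent true ("sensitive") response
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def p_observed_yes(theta, a, b, c=0.8, d=0.5):
    """Forced-response RR design: with probability c the respondent answers
    truthfully; with probability 1 - c a randomizer forces 'yes' with
    probability d. The observed probability masks the true one."""
    return c * p_true_positive(theta, a, b) + (1.0 - c) * d
```

Because the mixing constants c and d are known from the randomizing device, the latent 2PL parameters stay identifiable from the masked responses.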

  17. Supersymmetric SYK model and random matrix theory (United States)

    Li, Tianlin; Liu, Junyu; Xin, Yuan; Zhou, Yehao


    In this paper, we investigate the effect of supersymmetry on the symmetry classification of random matrix theory ensembles. We mainly consider the random matrix behaviors in the N=1 supersymmetric generalization of the Sachdev-Ye-Kitaev (SYK) model, a toy model for a two-dimensional quantum black hole with a supersymmetric constraint. Some analytical arguments and numerical results are given to show that the statistics of the supersymmetric SYK model can be interpreted as random matrix theory ensembles, with a different eight-fold classification from the original SYK model and some new features. The time-dependent evolution of the spectral form factor is also investigated, where predictions from random matrix theory govern the late-time behavior of the chaotic Hamiltonian with supersymmetry.

  18. Graphical Model Theory for Wireless Sensor Networks

    Energy Technology Data Exchange (ETDEWEB)

    Davis, William B.


    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; pattern classification; and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.

  19. Some Results in Dynamic Model Theory (United States)


    Kozen, Dexter. Science of Computer Programming 51 (2004) 3–22. At the first-order level, the paper recalls the definition of Tarskian frames over a first-order signature.

  20. Security Theorems via Model Theory

    Directory of Open Access Journals (Sweden)

    Joshua Guttman


    Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications (for all xs. phi) implies (for some ys. psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces (for all xs. phi) implies (for some ys. psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.

  1. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo


    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. A vacation queueing model, however, has an additional part - the vacation process, governed by a vacation policy - that can be characterized by three aspects: 1) the vacation start-up rule; 2) the vacation termination rule; and 3) the vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
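
For one classical special case treated in the vacation-queueing literature, the M/G/1 queue with multiple vacations, the mean wait decomposes into the ordinary M/G/1 wait plus the mean residual vacation time (the Fuhrmann-Cooper decomposition). A small sketch under that assumption (the function name is illustrative):

```python
def mg1_vacation_wait(lam, es, es2, ev, ev2):
    """Mean waiting time in an M/G/1 queue with multiple vacations via the
    Fuhrmann-Cooper decomposition: the Pollaczek-Khinchine M/G/1 wait
    lam*E[S^2] / (2*(1 - rho)) plus the mean residual vacation
    E[V^2] / (2*E[V])."""
    rho = lam * es
    assert rho < 1.0, "queue must be stable (rho < 1)"
    return lam * es2 / (2.0 * (1.0 - rho)) + ev2 / (2.0 * ev)

# Example: Poisson arrivals at rate 0.5, exponential service with mean 1
# (so E[S^2] = 2), exponential vacations with mean 1 (so E[V^2] = 2).
w = mg1_vacation_wait(0.5, 1.0, 2.0, 1.0, 2.0)
```

The decomposition makes the cost of the vacation policy explicit: the second term is exactly the extra delay relative to the vacation-free M/G/1 queue.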

  2. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D


    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  3. Studies on statistical models for polytomously scored test items

    NARCIS (Netherlands)

    Akkermans, Wies


    This dissertation, which is structured as a collection of self-contained papers, will be concerned mainly with differences between item response models. The purpose of item response theory (IRT) is estimation of a hypothesized latent variable, such as, for example, intelligence or ability in

  4. Applying multilevel item response theory to vision-related quality of life in Dutch visually impaired elderly. (United States)

    van Nispen, Ruth M A; Knol, Dirk L; Langelaan, Maaike; de Boer, Michiel R; Terwee, Caroline B; van Rens, Ger H M B


    Instead of applying the usual longitudinal methods to assess the outcome of low-vision rehabilitation services in terms of vision-related quality of life, a three-level Item Response Theory (IRT) method was proposed. The translated Vision-Related Quality of Life Core Measure (VCM1) and Low Vision Quality Of Life (LVQOL) questionnaires were used in a nonrandomized follow-up study among elderly patients (n = 296) referred to two different low-vision rehabilitation services in the Netherlands. Factor analysis was performed on the matrix of polychoric correlations to investigate (uni-)dimensionality and to prepare both questionnaires for the multilevel IRT analyses. A statistical model, which was characterized by a graded response model for rating scales, was developed. Threshold and item difficulty parameters and group by time-specific mean fixed effects were estimated. Random individual effects were predicted. Measurement invariance across occasions was tested. The VCM1 and the LVQOL "reading and fine work" dimension showed item parameter drift. In the multidisciplinary rehabilitation center patients, deterioration was found on the "mobility" dimension after 1 year and improvement was found on "adjustment" and "visual (motor) skills" after 5 months (p life. The results showed a change in only a limited number of individual patients. However, with regard to the field of low-vision rehabilitation, the proposed IRT method seemed to be successful in the follow-up of individuals. IRT specific software was unnecessary. The data did not have to be complete and the use of cumulative logits made the proposed IRT method an economical and efficient approach. Because of item parameter drift, the VCM1 was difficult to interpret. The use of multilevel IRT models with longitudinal data and dependent observations is recommended.
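
The graded response model used for the VCM1 and LVQOL rating scales can be sketched as cumulative 2PL curves whose adjacent differences give category probabilities; this is Samejima's graded response model in its basic single-level form (the multilevel version in the study adds random effects on top of it):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: cumulative probabilities
    P(X >= k) = logistic(a*(theta - b_k)) for increasing thresholds b_k;
    category probabilities are differences of adjacent cumulatives."""
    cum = ([1.0]
           + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
           + [0.0])
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# An item with 4 ordered categories (3 thresholds), discrimination 1.2:
probs = grm_category_probs(0.0, 1.2, [-1.0, 0.0, 1.0])
```

For any theta the category probabilities are nonnegative and sum to 1, provided the thresholds are strictly increasing.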

  5. Density functional theory and multiscale materials modeling*

    Indian Academy of Sciences (India)


    A wide class of problems involving nanomaterials, interfacial science and soft condensed matter has been addressed using density-based ... Keywords: density functional theory; soft condensed matter; materials modeling.

  6. Aligning Grammatical Theories and Language Processing Models (United States)

    Lewis, Shevaun; Phillips, Colin


    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  7. Recursive renormalization group theory based subgrid modeling (United States)

    Zhou, YE


    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  8. Item response theory in the production of indicators of socioeconomic metropolitan region of Maringá, Paraná State, Brazil - doi: 10.4025/actascitechnol.v34i4.10478

    Directory of Open Access Journals (Sweden)

    Vanessa Rufino da Silva


    Full Text Available This study aimed to identify and produce, through Item Response Theory (IRT) models, a socioeconomic indicator based on the items observed in the 2000 Census, following the methodology of Soares (2005). Under the IRT methodology this indicator, as a latent variable, is obtained through the construction of specific models and scales, making it possible to measure this variable; according to Andrade et al. (2000), IRT analyzes each item composing the measuring instrument. This case consists of binary or dichotomous items, which assess the possession of certain assets of domestic comfort. The characteristics of each item were analyzed, such as the discrimination and the income necessary for the possession of certain property. It was concluded that, with 13 items, a trustworthy questionnaire can be constructed for a socioeconomic index of Maringá's metropolitan region.
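
For dichotomous asset-ownership items like these, the item characteristics in question (discrimination, and the difficulty interpretable as the income needed to own the asset) come from a 2PL model; each item is most informative about households whose latent level sits near its difficulty. A minimal sketch:

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of owning the asset,
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a dichotomous 2PL item: a^2 * P * (1 - P),
    maximal at theta = b and larger for more discriminating items."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)
```

Summing item_information over the 13 items yields the test information curve, which shows over which range of the socioeconomic scale the questionnaire measures precisely.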

  9. Lattice gauge theories and spin models (United States)

    Mathur, Manu; Sreeraj, T. P.


    The Wegner Z2 gauge theory-Z2 Ising spin model duality in (2+1) dimensions is revisited and derived through a series of canonical transformations. The Kramers-Wannier duality is similarly obtained. The Wegner Z2 gauge-spin duality is directly generalized to SU(N) lattice gauge theory in (2+1) dimensions to obtain the SU(N) spin model in terms of the SU(N) magnetic fields and their conjugate SU(N) electric scalar potentials. The exact and complete solutions of the Z2, U(1), SU(N) Gauss law constraints in terms of the corresponding spin or dual potential operators are given. The gauge-spin duality naturally leads to a new gauge invariant magnetic disorder operator for SU(N) lattice gauge theory which produces a magnetic vortex on the plaquette. A variational ground state of the SU(2) spin model with nearest neighbor interactions is constructed to analyze SU(2) gauge theory.

  10. Density Functional Theory Models for Radiation Damage (United States)

    Dudarev, S. L.


    Density functional theory models developed over the past decade provide unique information about the structure of nanoscale defects produced by irradiation and about the nature of short-range interaction between radiation defects, clustering of defects, and their migration pathways. These ab initio models, involving no experimental input parameters, appear to be as quantitatively accurate and informative as the most advanced experimental techniques developed for the observation of radiation damage phenomena. Density functional theory models have effectively created a new paradigm for the scientific investigation and assessment of radiation damage effects, offering new insight into the origin of temperature- and dose-dependent response of materials to irradiation, a problem of pivotal significance for applications.

  11. Crack propagation modeling using Peridynamic theory (United States)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.


    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory based analysis tool is its unifying approach to material behavior modeling - irrespective of whether a crack has formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture, in which the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.

  12. Icing Research Tunnel (IRT) Force Measurement System (FMS) (United States)

    Roberts, Paul W.


    An Electronics Engineer at the Glenn Research Center (GRC), requested the NASA Engineering and Safety Center (NESC) provide technical support for an evaluation of the existing force measurement system (FMS) at the GRC's Icing Research Tunnel (IRT) with the intent of developing conceptual designs to improve the tunnel's force measurement capability in order to better meet test customer needs. This report contains the outcome of the NESC technical review.

  13. Clinical Application Of Advanced Infrared Thermography (IRT) In Locomotor Diseases (United States)

    Engel, Joachim-Michael


    Locomotor disease comprises a wide range of about 450 different illnesses, with different pathologies, clinical and prognostic features and responses to treatment. No single method will be able to cover the whole spectrum of local and systemic signs and symptoms. Nevertheless there is a need for objective measurements at the site of disease: clinical examination often depends on subjective estimations and the personal experience of the clinician. Laboratory tests only show the systemic effect of the disease, such as inflammation. X-rays are restricted to the detection of structural changes appearing late during the pathological process, even when using different techniques. Here IRT offers several advantages to the clinician as well as to the patient. As a non-invasive method it monitors the course of disease at the anatomic site of pathology. Quantitative figures calculated from the thermogram, either taken at steady state or during dynamic tests, are essential for differential diagnosis and follow-up. Advanced IRT camera systems fulfill all requirements recently set up for medical thermography by the National Bureau of Standards. Nevertheless, the user should check his system daily with regard to the precision of absolute temperature measurements. Standardisation of the recording technique is essential as well, to get reliable results. Ambient conditions must be adapted to the locomotor disease pathology under study. Advanced IRT systems, e.g. ZEISS-IKOTHERM, together with image processing capability and special software, e.g. the THERMOTOM package, are valuable tools to the rheumatologist for diagnosing and monitoring locomotor diseases.

  14. The infrared-optical telescope (IRT) of the EXIST observatory. (United States)

    Kutyrev, Alexander; Bloom, Joshua; Gehrels, Neil; Golisano, Craig; Gong, Quan; Grindlay, Jonathan; Moseley, Samuel; Woodgate, Bruce

    The IRT is a 1.1 m visible and infrared passively cooled telescope, which can locate, identify and obtain spectra of GRB afterglows at redshifts up to z ~ 20. It will also acquire optical-IR imaging and spectroscopy of AGN and transients discovered by the EXIST (The Energetic X-ray Imaging Survey Telescope). The IRT imaging and spectroscopic capabilities cover a broad spectral range from 0.3–2.2 μm in four bands. The identical fields of view in the four instrument bands are each split in three subfields: imaging, objective prism slitless for the field and objective prism single object slit low resolution spectroscopy, and high resolution long slit on single object. This allows the instrument to do simultaneous broadband photometry or spectroscopy of the same object over the full spectral range, thus greatly improving the efficiency of the observatory and its detection limits. A prompt follow-up (within three minutes) of the transient discovered by the EXIST makes IRT a unique tool for detection and study of these events, which is particularly valuable at wavelengths unavailable to ground-based observatories.

  15. The Infrared-Optical Telescope (IRT) of the Exist Observatory (United States)

    Kutyrev, Alexander; Bloom, Joshua; Gehrels, Neil; Golisano, Craig; Gong, Quan; Grindlay, Jonathan; Moseley, Samuel; Woodgate, Bruce


    The IRT is a 1.1 m visible and infrared passively cooled telescope, which can locate, identify and obtain spectra of GRB afterglows at redshifts up to z ~ 20. It will also acquire optical-IR imaging and spectroscopy of AGN and transients discovered by the EXIST (The Energetic X-ray Imaging Survey Telescope). The IRT imaging and spectroscopic capabilities cover a broad spectral range from 0.3–2.2 μm in four bands. The identical fields of view in the four instrument bands are each split in three subfields: imaging, objective prism slitless for the field and objective prism single object slit low resolution spectroscopy, and high resolution long slit on single object. This allows the instrument to do simultaneous broadband photometry or spectroscopy of the same object over the full spectral range, thus greatly improving the efficiency of the observatory and its detection limits. A prompt follow-up (within three minutes) of the transient discovered by the EXIST makes IRT a unique tool for detection and study of these events, which is particularly valuable at wavelengths unavailable to ground-based observatories.

  16. The Body Model Theory of Somatosensory Cortex. (United States)

    Brecht, Michael


    I outline a microcircuit theory of somatosensory cortex as a body model serving both for body representation and "body simulation." A modular model of innervated and non-innervated body parts resides in somatosensory cortical layer 4. This body model is continuously updated and compares to an avatar (an animatable puppet) rather than a mere sensory map. Superficial layers provide context and store sensory memories, whereas layer 5 provides motor output and stores motor memories. I predict that layer-6-to-layer-4 inputs initiate body simulations allowing rehearsal and risk assessment of difficult actions, such as jumps. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Statistical test theory for the behavioral sciences

    CERN Document Server

    de Gruijter, Dato N M


    Since the development of the first intelligence test in the early 20th century, educational and psychological tests have become important measurement techniques to quantify human behavior. Focusing on this ubiquitous yet fruitful area of research, Statistical Test Theory for the Behavioral Sciences provides both a broad overview and a critical survey of assorted testing theories and models used in psychology, education, and other behavioral science fields. Following a logical progression from basic concepts to more advanced topics, the book first explains classical test theory, covering true score, measurement error, and reliability. It then presents generalizability theory, which provides a framework to deal with various aspects of test scores. In addition, the authors discuss the concept of validity in testing, offering a strategy for evidence-based validity. In the two chapters devoted to item response theory (IRT), the book explores item response models, such as the Rasch model, and applications, incl...

  18. Topos models for physics and topos theory

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, Sander, E-mail: [Radboud Universiteit Nijmegen, Institute for Mathematics, Astrophysics, and Particle Physics (Netherlands)


    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  19. A Test of the Need Hierarchy Concept by a Markov Model of Change in Need Strength. (United States)

    Rauschenberger, John; And Others


    In this study of 547 high school graduates, Alderfer's and Maslow's need hierarchy theories were expressed in Markov chain form and were subjected to empirical test. Both models were disconfirmed. Corroborative multiwave correlational analysis also failed to support the need hierarchy concept. (Author/IRT)

  20. Sparse modeling theory, algorithms, and applications

    CERN Document Server

    Rish, Irina


    "A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come." - Ricardo Vilalta, Department of Computer Science, University of Houston. "This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course." - Francis Bach, INRIA - École Normale Supérieure, Paris
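
A canonical algorithm from the sparse-modeling toolbox such a text covers is iterative shrinkage-thresholding (ISTA) for the l1-penalized least-squares (lasso) problem; the key ingredient is the soft-thresholding proximal operator. A pure-Python sketch for small dense problems (illustrative, not code from the book):

```python
def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def ista(A, y, lam, step, iters=500):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    A is a list of rows; step should be at most 1/L, where L is the
    largest eigenvalue of A^T A, for the iteration to converge."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x
```

With A the identity the method reduces to soft-thresholding y directly, which makes the shrinkage effect of the l1 penalty easy to see: small coefficients are set exactly to zero.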

  1. Temperature characteristics modeling of Preisach theory

    Directory of Open Access Journals (Sweden)

    Chen Hao


    Full Text Available This paper proposes a method for modeling the temperature characteristics of Preisach theory. On the basis of the classical Preisach hysteresis model, the Curie temperature, the critical exponent and the ambient temperature are introduced, so that the effect of temperature on the magnetic properties of ferromagnetic materials can be accurately reflected. A simulation analysis and a temperature characteristic experiment with silicon steel were carried out. The simulated and experimental results are in close agreement, which supports the validity and accuracy of the method.
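
The classical Preisach model that the temperature extension builds on represents hysteresis as a weighted population of two-state relay operators (hysterons); the temperature-dependent version modulates this population, which is not reproduced here. A minimal isothermal sketch (names are illustrative):

```python
def preisach_output(field_history, hysterons):
    """Classical scalar Preisach model. Each hysteron is a tuple
    (alpha, beta, weight) with alpha >= beta: it switches to +1 when the
    input field exceeds alpha and to -1 when it falls below beta,
    otherwise it remembers its state. The output is the weighted sum of
    hysteron states, starting from negative saturation."""
    states = {h: -1 for h in hysterons}
    for H in field_history:
        for (alpha, beta, w) in hysterons:
            if H >= alpha:
                states[(alpha, beta, w)] = 1
            elif H <= beta:
                states[(alpha, beta, w)] = -1
    return sum(w * states[(a, b, w)] for (a, b, w) in hysterons)

# History dependence: the same final field value gives different outputs
# depending on the path taken, which is hysteresis.
up_then_zero = preisach_output([2.0, 0.0], [(1.0, -1.0, 1.0)])    # +1
down_then_zero = preisach_output([-2.0, 0.0], [(1.0, -1.0, 1.0)])  # -1
```

The nested loop is O(history x hysterons); practical implementations instead track the staircase interface on the Preisach plane, but the state logic is the same.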

  2. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.


    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TMS) program, which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  3. A Mixture Rasch Model with a Covariate: A Simulation Study via Bayesian Markov Chain Monte Carlo Estimation (United States)

    Dai, Yunyun


    Mixtures of item response theory (IRT) models have been proposed as a technique to explore response patterns in test data related to cognitive strategies, instructional sensitivity, and differential item functioning (DIF). Estimation proves challenging due to difficulties in identification and questions of effect size needed to recover underlying…

  4. Investigating Effect of Ignoring Hierarchical Data Structures on Accuracy of Vertical Scaling Using Mixed-Effects Rasch Model (United States)

    Wang, Shudong; Jiao, Hong; Jin, Ying; Thum, Yeow Meng


    The vertical scales of large-scale achievement tests created by using item response theory (IRT) models are mostly based on clustered (or correlated) educational data, in which students usually are clustered in certain groups or settings (classrooms or schools). Such application directly violates the assumption of an independent sample of persons in…

  5. An Aggregate IRT Procedure for Exploratory Factor Analysis (United States)

    Camilli, Gregory; Fox, Jean-Paul


    An aggregation strategy is proposed to potentially address practical limitations related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the stochastic approximation expectation maximization…

  6. An Aggregate IRT Procedure for Exploratory Factor Analysis

    NARCIS (Netherlands)

    Camilli, Gregory; Fox, Gerardus J.A.


    An aggregation strategy is proposed to potentially address practical limitations related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the

  7. Network Security Risk Assessment Based on Item Response Theory

    Directory of Open Access Journals (Sweden)

    Fangwei Li


    Owing to the one-sidedness of traditional risk assessment methods and their difficulty in reflecting the real network situation, a risk assessment method based on Item Response Theory (IRT) is put forward for network security. First of all, novel algorithms for calculating the threat of an attack and the probability of a successful attack are proposed by combining the IRT model with the Service Security Level. Secondly, the weight of service importance is calculated by the three-demarcation analytic hierarchy process. Finally, the risk situation graphs of the service, host, and network logic layers can be generated by the improved method. The simulation results show that this method gives more comprehensive consideration to the factors affecting network security and obtains a more realistic real-time network risk situation graph.

  8. Standard Model as a Double Field Theory. (United States)

    Choi, Kang-Sin; Park, Jeong-Hyuck


    We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes.

  9. Linear sigma model for multiflavor gauge theories (United States)

    Meurice, Y.


    We consider a linear sigma model describing $2N_f^2$ bosons ($\sigma$, $a_0$, $\eta'$, and $\pi$) as an approximate effective theory for an $SU(3)$ local gauge theory with $N_f$ Dirac fermions in the fundamental representation. The model has a renormalizable $U(N_f)_L \otimes U(N_f)_R$ invariant part, which has an approximate $O(2N_f^2)$ symmetry, and two additional terms, one describing the effects of an $SU(N_f)_V$ invariant mass term and the other the effects of the axial anomaly. We calculate the spectrum for arbitrary $N_f$. Using preliminary and published lattice results from the LatKMI collaboration, we found combinations of the masses that vary slowly with the explicit chiral symmetry breaking and $N_f$. This suggests that the anomaly term plays a leading role in the mass spectrum and that simple formulas such as $M_\sigma^2 \simeq (2/N_f - C_\sigma)M_{\eta'}^2$ should apply in the chiral limit. Lattice measurements of $M_{\eta'}^2$ and of approximate constants such as $C_\sigma$ could help in locating the boundary of the conformal window. We show that our calculation can be adapted for arbitrary representations of the gauge group and in particular to the minimal model with two sextets, where similar patterns are likely to apply.

  10. Standard Models from Heterotic M-theory

    CERN Document Server

    Donagi, Ron; Ovrut, Burt A.; Pantev, Tony; Waldram, Daniel


    We present a class of N=1 supersymmetric models of particle physics, derived directly from heterotic M-theory, that contain three families of chiral quarks and leptons coupled to the gauge group $SU(3)_C\times SU(2)_{L}\times U(1)_{Y}$. These models are a fundamental form of ``brane-world'' theories, with an observable and hidden sector each confined, after compactification on a Calabi-Yau threefold, to a BPS three-brane separated by a five-dimensional bulk space with size of the order of the intermediate scale. The requirement of three families, coupled to the fundamental conditions of anomaly freedom and supersymmetry, constrains these models to contain additional five-branes wrapped around holomorphic curves in the Calabi-Yau threefold. These five-branes ``live'' in the bulk space and represent new, non-perturbative aspects of these particle physics vacua. We discuss, in detail, the relevant mathematical structure of a class of torus-fibered Calabi-Yau threefolds with non-trivial first homotopy groups and ...

  11. A matrix model from string field theory

    Directory of Open Access Journals (Sweden)

    Syoji Zeze


    We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and $U(N)$ vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large $N$. An increase of the potential height is observed in both cases. The large $N$ matrix integral is identified with a system of $N$ ZZ branes and a ghost FZZT brane.

  12. Using chemical organization theory for model checking. (United States)

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter


    The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis, and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for the derivation of valuable information. In formalisms like the Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify more accurately [in 26 cases (14%)] those species and reactions that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. All data and a Java applet to check SBML models are available; Supplementary data are available at Bioinformatics online.

  13. Calibrating the Medical Council of Canada’s Qualifying Examination Part I using an integrated item response theory framework: a comparison of models and designs

    Directory of Open Access Journals (Sweden)

    Andre F. De Champlain


    Purpose: The aim of this research was to compare different methods of calibrating the multiple choice question (MCQ) and clinical decision making (CDM) components of the Medical Council of Canada’s Qualifying Examination Part I (MCCQEI) based on item response theory. Methods: Our data consisted of test results from 8,213 first-time applicants to the MCCQEI in the spring and fall 2010 and 2011 test administrations. The data set contained several thousand multiple choice items and several hundred CDM cases. Four dichotomous calibrations were run using BILOG-MG 3.0. All 3 mixed item format (dichotomous MCQ responses and polytomous CDM case scores) calibrations were conducted using PARSCALE 4. Results: The 2-PL model had identical numbers of items with chi-square values at or below a Type I error rate of 0.01 (83/3,499, or 0.02). In all 3 polytomous models, whether the MCQs were anchored or concurrently run with the CDM cases, results suggest very poor fit. All IRT abilities estimated from dichotomous calibration designs correlated very highly with each other. IRT-based pass-fail rates were extremely similar, not only across calibration designs and methods, but also with regard to the actual decision reported to candidates. The largest difference noted in pass rates was 4.78%, which occurred between the mixed format concurrent 2-PL graded response model (pass rate = 80.43%) and the dichotomous anchored 1-PL calibrations (pass rate = 85.21%). Conclusion: Simpler calibration designs with dichotomized items should be implemented. The dichotomous calibrations provided better fit of the item response matrix than the more complex, polytomous calibrations.

  14. mirt: A Multidimensional Item Response Theory Package for the R Environment

    Directory of Open Access Journals (Sweden)

    R. Philip Chalmers


    Item response theory (IRT) is widely used in assessment and evaluation research to explain how participants respond to item level stimuli. Several R packages can be used to estimate the parameters in various IRT models, the most flexible being the ltm (Rizopoulos 2006), eRm (Mair and Hatzinger 2007), and MCMCpack (Martin, Quinn, and Park 2011) packages. However, these packages have limitations: ltm and eRm can only analyze unidimensional IRT models effectively, and the exploratory multidimensional extensions available in MCMCpack require prior understanding of Bayesian estimation convergence diagnostics and are computationally intensive. Most importantly, multidimensional confirmatory item factor analysis methods have not been implemented in any R package. The mirt package was created for estimating multidimensional item response theory parameters for exploratory and confirmatory models by using maximum-likelihood methods. The Gauss-Hermite quadrature method used in traditional EM estimation (e.g., Bock and Aitkin 1981) is presented for exploratory item response models as well as for confirmatory bifactor models (Gibbons and Hedeker 1992). Exploratory and confirmatory models are estimated by a stochastic algorithm described by Cai (2010a,b). Various program comparisons are presented and future directions for the package are discussed.

  15. Quality of life in the Danish general population--normative data and validity of WHOQOL-BREF using Rasch and item response theory models

    DEFF Research Database (Denmark)

    Noerholm, V; Groenvold, M; Watt, T


    BACKGROUND: The main objective of this study was to investigate the construct validity of the WHOQOL-BREF by use of Rasch and Item Response Theory models and to examine the stability of the model across high/low scoring individuals, gender, education, and depressive illness. Furthermore … population. The response rate was 68.5%, and the sample reported here contained 1101 respondents: 578 women and 519 men (four respondents did not indicate their genders). RESULTS: Each of the four domains of the WHOQOL-BREF scale fitted a two-parameter IRT model, but did not fit the Rasch model. Due to multidimensionality, the total score of 26 items fitted neither model. Regression analysis was carried out, showing a level of explained variance of between 10 and 14%. The mean scores of the WHOQOL-BREF are reported as normative data for the general Danish population. CONCLUSION: The profile of the four WHOQOL…

  16. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter


    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  17. What does the Beck Depression Inventory measure in myocardial infarction patients? A psychometric approach using item response theory and person-fit. (United States)

    Wardenaar, Klaas J; Wanders, Rob B K; Roest, Annelieke M; Meijer, Rob R; De Jonge, Peter


    Observed associations between depression following myocardial infarction (MI) and adverse cardiac outcomes could be overestimated due to patients' tendency to over-report somatic depressive symptoms. This study aimed to investigate this issue with modern psychometrics, using item response theory (IRT) and person-fit statistics to investigate whether the Beck Depression Inventory (BDI) measures depression or something else among MI patients. An IRT model was fit to BDI data of 1,135 MI patients. Patients' adherence to this IRT model was investigated with person-fit statistics. Subgroups of "atypical" (low person-fit) and "prototypical" (high person-fit) responders were identified and compared in terms of item-response patterns, psychiatric diagnoses, socio-demographics, and somatic factors. In the IRT model, somatic items had lower thresholds compared to depressive mood/cognition items. Empirically identified "atypical" responders (n = 113) had more depressive mood/cognitions, scored lower on somatic items, and more often had a Comprehensive International Diagnostic Interview (CIDI) depressive diagnosis than "prototypical" responders (n = 147). Additionally, "atypical" responders were younger and more likely to smoke. In conclusion, the BDI measures somatic symptoms in most MI patients, but measures depression in a subgroup of patients with atypical response patterns. The presented approach to account for interpersonal differences in item responding could help improve the validity of depression assessments in somatic patients. Copyright © 2015 John Wiley & Sons, Ltd.
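    The person-fit idea used above can be made concrete with the standardized log-likelihood statistic lz (Drasgow and colleagues): response patterns that are unlikely under the fitted IRT model get strongly negative values. A minimal sketch under a 2PL model, with made-up item parameters rather than the study's actual BDI estimates:

    ```python
    import math

    def p_2pl(theta, a, b):
        """2PL probability of endorsing an item at ability theta."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def lz_person_fit(responses, probs):
        """Standardized log-likelihood person-fit statistic lz.

        Large negative values flag response patterns that are unlikely
        under the fitted IRT model ("atypical" responders)."""
        l0 = sum(u * math.log(p) + (1 - u) * math.log(1 - p)
                 for u, p in zip(responses, probs))
        expected = sum(p * math.log(p) + (1 - p) * math.log(1 - p)
                       for p in probs)
        variance = sum(p * (1 - p) * math.log(p / (1 - p)) ** 2
                       for p in probs)
        return (l0 - expected) / math.sqrt(variance)

    # Hypothetical items of increasing difficulty; person at theta = 0.
    difficulties = [-2.0, -1.0, 0.0, 1.0, 2.0]
    probs = [p_2pl(0.0, 1.0, b) for b in difficulties]

    prototypical = [1, 1, 1, 0, 0]   # endorses only the easy items
    atypical = [0, 0, 1, 1, 1]       # endorses only the hard items

    lz_proto = lz_person_fit(prototypical, probs)
    lz_atyp = lz_person_fit(atypical, probs)
    ```

    With these illustrative parameters the prototypical pattern yields an lz near zero or positive, while the reversed pattern yields a strongly negative lz, mirroring the high/low person-fit split used to define the two subgroups.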

  18. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás


    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  19. Conformal Field Theory Applied to Loop Models (United States)

    Jacobsen, Jesper Lykke

    The application of methods of quantum field theory to problems of statistical mechanics can in some sense be traced back to Onsager's 1944 solution [1] of the two-dimensional Ising model. It does however appear fair to state that the 1970's witnessed a real gain of momentum for this approach, when Wilson's ideas on scale invariance [2] were applied to study critical phenomena, in the form of the celebrated renormalisation group [3]. In particular, the so-called ɛ expansion permitted the systematic calculation of critical exponents [4], as formal power series in the space dimensionality d, below the upper critical dimension d c . An important lesson of these efforts was that critical exponents often do not depend on the precise details of the microscopic interactions, leading to the notion of a restricted number of distinct universality classes.

  20. The multi-dimensional model of Māori identity and cultural engagement: item response theory analysis of scale properties. (United States)

    Sibley, Chris G; Houkamau, Carla A


    We argue that there is a need for culture-specific measures of identity that delineate the factors that most make sense for specific cultural groups. One such measure, recently developed specifically for Māori peoples, is the Multi-Dimensional Model of Māori Identity and Cultural Engagement (MMM-ICE). Māori are the indigenous peoples of New Zealand. The MMM-ICE is a 6-factor measure that assesses the following aspects of identity and cultural engagement as Māori: (a) group membership evaluation, (b) socio-political consciousness, (c) cultural efficacy and active identity engagement, (d) spirituality, (e) interdependent self-concept, and (f) authenticity beliefs. This article examines the scale properties of the MMM-ICE using item response theory (IRT) analysis in a sample of 492 Māori. The MMM-ICE subscales showed reasonably even levels of measurement precision across the latent trait range. Analysis of age (cohort) effects further indicated that most aspects of Māori identification tended to be higher among older Māori, and these cohort effects were similar for both men and women. This study provides novel support for the reliability and measurement precision of the MMM-ICE. The study also provides a first step in exploring change and stability in Māori identity across the life span. A copy of the scale, along with recommendations for scale scoring, is included.

  1. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar


    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism, makes the connections to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  2. Modeling Poker Challenges by Evolutionary Game Theory

    Directory of Open Access Journals (Sweden)

    Marco Alberto Javarone


    We introduce a model for studying the evolutionary dynamics of Poker. Notably, despite its wide diffusion and the scientific interest raised around it, Poker still represents an open challenge. Recent attempts at uncovering its real nature, based on statistical physics, showed that Poker in some conditions can be considered a skill game. In addition, preliminary investigations reported a neat difference between tournaments and ‘cash game’ challenges, i.e., between the two main configurations for playing Poker. Notably, these previous models analyzed populations composed of rational and irrational agents, identifying as the former those that play Poker by using a mathematical strategy and as the latter those playing randomly. Remarkably, tournaments require very few rational agents to make Poker a skill game, while ‘cash game’ play may require several rational agents to avoid being classified as gambling. In addition, when the agent interactions are based on the ‘cash game’ configuration, the population shows an interesting bistable behavior that deserves further attention. In the proposed model, we aim to study the evolutionary dynamics of Poker by using the framework of Evolutionary Game Theory, in order to get further insights into its nature and to better clarify those points that remained open in previous works (such as the mentioned bistable behavior). In particular, we analyze the dynamics of an agent population composed of rational and irrational agents that modify their behavior driven by two possible mechanisms: self-evaluation of the gained payoff, and social imitation. The results allow us to identify a relation between the mechanisms for updating the agents’ behavior and the final equilibrium of the population. Moreover, the proposed model provides further details on the bistable behavior observed in the ‘cash game’ configuration.

  3. Immediate list recall as a measure of short-term episodic memory: insights from the serial position effect and item response theory. (United States)

    Gavett, Brandon E; Horwitz, Julie E


    The serial position effect shows that two interrelated cognitive processes underlie immediate recall of a supraspan word list. The current study used item response theory (IRT) methods to determine whether the serial position effect poses a threat to the construct validity of immediate list recall as a measure of verbal episodic memory. Archival data were obtained from a national sample of 4,212 volunteers aged 28-84 in the Midlife Development in the United States study. Telephone assessment yielded item-level data for a single immediate recall trial of the Rey Auditory Verbal Learning Test (RAVLT). Two-parameter logistic IRT procedures were used to estimate item parameters and the Q(1) statistic was used to evaluate item fit. A two-dimensional model better fit the data than a unidimensional model, supporting the notion that list recall is influenced by two underlying cognitive processes. IRT analyses revealed that 4 of the 15 RAVLT items (1, 12, 14, and 15) were misfit (p … measure of episodic memory and may provide misleading results. IRT methods can ameliorate these problems and improve construct validity.

  4. ASCAL: A Microcomputer Program for Estimating Logistic IRT Item Parameters. (United States)

    Vale, C. David; Gialluca, Kathleen A.

    ASCAL is a microcomputer-based program for calibrating items according to the three-parameter logistic model of item response theory. It uses a modified multivariate Newton-Raphson procedure for estimating item parameters. This study evaluated this procedure using Monte Carlo simulation techniques. The current version of ASCAL was then compared to…
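    The three-parameter logistic model that ASCAL calibrates has a simple closed form; a hedged sketch with illustrative parameter values (this is the item response function only, not ASCAL's estimation code):

    ```python
    import math

    def p_3pl(theta, a, b, c, D=1.7):
        """Three-parameter logistic IRT model: probability of a correct
        response given ability theta, discrimination a, difficulty b,
        and pseudo-guessing lower asymptote c. D = 1.7 scales the
        logistic curve to approximate the normal ogive."""
        return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

    # At theta == b the probability is halfway between c and 1.
    p_mid = p_3pl(theta=0.0, a=1.0, b=0.0, c=0.2)    # (1 + 0.2) / 2 = 0.6
    # Far below the difficulty, the curve approaches the guessing floor c.
    p_low = p_3pl(theta=-10.0, a=1.0, b=0.0, c=0.2)
    ```

    Calibration then amounts to choosing (a, b, c) for each item to maximize the likelihood of the observed responses, e.g., via the Newton-Raphson iterations the abstract mentions.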

  5. Measuring organizational effectiveness in information and communication technology companies using item response theory. (United States)

    Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco


    The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness to guide the construction of the questionnaire items, which were submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from the point of view of a manager by using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling permits us to evaluate the quality and properties of each item and to place items and respondents on a single scale, which is not possible when using other similar tools.

  6. Theory and Modeling in Support of Tether (United States)

    Chang, C. L.; Bergeron, G.; Drobot, A. D.; Papadopoulos, K.; Riyopoulos, S.; Szuszczewicz, E.


    This final report summarizes the work performed by SAIC's Applied Physics Operation on the modeling and support of the Tethered Satellite System missions (TSS-1 and TSS-1R). The SAIC team, known as the Theory and Modeling in Support of Tether (TMST) investigation, was one of the original twelve teams selected in July 1985 for the first TSS mission. The accomplishments described in this report cover the period December 19, 1985 to September 30, 1999 and are the result of a continuous effort aimed at supporting the TSS missions in the following major areas. During the contract period, SAIC's TMST investigation acted to: participate in the planning and the execution of both of the TSS missions; provide scientific understanding of the issues involved in electrodynamic tether system operation prior to the TSS missions; predict ionospheric conditions encountered during the re-flight mission (TSS-1R) based on real-time global ionosonde data; perform post-mission analyses to enhance our understanding of the TSS results (specifically, we have 1) constructed and improved current collection models and enhanced our understanding of the current-voltage data; 2) investigated the effects of neutral gas in the current collection processes; 3) conducted laboratory experiments to study the discharge phenomena during and after tether break; and 4) performed numerical simulations to understand data collected by the plasma instrument SPES onboard the TSS satellite); and design and produce a multi-media CD that highlights TSS mission achievements and conveys knowledge of tether technology to the general public. Along with discussions of this work, a list of publications and presentations derived from the TMST investigation spanning the reporting period is compiled.

  7. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory (United States)

    Gopnik, Alison; Wellman, Henry M.


    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  8. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.


    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo… that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  9. Catastrophe Theory: A Unified Model for Educational Change. (United States)

    Cryer, Patricia; Elton, Lewis


    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  10. A Leadership Identity Development Model: Applications from a Grounded Theory (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.


    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  11. The big bang theory and Universe modeling. Mistakes in the relativity theory


    Javadov, Khaladdin; Javadli, Elmaddin


    This article is about the Big Bang theory and describes some details of Universe modeling: the physical and mathematical modeling of Universe formation, and the application of mathematical and physical formulas to Universe calculations.

  12. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.


    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension, as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  13. Structure and Measurement of Depression in Youths: Applying Item Response Theory to Clinical Data (United States)

    Cole, David A.; Cai, Li; Martin, Nina C.; Findling, Robert L.; Youngstrom, Eric A.; Garber, Judy; Curry, John F.; Hyde, Janet S.; Essex, Marilyn J.; Compas, Bruce E.; Goodyer, Ian M.; Rohde, Paul; Stark, Kevin D.; Slattery, Marcia J.; Forehand, Rex


    Our goals in this article were to use item response theory (IRT) to assess the relation of depressive symptoms to the underlying dimension of depression and to demonstrate how IRT-based measurement strategies can yield more reliable data about depression severity than conventional symptom counts. Participants were 3,403 children and adolescents…

  14. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco


    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  15. Item response theory analysis of the Lichtenberg Financial Decision Screening Scale. (United States)

    Teresi, Jeanne A; Ocepek-Welikson, Katja; Lichtenberg, Peter A


    The focus of these analyses was to examine the psychometric properties of the Lichtenberg Financial Decision Screening Scale (LFDSS). The purpose of the screen was to evaluate the decisional abilities and vulnerability to exploitation of older adults. Adults aged 60 and over were interviewed by social, legal, financial, or health services professionals who underwent in-person training on the administration and scoring of the scale. Professionals provided a rating of the decision-making abilities of the older adult. The analytic sample included 213 individuals with an average age of 76.9 (SD = 10.1). The majority (57%) were female. Data were analyzed using item response theory (IRT) methodology. The results supported the unidimensionality of the item set. Several IRT models were tested. Ten ordinal and binary items evidenced a slightly higher reliability estimate (0.85) than other versions and better coverage in terms of the range of reliable measurement across the continuum of financial incapacity.

  16. Electroweak theory and the Standard Model

    CERN Multimedia

    CERN. Geneva; Giudice, Gian Francesco


    The theory of the ElectroWeak (EW) Interactions splits naturally into four sectors, at rather different levels of development and testing. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.

  17. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan


    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithography theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  18. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio


    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  19. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  20. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables, (x_t, w_t), then selection over the second...

  1. Program evaluation models and related theories: AMEE guide no. 67. (United States)

    Frye, Ann W; Hemmer, Paul A


    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs.

  2. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

    A non-static plane symmetric cosmological model in Wesson's scale invariant theory of gravitation with a time-dependent gauge function is investigated. The false vacuum model of the universe is constructed and some physical properties of the model are discussed.

  3. Rigid aleph_epsilon-saturated models of superstable theories


    Shami, Ziv; Shelah, Saharon


    In a countable superstable NDOP theory, the existence of a rigid aleph_epsilon-saturated model implies the existence of 2^lambda rigid aleph_epsilon-saturated models of power lambda for every lambda>2^{aleph_0}.

  4. The design and capabilities of the EXIST optical and infra-red telescope (IRT) (United States)

    Kutyrev, A. S.; Moseley, S. H.; Golisano, C.; Gong, Q.; Allen, B. T.; Gehrels, N.; Grindlay, J. E.; Hong, J. S.; Woodgate, B. E.


    The Infra-Red Telescope (IRT) is a critical element of the EXIST (Energetic X-Ray Imaging Survey Telescope) observatory. The primary goal of the IRT is to obtain photometric and spectroscopic measurements of high redshift (>=6) gamma ray bursts, reaching to the epoch of reionization. The photometric and spectral capabilities of the IRT will allow GRB afterglows to be used as probes of the composition and ionization state of the intergalactic medium of the young universe. A prompt follow-up (within three minutes) of transients discovered by EXIST makes the IRT a unique tool for the detection and study of these events at infrared and optical wavelengths, which is particularly valuable at wavelengths unavailable to ground-based observatories. We present the results of the mission study development on the IRT as part of the EXIST observatory.

  5. The Design and Capabilities of the EXIST Optical and Infra-Red Telescope (IRT) (United States)

    Kutyrev, A S.; Moseley, S. H.; Golisano, C.; Gong, Q.; Allen, B. T.; Gehrels, N.; Grindlay, J. E.; Hong, J. S.; Woodgate, B. E.


    The Infra-Red Telescope (IRT) is a critical element of the EXIST (Energetic X-Ray Imaging Survey Telescope) observatory. The primary goal of the IRT is to obtain photometric and spectroscopic measurements of high redshift (>=6) gamma ray bursts, reaching to the epoch of reionization. The photometric and spectral capabilities of the IRT will allow GRB afterglows to be used as probes of the composition and ionization state of the intergalactic medium of the young universe. A prompt follow-up (within three minutes) of transients discovered by EXIST makes the IRT a unique tool for the detection and study of these events at infrared and optical wavelengths, which is particularly valuable at wavelengths unavailable to ground-based observatories. We present the results of the mission study development on the IRT as part of the EXIST observatory. Keywords: infrared spectroscopy, space telescope, gamma ray bursts, early universe

  6. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto


    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  7. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp


    We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  8. Theories, models and urban realities. From New York to Kathmandu

    Directory of Open Access Journals (Sweden)

    Román Rodríguez González


    At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.

  9. Theories, models and urban realities. From New York to Kathmandu

    Directory of Open Access Journals (Sweden)

    José Somoza Medina


    At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.

  10. Model-Based Learning: A Synthesis of Theory and Research (United States)

    Seel, Norbert M.


    This article provides a review of theoretical approaches to model-based learning and related research. In accordance with the definition of model-based learning as an acquisition and utilization of mental models by learners, the first section centers on mental model theory. In accordance with epistemology of modeling the issues of semantics,…

  11. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory (United States)

    Silva, Walter A.


    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.

  12. Dimensional reduction of Markov state models from renormalization group theory. (United States)

    Orioli, S; Faccioli, P


    Renormalization Group (RG) theory provides the theoretical framework to define rigorous effective theories, i.e., systematic low-resolution approximations of arbitrary microscopic models. Markov state models are shown to be rigorous effective theories for Molecular Dynamics (MD). Based on this fact, we use real space RG to vary the resolution of the stochastic model and define an algorithm for clustering microstates into macrostates. The result is a lower dimensional stochastic model which, by construction, provides the optimal coarse-grained Markovian representation of the system's relaxation kinetics. To illustrate and validate our theory, we analyze a number of test systems of increasing complexity, ranging from synthetic toy models to two realistic applications, built from all-atom MD simulations. The computational cost of computing the low-dimensional model remains affordable on a desktop computer even for thousands of microstates.
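    As a toy illustration of the clustering idea described above (a minimal sketch of lumping, not the authors' RG algorithm; the matrix and cluster assignment are invented), the following coarse-grains a four-microstate transition matrix into two macrostates, weighting each microstate by its stationary probability:

```python
import numpy as np

# Hypothetical 4-microstate transition matrix (rows sum to 1).
T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.10, 0.88, 0.01, 0.01],
              [0.01, 0.01, 0.90, 0.08],
              [0.01, 0.01, 0.10, 0.88]])

# Stationary distribution: left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Assumed microstate-to-macrostate clustering (in practice chosen from kinetics).
clusters = [[0, 1], [2, 3]]
K = len(clusters)
T_macro = np.zeros((K, K))
for a, A in enumerate(clusters):
    wA = pi[A] / pi[A].sum()  # conditional weights of microstates within macrostate a
    for b, B in enumerate(clusters):
        T_macro[a, b] = wA @ T[np.ix_(A, B)].sum(axis=1)

print(T_macro)  # a 2x2 stochastic matrix: rows sum to 1
```

    The coarse-grained chain keeps the slow inter-cluster hopping while discarding the fast within-cluster detail.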

  13. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...

  14. Increasing the Number of Replications in Item Response Theory Simulations: Automation through SAS and Disk Operating System (United States)

    Gagne, Phill; Furlow, Carolyn; Ross, Terris


    In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…

  15. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.


    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  16. Theory and model use in social marketing health interventions. (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne


    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  17. An Introduction to the DA-T Gibbs Sampler for the Two-Parameter Logistic (2PL) Model and beyond (United States)

    Maris, Gunter; Bechger, Timo M.


    The DA-T Gibbs sampler is proposed by Maris and Maris (2002) as a Bayesian estimation method for a wide variety of "Item Response Theory (IRT) models". The present paper provides an expository account of the DA-T Gibbs sampler for the 2PL model. However, the scope is not limited to the 2PL model. It is demonstrated how the DA-T Gibbs…
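    For context, the 2PL model named above gives the probability of a correct response as a logistic function of ability; a minimal sketch in standard IRT notation (illustrative parameter values, not part of the DA-T sampler itself):

```python
import numpy as np

# Two-parameter logistic (2PL) item response function:
# P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))
# where theta is ability, a is item discrimination, b is item difficulty.
def p_2pl(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.array([-2.0, 0.0, 2.0])
print(p_2pl(theta, a=1.5, b=0.0))  # at theta == b the probability is exactly 0.5
```

    A Gibbs sampler such as DA-T treats the latent abilities and item parameters as unknowns and samples them in turn from their conditional posteriors.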

  18. Measurement Models for Reasoned Action Theory


    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin


    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  19. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael


    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
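    The idea of training a Markov chain on player actions and scoring it with information theory can be sketched as follows (the action sequence, Laplace smoothing, and surprisal measure are our own illustrative choices, not the paper's implementation):

```python
import math
from collections import Counter

# Toy action log for one player; letters stand for in-game actions.
actions = list("ABABABABCABABAB")
states = sorted(set(actions))
counts = Counter(zip(actions, actions[1:]))  # first-order transition counts

def prob(s, t, alpha=1.0):
    """Laplace-smoothed probability of moving from action s to action t."""
    total = sum(counts[(s, u)] for u in states) + alpha * len(states)
    return (counts[(s, t)] + alpha) / total

# Average surprisal (bits per action) of the model on the same sequence:
# highly routinized (predictable) play yields low values.
surprisal = -sum(math.log2(prob(s, t)) for s, t in zip(actions, actions[1:]))
print(surprisal / (len(actions) - 1))
```

    In a full study the model would be trained online and evaluated on held-out play rather than on the training sequence.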

  20. Posterior Predictive Model Checking for Multidimensionality in Item Response Theory (United States)

    Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip


    If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…

  1. Holomorphy without supersymmetry in the Standard Model Effective Field Theory

    Directory of Open Access Journals (Sweden)

    Rodrigo Alonso


    The anomalous dimensions of dimension-six operators in the Standard Model Effective Field Theory (SMEFT) respect holomorphy to a large extent. The holomorphy conditions are reminiscent of supersymmetry, even though the SMEFT is not a supersymmetric theory.

  2. Reframing Leadership Pedagogy through Model and Theory Building. (United States)

    Mello, Jeffrey A.


    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  3. Theory analysis of the Dental Hygiene Human Needs Conceptual Model. (United States)

    MacDonald, L; Bowen, D M


    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical, simple, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)


    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  5. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic


    This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys the model specifications and the estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC, Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui, 2002, and Engle, 2002. I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The methods used for parameter estimation are maximum log-likelihood (in the BEKK and DVEC models) and a two-step approach (in the CCC model).
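    The CCC model mentioned above factors the conditional covariance matrix as H_t = D_t R D_t, where D_t is diagonal with the univariate GARCH conditional standard deviations and R is a constant correlation matrix; a tiny numerical sketch with invented numbers:

```python
import numpy as np

# Illustrative conditional standard deviations for two assets at time t,
# as would be produced by two univariate GARCH(1,1) models.
sigmas = np.array([0.02, 0.03])

# Constant conditional correlation matrix (assumed, not estimated here).
R = np.array([[1.0, 0.4],
              [0.4, 1.0]])

D = np.diag(sigmas)
H = D @ R @ D  # conditional covariance matrix H_t = D_t R D_t
print(H)       # diagonal entries are the variances sigma_i^2
```

    The two-step estimation the article refers to fits the univariate GARCH models first, then estimates R from the standardized residuals.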

  6. A novel design of feedback control system for plasma horizontal position in IR-T1 tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Naghidokht, A.; Khodabakhsh, R. [Department of physics, Urmia University, Urmia (Iran, Islamic Republic of); Salar Elahi, A., E-mail: [Plasma Physics Research Center, Science and Research Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Ghoranneviss, M. [Plasma Physics Research Center, Science and Research Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of)


    Accurate determination of the plasma horizontal position during plasma discharge is essential input for a feedback-based control system. Using the linearized plasma-circuit model, Proportional Integral Derivative (PID) controllers, and a first-order transfer function representing the power supply (PS) dynamics of the vertical coil system for the IR-T1 tokamak, we analyzed the step feedback response of the overall IR-T1 system and the corresponding Bode diagrams for two cases, with and without the plasma resistance and the eddy current distribution. We also performed experiments to determine the plasma horizontal displacement in this tokamak, using four magnetic probes installed on a circular contour of the tokamak. These data were used as input to the feedback controller to validate its performance. The feedback response analysis shows that the controller performs well. Because of the approximations made in the controller design, construction, installation and implementation of the controller are still necessary, and this is the purpose of our future work.
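    The PID structure referred to above can be illustrated with a minimal discrete loop on a generic first-order plant (the gains, plant constants, and time step are invented for the sketch and are not IR-T1 values):

```python
# Minimal discrete PID loop on a generic first-order plant
# tau * dy/dt = -y + gain * u, integrated with forward Euler.
def simulate(kp=2.0, ki=1.0, kd=0.1, dt=0.01, steps=2000, setpoint=1.0):
    tau, gain = 1.0, 1.0
    y, integ, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt                  # integral term removes steady-state offset
        deriv = (err - prev_err) / dt      # derivative term damps fast changes
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        y += dt * (-y + gain * u) / tau    # plant update
    return y

print(simulate())  # settles near the setpoint of 1.0
```

    In the tokamak case the "plant" would be the plasma-circuit model and the actuator the vertical field coil power supply.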

  7. An Evolutionary Game Theory Model of Spontaneous Brain Functioning

    National Research Council Canada - National Science Library

    Dario Madeo; Agostino Talarico; Alvaro Pascual-Leone; Chiara Mocenni; Emiliano Santarnecchi


    ... conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN...

  8. Neurocognitive networks: findings, models, and theory. (United States)

    Meehan, Timothy P; Bressler, Steven L


    Through its early history, cognitive neuroscience largely followed a modular paradigm wherein high-level cognitive functions were mapped onto locally segregated brain regions. However, recent evidence drives a continuing shift away from modular theories of cognitive brain function, and toward theories which hold that cognition arises from the integrated activity of large-scale, distributed networks of brain regions. A growing consensus favors the fundamental concept of this new paradigm: the large-scale cognitive brain network, or neurocognitive network. This consensus was the motivation for Neurocognitive Networks 2010 (NCN 2010), a conference sponsored by the Cognitive Neuroscience Program of the National Science Foundation, organized by Drs. Steven Bressler and Craig Richter of Florida Atlantic University (FAU), and held at FAU in Boca Raton, FL on January 29-30, 2010. NCN 2010 gathered together some of today's leading investigators of neurocognitive networks. This paper serves to review their presentations as they relate to the paradigm of neurocognitive networks, as well as to compile the emergent themes, questions, and possible future research directions that arose from the conference. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Conceptual development: an adaptive resonance theory model of polysemy (United States)

    Dunbar, George L.


    Adaptive Resonance Theory provides a model of pattern classification that addresses the plasticity--stability dilemma and allows a neural network to detect when to construct a new category without the assistance of a supervisor. We show that Adaptive Resonance Theory can be applied to the study of natural concept development. Specifically, a model is presented which is able to categorize different usages of a common noun and group the polysemous senses appropriately.

  10. Translating caring theory into practice: the Carolina Care Model. (United States)

    Tonges, Mary; Ray, Joel


    This article describes how one organization operationalized Swanson Caring Theory and changed practice to ensure consistently high standards of performance. The Carolina Care Model developed at the University of North Carolina Hospitals is designed to actualize caring theory, support practices that promote patient satisfaction, and transform cultural norms. Evaluation suggests that this approach to care delivery enhances patients' and families' hospital experience and facilitates desired outcomes. The authors outline the Professional Practice Model, key characteristics of Carolina Care, links to caring theory, and development and implementation methodologies.

  11. The Number of Atomic Models of Uncountable Theories


    Ulrich, Douglas


    We show there exists a complete theory in a language of size continuum possessing a unique atomic model which is not constructible. We also show it is consistent with $ZFC + \\aleph_1 < 2^{\\aleph_0}$ that there is a complete theory in a language of size $\\aleph_1$ possessing a unique atomic model which is not constructible. Finally we show it is consistent with $ZFC + \\aleph_1 < 2^{\\aleph_0}$ that for every complete theory $T$ in a language of size $\\aleph_1$, if $T$ has uncountable atomic mod...

  12. Bianchi class A models in Sàez-Ballester's theory (United States)

    Socorro, J.; Espinoza-García, Abraham


    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi Class A models employing a particular ansatz for the wave function of the universe.

  13. A Dynamic Systems Theory Model of Visual Perception Development (United States)

    Coté, Carol A.


    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  14. Measurement Models for Reasoned Action Theory. (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin


    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  15. Validation of a measure of knowledge about human papillomavirus (HPV) using item response theory and classical test theory. (United States)

    Waller, Jo; Ostini, Remo; Marlow, Laura A V; McCaffery, Kirsten; Zimet, Gregory


    Public understanding of HPV is important to ensure informed participation in cervical cancer prevention programmes. While many studies have measured HPV knowledge, none has developed a validated measure for use across countries. We aimed to develop and validate such a measure. Items tapping knowledge of HPV, HPV testing and HPV vaccination were developed from previous literature and with expert consultation. The 29-item measure was administered via the internet to 2409 adults in the UK, US and Australia in 2011. Classical test theory and item response theory were used to establish the measure's psychometric properties. Total scale reliability was very good (α = 0.838), as was internal consistency for a 16-item general HPV knowledge subset (α = 0.849). Subsets of HPV testing and vaccination items showed reasonable test-retest reliability (r(test-retest) = 0.62 and 0.69) but moderate internal consistency (α = 0.52 and 0.56). Dimensionality analyses suggested that one item was not measuring the same construct as the remainder of the questionnaire. A 2-parameter logistic item response theory (IRT) model was fitted to the remaining 28 scale items. A structurally coherent set of items covering a range of important HPV knowledge was developed. Responses indicated a reliable questionnaire, which allowed the fitting of an IRT model. Copyright © 2012 Elsevier Inc. All rights reserved.
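    The 2-parameter logistic (2PL) model fitted to the 28 scale items can be sketched as follows; the item parameters below are hypothetical illustrations, not the estimates from this study.

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a
    respondent at knowledge level theta answers an item correctly,
    given item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item parameters: a higher discrimination a makes the
# response curve steeper around the difficulty b.
for theta in (-2.0, 0.0, 2.0):
    print(f"theta={theta:+.1f}  P(correct)={p_correct(theta, 1.5, 0.5):.3f}")
```

    At theta equal to the difficulty b, the model gives a 0.5 probability of a correct answer by construction.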

  16. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida


    An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network, because the properties of the IB model more closely matched those of the e-mail URV network than those of the other models.

  17. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario


    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develops the foundations of kinetic models and discusses the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  18. Mathematical Modelling and New Theories of Learning. (United States)

    Boaler, Jo


    Demonstrates the importance of expanding notions of learning beyond knowledge to the practices in mathematics classrooms. Considers a three-year study of students who learned through mathematical modeling. Shows that a modeling approach encouraged the development of a range of important practices in addition to knowledge that were useful in real…

  19. Baldrige Theory into Practice: A Generic Model (United States)

    Arif, Mohammed


    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  20. Optimal transportation networks models and theory

    CERN Document Server

    Bernot, Marc; Morel, Jean-Michel


    The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.

  1. The minimal clinically important difference determined using item response theory models: an attempt to solve the issue of the association with baseline score. (United States)

    Rouquette, Alexandra; Blanchin, Myriam; Sébille, Véronique; Guillemin, Francis; Côté, Sylvana M; Falissard, Bruno; Hardouin, Jean-Benoit


    Determining the minimal clinically important difference (MCID) of questionnaires on an interval scale, the trait level (TL) scale, using item response theory (IRT) models could overcome its association with baseline severity. The aim of this study was to compare the sensitivity (Se), specificity (Sp), and predictive values (PVs) of the MCID determined on the score scale (MCID-Sc) or the TL scale (MCID-TL). The MCID-Sc and MCID-TL of the MOS-SF36 general health subscale were determined for deterioration and improvement on a cohort of 1,170 patients using an anchor-based method and a partial credit model. The Se, Sp, and PV were calculated using the global rating of change (the anchor) as the gold standard test. The MCID-Sc magnitude was smaller for improvement (1.58 points) than for deterioration (-7.91 points). The Se, Sp, and PV were similar for MCID-Sc and MCID-TL in both cases. However, if the MCID was defined on the score scale as a function of a range of baseline scores, its Se, Sp, and PV were consistently higher. This study reinforces the recommendations concerning the use of an MCID-Sc defined as a function of a range of baseline scores. Copyright © 2014 Elsevier Inc. All rights reserved.
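    A minimal sketch of the anchor-based determination of the MCID on the score scale (MCID-Sc): the MCID is taken as the mean score change among patients whose global rating of change (the anchor) reports minimal improvement. The record layout and field names are hypothetical.

```python
def mcid_score_scale(records):
    """Anchor-based MCID-Sc: mean follow-up-minus-baseline score change
    in the subgroup the anchor classifies as minimally improved."""
    changes = [r["follow_up"] - r["baseline"]
               for r in records if r["anchor"] == "minimally improved"]
    return sum(changes) / len(changes)

# Hypothetical patient records:
patients = [
    {"baseline": 60, "follow_up": 62, "anchor": "minimally improved"},
    {"baseline": 55, "follow_up": 56, "anchor": "minimally improved"},
    {"baseline": 70, "follow_up": 70, "anchor": "unchanged"},
]
print(mcid_score_scale(patients))  # mean change among the minimally improved
```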

  2. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults. (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily


    The validity of the CAGE using item response theory (IRT) has not yet been examined in the older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA) and to fit dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed the overall scalability H index was 0.459, indicating a medium performing instrument. All items were found to be homogenous, measuring the same construct and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73, while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% (CC and CA ranged from 92.5% to 94.3%) of the respondents were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. This study provides valuable information on each item in the assessment of the overall severity of alcohol problems and the precision of the cut-off scores in the older adult population.

  3. Conformal Field Theory and its application to the Ising model (United States)

    Meyer, Joshua

    The two-dimensional Ising model was originally solved by Onsager using statistical physics techniques. More recently, it has been found that the derivation of critical exponents and correlation functions can be greatly simplified by using the methods of Conformal Field Theory (CFT). We review these methods and apply them to the two-dimensional Ising model. The connection between the continuum limit Ising model and the field theory of free fermions is explained, resulting in a CFT on the plane with two non-trivial fields. Through the use of bosonization on the plane, the free-field correlation functions of the model are computed.

  4. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene


    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  5. Homogeneous cosmological models in Yang's gravitation theory (United States)

    Fennelly, A. J.; Pavelle, R.


    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  6. Modeling workplace bullying using catastrophe theory. (United States)

    Escartin, J; Ceja, L; Navarro, J; Zapf, D


    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
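    The AICc/BIC model comparison used in the study can be sketched as follows; the log-likelihoods and parameter counts are hypothetical, chosen only to illustrate how the indices trade fit against model complexity.

```python
import math

def aic(log_lik, k):
    """Akaike information criterion (lower is better)."""
    return 2.0 * k - 2.0 * log_lik

def aicc(log_lik, k, n):
    """Small-sample corrected AIC."""
    return aic(log_lik, k) + 2.0 * k * (k + 1) / (n - k - 1)

def bic(log_lik, k, n):
    """Bayesian information criterion; penalizes parameters more heavily."""
    return k * math.log(n) - 2.0 * log_lik

# Hypothetical fits: the cusp model gains a little log-likelihood but
# spends extra parameters, so both indices here still favor the linear model.
n = 5000
for name, log_lik, k in (("linear", -6100.0, 4), ("cusp", -6099.0, 6)):
    print(f"{name:6s} AICc={aicc(log_lik, k, n):.1f} BIC={bic(log_lik, k, n):.1f}")
```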

  7. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory. (United States)

    Karabatsos, George


    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)
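    The Rasch model's kinship with additive conjoint measurement rests on its log-odds being additive in person ability and item easiness; a minimal sketch, with hypothetical person and item locations:

```python
import math

def rasch_p(theta, b):
    """Rasch model: P(correct) depends only on the difference theta - b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# The log-odds of success are additive in person ability theta and item
# easiness -b: logit(P) = theta - b, the structure additive conjoint
# measurement requires.
theta, b = 1.2, 0.4  # hypothetical locations
p = rasch_p(theta, b)
logit = math.log(p / (1.0 - p))
print(round(logit, 6), round(theta - b, 6))  # identical by construction
```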

  8. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos


    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game theoretical tools, highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding the facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  9. Contributions of modern measurement theory to measuring executive function in early childhood: An empirical demonstration. (United States)

    Willoughby, Michael T; Wirth, R J; Blair, Clancy B


    This study demonstrates the merits of evaluating a newly developed battery of executive function tasks, designed for use in early childhood, from the perspective of item response theory (IRT). The battery was included in the 48-month assessment of the Family Life Project, a prospective longitudinal study of 1292 children oversampled from low-income and African American families. IRT models were applied to a select set of tasks to demonstrate empirically (a) a principled method for item evaluation, including the utility of item characteristic curves; (b) how to explicitly test whether the measurement properties of executive function tasks are invariant across mutually exclusive subgroups of youths; (c) how the precision of measurement of a given task can vary according to underlying child ability; and (d) the utility of using IRT-based versus percentage correct scores. Results are discussed with respect to the importance of developing psychometrically sound and scalable instruments that facilitate the measurement of interindividual differences in intraindividual change of executive function across the early childhood period. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. Cascade Version 1: Theory and Model Formulation (United States)


    that provides this modeling framework, potentially allowing for an arbitrary number of scales. The coupling between coastal evolution at different...breakpoint. The two equations are written as follows: H_o^2 C_go cos θ_o = H_b^2 C_gb cos θ_b (7) and sin θ_o / C_o = sin θ_b / C_b (8), where H = wave height
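    Equations (7) and (8) can be sketched numerically, with (8) refracting the wave angle (Snell's law) and (7) conserving energy flux to give the breakpoint wave height; the celerity values below are hypothetical.

```python
import math

def transform_wave(H_o, theta_o_deg, C_o, C_go, C_b, C_gb):
    """Refract and shoal a wave from offshore (subscript o) to the
    breakpoint (subscript b): Snell's law, Eq. (8), then conservation
    of energy flux, Eq. (7)."""
    theta_o = math.radians(theta_o_deg)
    theta_b = math.asin(math.sin(theta_o) * C_b / C_o)    # Eq. (8)
    H_b = H_o * math.sqrt((C_go * math.cos(theta_o)) /
                          (C_gb * math.cos(theta_b)))      # Eq. (7)
    return H_b, math.degrees(theta_b)

# Hypothetical offshore/breakpoint celerities (m/s) and a 2 m offshore wave:
H_b, theta_b = transform_wave(H_o=2.0, theta_o_deg=20.0,
                              C_o=12.0, C_go=6.0, C_b=6.0, C_gb=5.5)
print(f"H_b={H_b:.2f} m, theta_b={theta_b:.1f} deg")
```

    The wave turns toward shore-normal (theta_b < theta_o) while the height adjusts to keep the energy flux balanced.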

  11. Density functional theory and multiscale materials modeling

    Indian Academy of Sciences (India)

    One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.

  12. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.


    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry...

  13. Applying learning theories and instructional design models for effective instruction. (United States)

    Khalil, Mohammed K; Elkhider, Ihsan A


    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. Copyright © 2016 The American Physiological Society.

  14. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva


    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g., cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  15. Multilevel Ventilation: Theory and Simplified Mathematical Model

    Directory of Open Access Journals (Sweden)

    P. Torok


    Considering the issues of artificial ventilation (AV) in non-homogenous pathological lung processes (acute lung injury, acute respiratory distress syndrome, pneumonia, etc.), the authors created a mathematical model of multicompartment non-homogenous injured lungs that were ventilated by a new mode of AV, the so-called three-level ventilation. Multilevel ventilation was defined as a type (modification) of AV whose basic ventilation level was produced by the modes CMV, PCV or PS (ASB) and whose add-on level, the so-called background ventilation, was generated by the levels of PEEP and high PEEP (PEEPh) with varying frequency and duration. Multi-level ventilation on 3 pressure levels was realized by the mathematical model as a combination of pressure-controlled ventilation (PCV) and two levels of PEEP and PEEPh. The objective was to prove that in cases of considerably non-homogenous gas distribution in acute pathological disorders of the lungs, gas entry into the so-called slow bronchoalveolar compartments could be improved by multilevel AV, without substantially changing the volume of the so-called fast compartments. Material and Method. Multi-level ventilation at 3 pressure levels was realized by the mathematical model as a combination of PCV and two levels of PEEP and PEEPh. Results. By comparing single-level AV in the PCV mode with the so-called three-level ventilation defined as a combination of PCV+PEEPh/PEEP, the authors have discovered that the loading of slow compartments in the model was considerably improved, by 50—60% as compared with the baseline values. In absolute terms, this difference was as much as 2—10 times the volume. Conclusion. The mathematical model may demonstrate that the application of the so-called three-level AV causes considerable changes in gas distribution in lung parenchyma disordered by a non-homogenous pathological process. The authors state that the proposed mathematical model requires clinical verification in order
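    The fast/slow compartment behavior described above can be sketched by treating each bronchoalveolar compartment as a first-order (resistance-compliance) unit; the time constants and inspiratory times below are hypothetical.

```python
import math

def uptake_fraction(t, tau):
    """Fraction of a compartment's capacity filled after time t at constant
    pressure, for a first-order unit with time constant tau = R * C."""
    return 1.0 - math.exp(-t / tau)

# Hypothetical time constants: a healthy "fast" unit vs an injured "slow" unit.
tau_fast, tau_slow = 0.2, 1.5  # seconds
for t_insp in (1.0, 3.0):      # short vs prolonged pressure plateau
    print(f"t={t_insp:.0f}s  fast={uptake_fraction(t_insp, tau_fast):.2f}  "
          f"slow={uptake_fraction(t_insp, tau_slow):.2f}")
```

    A prolonged high-pressure phase barely changes the already-saturated fast compartment but substantially increases filling of the slow one, which is the qualitative effect the three-level mode exploits.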

  16. From integrable models to gauge theories Festschrift Matinyan (Sergei G)

    CERN Document Server

    Gurzadyan, V G


    This collection of twenty articles in honor of the noted physicist and mentor Sergei Matinyan focuses on topics that are of fundamental importance to high-energy physics, field theory and cosmology. The topics range from integrable quantum field theories, three-dimensional Ising models, parton models and tests of the Standard Model, to black holes in loop quantum gravity, the cosmological constant and magnetic fields in cosmology. A pedagogical essay by Lev Okun concentrates on the problem of fundamental units. The articles have been written by well-known experts and are addressed to graduate

  17. Psychometric properties of the SDM-Q-9 questionnaire for shared decision-making in multiple sclerosis: item response theory modelling and confirmatory factor analysis. (United States)

    Ballesteros, Javier; Moral, Ester; Brieva, Luis; Ruiz-Beato, Elena; Prefasi, Daniel; Maurino, Jorge


    Shared decision-making is a cornerstone of patient-centred care. The 9-item Shared Decision-Making Questionnaire (SDM-Q-9) is a brief self-assessment tool for measuring patients' perceived level of involvement in decision-making related to their own treatment and care. Information related to the psychometric properties of the SDM-Q-9 for multiple sclerosis (MS) patients is limited. The objective of this study was to assess the performance of the items composing the SDM-Q-9 and its dimensional structure in patients with relapsing-remitting MS. A non-interventional, cross-sectional study in adult patients with relapsing-remitting MS was conducted in 17 MS units throughout Spain. A nonparametric item response theory (IRT) analysis was used to assess the latent construct and dimensional structure underlying the observed responses. A parametric IRT model, the General Partial Credit Model, was fitted to obtain estimates of the relationship between the latent construct and item characteristics. The unidimensionality of the SDM-Q-9 instrument was assessed by confirmatory factor analysis. A total of 221 patients were studied (mean age = 42.1 ± 9.9 years, 68.3% female). The median Expanded Disability Status Scale score was 2.5 ± 1.5. Most patients reported taking part in each step of the decision-making process. Internal reliability of the instrument was high (Cronbach's α = 0.91) and the overall scale scalability score was 0.57, indicative of a strong scale. All items, except for item 1, showed scalability indices higher than 0.30. Four items (items 6 through to 9) conveyed more than half of the SDM-Q-9 overall information (67.3%). The SDM-Q-9 was a good fit for a unidimensional latent structure (comparative fit index = 0.98, root-mean-square error of approximation = 0.07). All freely estimated parameters were statistically significant, with loadings above 0.40, with the exception of item 1, which presented the lowest loading (0.26). Items 6 through to 8 were the
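    A sketch of the Generalized Partial Credit Model category probabilities for one polytomous SDM-Q-9-style item; the discrimination and step parameters below are hypothetical, not the study's estimates.

```python
import math

def gpcm_probs(theta, a, steps):
    """Generalized Partial Credit Model: probability of each response
    category for a respondent at trait level theta, given item
    discrimination a and step (threshold) parameters."""
    cumulative, nums = 0.0, [1.0]  # category 0 has an empty sum, exp(0) = 1
    for b in steps:
        cumulative += a * (theta - b)
        nums.append(math.exp(cumulative))
    total = sum(nums)
    return [x / total for x in nums]

# Hypothetical item: 4 response categories, so 3 step parameters.
for theta in (-1.0, 0.0, 2.0):
    probs = gpcm_probs(theta, a=1.2, steps=[-1.0, 0.0, 1.0])
    print(f"theta={theta:+.1f}", [round(p, 3) for p in probs])
```

    As theta increases, the probability mass shifts toward the higher response categories, which is how the model links trait level to graded item responses.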

  18. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H


    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  19. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming


    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  20. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong


    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  1. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars


    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from, together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...

  2. Combining Theory Generation and Model Checking for Security Protocol Analysis, (United States)


    This paper reviews two relatively new tools for automated formal analysis of security protocols. One applies the formal methods technique of model checking to the task of protocol analysis, while the other utilizes the method of theory generation, which borrows from both model checking and...

  3. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing


    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
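    A sketch of the two-parameter (modified) Margules activity-coefficient model mentioned above; the A12 and A21 values are hypothetical, not parameters derived from the MD simulations.

```python
import math

def margules_activity_coeffs(x1, A12, A21):
    """Two-parameter Margules model: activity coefficients of a binary
    liquid mixture from the excess Gibbs energy expression
    G^E / RT = x1 * x2 * (A21 * x1 + A12 * x2)."""
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Hypothetical dimensionless parameters; at infinite dilution
# ln(gamma1) -> A12 and ln(gamma2) -> A21.
g1, g2 = margules_activity_coeffs(x1=0.3, A12=0.8, A21=0.5)
print(round(g1, 4), round(g2, 4))
```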

  4. [Study of dental model testing tool based on robot theory]. (United States)

    Hu, B; Song, Y; Cheng, L


    A new three-dimensional testing and analysing system for dental models is discussed. It is designed based on the motion theory of robots. The system is capable of not only measuring the three-dimensional sizes of dental models, but also saving and outputting the tested data. The construction of the system is briefly introduced here.

  5. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    Schoonenboom, J.; Sligte, H.; Moghnieh, A.; Specht, M.; Glahn, C.; Stefanov, K.; Navarrete, T.; Blat, J.


    This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged

  6. Clinical outcome measurement: Models, theory, psychometrics and practice. (United States)

    McClimans, Leah; Browne, John; Cano, Stefan

    In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term 'psychometrics'. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers. Copyright © 2017. Published by Elsevier Ltd.

  7. A Mixture IRT Analysis of Risky Youth Behavior

    Directory of Open Access Journals (Sweden)

    Holmes eFinch


    The study reported in this manuscript used a mixture item response model with data from the Youth Risk Behavior Survey 2009 (N = 16,410) to identify subtypes of adolescents at risk for engaging in unhealthy behaviors, and to find individual survey items that were most effective at identifying such students within each subtype. The goal of the manuscript is twofold: (1) to demonstrate the utility of the mixture item response theory model for identifying subgroups in the population and for highlighting the use of group-specific item response parameters, and (2) to identify typologies of adolescents based on their propensity for engaging in risky sexual and substance use behaviors. Results indicate that 4 classes of youth exist in the population, with differences in risky sexual behaviors and substance use. The first group had a greater propensity to engage in risky sexual behavior, while group 2 was more likely to smoke tobacco and drink alcohol. Group 3 was the most likely to use other substances, such as marijuana, methamphetamine, and other mind-altering drugs, and group 4 had the lowest propensity for engaging in any of the sexual or substance use behaviors included in the survey. Finally, individual items were identified for each group that can be most effective at identifying individuals at greatest risk. Further proposed directions of research and the contribution of this analysis to the existing literature are discussed.
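    The core idea of a mixture IRT analysis, class-specific item parameters, can be sketched by simulation: the same item behaves differently in each latent class. The classes and parameter values below are hypothetical, not the study's estimates.

```python
import math
import random

def p_yes(theta, a, b):
    """2PL endorsement probability for a respondent at trait level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# In a mixture IRT model each latent class carries its own item parameters,
# so one item can be easy to endorse in one class and hard in another.
class_params = {
    "class_1_sexual_risk": {"a": 1.8, "b": -0.5},  # item easily endorsed
    "class_2_substance":   {"a": 1.8, "b": 1.5},   # same item rarely endorsed
}

random.seed(0)
rates = {}
for cls, prm in class_params.items():
    endorsed = sum(
        p_yes(random.gauss(0.0, 1.0), prm["a"], prm["b"]) > random.random()
        for _ in range(10000)
    )
    rates[cls] = endorsed / 10000
    print(cls, rates[cls])
```

    Observed endorsement rates differ sharply between classes even though the trait distribution is the same, which is the signal the mixture model exploits to recover the subgroups.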

  8. Advances in cognitive theory and therapy: the generic cognitive model. (United States)

    Beck, Aaron T; Haigh, Emily A P


    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  9. Effective Biot theory and its generalization to poroviscoelastic models (United States)

    Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Greenhalgh, Mark


A method is suggested to express the effective bulk modulus of the solid frame of a poroelastic material as a function of the saturated bulk modulus. This method enables effective Biot theory to be described through the use of seismic dispersion measurements or other models developed for the effective saturated bulk modulus. The effective Biot theory is generalized to a poroviscoelastic model whose moduli are represented by the relaxation functions of the generalized fractional Zener model. The latter covers the general Zener and the Cole-Cole models as special cases. A global search method is described to determine the parameters of the relaxation functions, and a simple deterministic method is also developed to find the defining parameters of the single Cole-Cole model. These methods enable poroviscoelastic models to be constructed, which are based on measured seismic attenuation functions, and ensure that the model dispersion characteristics match the observations.

  10. Applications of Generalizability Theory and Their Relations to Classical Test Theory and Structural Equation Modeling. (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat


Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
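For the simplest one-facet person-by-item design, the overall reliability this abstract mentions reduces to a ratio of variance components. The sketch below uses the standard G-coefficient formula with made-up variance components, not values from the article:

```python
def generalizability_coefficient(var_person, var_residual, n_items):
    """G coefficient for a one-facet p x i design: universe-score variance
    divided by itself plus relative error variance (residual / n_items)."""
    return var_person / (var_person + var_residual / n_items)

# Hypothetical variance components: person variance 0.5, residual 1.0.
g10 = generalizability_coefficient(0.5, 1.0, n_items=10)
g20 = generalizability_coefficient(0.5, 1.0, n_items=20)
print(round(g10, 3), round(g20, 3))  # 0.833 0.909
```

This illustrates the "estimation of reliability and dependability for changes made to measurement procedures": doubling test length shrinks relative error and raises the coefficient.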

  11. Modeling molecular recognition: theory and application. (United States)

    Mardis, K; Luo, R; David, L; Potter, M; Glemza, A; Payne, G; Gilson, M K


Efficient, reliable methods for calculating the binding affinities of noncovalent complexes would allow advances in a variety of areas such as drug discovery and separation science. We have recently described a method that accommodates significant physical detail while remaining fast enough for use in molecular design. This approach uses the predominant states method to compute free energies, an empirical force field, and an implicit solvation model based upon continuum electrostatics. We review applications of this method to systems ranging from small molecules to protein-ligand complexes.

  12. Genetic model compensation: Theory and applications (United States)

    Cruickshank, David Raymond


    The adaptive filtering algorithm known as Genetic Model Compensation (GMC) was originally presented in the author's Master's Thesis. The current work extends this earlier work. GMC uses a genetic algorithm to optimize filter process noise parameters in parallel with the estimation of the state and based only on the observational information available to the filter. The original stochastic state model underlying GMC was inherited from the antecedent, non-adaptive Dynamic Model Compensation (DMC) algorithm. The current work develops the stochastic state model from a linear system viewpoint, avoiding the simplifications and approximations of the earlier development, and establishes Riemann sums as unbiased estimators of the stochastic integrals which describe the evolution of the random state components. These are significant developments which provide GMC with a solid theoretical foundation. Orbit determination is the area of application in this work, and two types of problems are studied: real-time autonomous filtering using absolute GPS measurements and precise post-processed filtering using differential GPS measurements. The first type is studied in a satellite navigation simulation in which pseudorange and pseudorange rate measurements are processed by an Extended Kalman Filter which incorporates both DMC and GMC. Both estimators are initialized by a geometric point solution algorithm. Using measurements corrupted by simulated Selective Availability errors, GMC reduces mean RSS position error by 6.4 percent, reduces mean clock bias error by 46 percent, and displays a marked improvement in covariance consistency relative to DMC. To study the second type of problem, GMC is integrated with NASA Jet Propulsion Laboratory's Gipsy/Oasis-II (GOA-II) precision orbit determination program creating an adaptive version of GOA-II's Reduced Dynamic Tracking (RDT) process noise formulation. 
When run as a sequential estimator with GPS measurements from the TOPEX satellite and

  13. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory (United States)

    Hannah, David R.; Venkatachary, Ranga


    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  14. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory. (United States)

    Gopnik, Alison; Wellman, Henry M


    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  15. Localization landscape theory of disorder in semiconductors I: Theory and modeling


    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana


We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called the 'localization landscape'. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densi...
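For context, the landscape function in this theory (stated here in the standard Filoche-Mayboroda formulation from the general literature, not taken from this abstract) is defined as the solution of

```latex
\hat{H}\,u = 1, \qquad \hat{H} = -\Delta + V(\mathbf{r}),
```

and every eigenstate $\hat{H}\psi = E\psi$ obeys the pointwise bound $|\psi(\mathbf{r})| \le E\,u(\mathbf{r})\,\|\psi\|_{\infty}$, so the valleys of the effective potential $1/u$ delimit the localization regions of the carriers.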

  16. M-theory model-building and proton stability

    Energy Technology Data Exchange (ETDEWEB)

Ellis, J. [CERN, Geneva (Switzerland). Theory Div.]; Faraggi, A.E. [Florida Univ., Gainesville, FL (United States). Inst. for Fundamental Theory]; Nanopoulos, D.V. [Texas A and M Univ., College Station, TX (United States); Houston Advanced Research Center, The Woodlands, TX (United States). Astroparticle Physics Group; Academy of Athens (Greece). Div. of Natural Sciences]


The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  17. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.


We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  18. Relating Unidimensional IRT Parameters to a Multidimensional Response Space: A Review of Two Alternative Projection IRT Models for Scoring Subscales (United States)

    Kahraman, Nilufer; Thompson, Tony


    A practical concern for many existing tests is that subscore test lengths are too short to provide reliable and meaningful measurement. A possible method of improving the subscale reliability and validity would be to make use of collateral information provided by items from other subscales of the same test. To this end, the purpose of this article…

  19. Making sense of implementation theories, models and frameworks. (United States)

    Nilsen, Per


    Implementation science has progressed towards increased use of theoretical approaches to provide better understanding and explanation of how and why implementation succeeds or fails. The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection and application of relevant approaches in implementation research and practice and to foster cross-disciplinary dialogue among implementation researchers. Theoretical approaches used in implementation science have three overarching aims: describing and/or guiding the process of translating research into practice (process models); understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, implementation theories); and evaluating implementation (evaluation frameworks). This article proposes five categories of theoretical approaches to achieve three overarching aims. These categories are not always recognized as separate types of approaches in the literature. While there is overlap between some of the theories, models and frameworks, awareness of the differences is important to facilitate the selection of relevant approaches. Most determinant frameworks provide limited "how-to" support for carrying out implementation endeavours since the determinants usually are too generic to provide sufficient detail for guiding an implementation process. And while the relevance of addressing barriers and enablers to translating research into practice is mentioned in many process models, these models do not identify or systematically structure specific determinants associated with implementation success. Furthermore, process models recognize a temporal sequence of implementation endeavours, whereas determinant frameworks do not explicitly take a process perspective of implementation.

  20. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H


    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  1. Theory, modeling and simulation of superconducting qubits

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU


We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field, which includes the resonator-drive, resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz, and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier of frequency about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  2. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan


This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, computer, and communication systems. • A chapter on ...
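As a flavor of the basic models such a course builds on, the textbook M/M/1 queue has closed-form steady-state metrics. The sketch below uses the standard formulas from the general queueing literature, not material specific to this book:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics for the M/M/1 queue (Poisson arrivals,
    exponential service, one server). Requires utilization rho < 1."""
    rho = arrival_rate / service_rate          # server utilization
    if rho >= 1:
        raise ValueError("unstable queue: utilization must be < 1")
    L = rho / (1 - rho)                        # mean number in system
    W = 1 / (service_rate - arrival_rate)      # mean time in system
    return rho, L, W

rho, L, W = mm1_metrics(arrival_rate=2.0, service_rate=4.0)
print(rho, L, W)  # 0.5 1.0 0.5 -- note L = arrival_rate * W (Little's law)
```
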

  3. Traffic Games: Modeling Freeway Traffic with Game Theory. (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R


    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
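The driver-driver interaction described above can be caricatured as a two-player game over lane-change moves. The payoff matrix and values below are a hypothetical illustration of this kind of game, not the model actually used in the paper:

```python
# Hypothetical payoffs for a two-driver lane-change interaction:
# each driver either yields (cooperates) or pushes in (defects).
PAYOFFS = {
    ("yield", "yield"): (3, 3),  # both keep moderate speed
    ("yield", "push"):  (1, 4),  # pusher gains, yielder brakes
    ("push",  "yield"): (4, 1),
    ("push",  "push"):  (0, 0),  # mutual blocking: flow breaks down
}

def best_response(opponent_action):
    """Action maximizing a driver's own payoff against a fixed opponent move."""
    return max(("yield", "push"),
               key=lambda a: PAYOFFS[(a, opponent_action)][0])

print(best_response("yield"))  # push
print(best_response("push"))   # yield
```

With these payoffs the game is anti-coordination (chicken-like): the best reply to a yielding driver is to push in, and the best reply to a pushing driver is to yield, which is the kind of strategic tension whose evolution the simulations track.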

  4. Theory of positive disintegration as a model of adolescent development. (United States)

    Laycraft, Krystyna


This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost half a century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent the behaviour of the early, middle, and late periods of adolescence. The discussion includes recent research on adolescent brain development.

  5. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems. (United States)

    Tsai, Chung-Hung


    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  6. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems (United States)

    Tsai, Chung-Hung


    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  7. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai


Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  8. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory (United States)

    Gopnik, Alison; Wellman, Henry M.


    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  9. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures (United States)

    Atar, Burcu; Kamata, Akihito


    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…
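The logistic regression DIF procedure compared here conditions an item response on total score and group membership; a non-negligible group coefficient after conditioning signals uniform DIF. The sketch below fits such a model by plain gradient ascent on made-up data (illustrative only; operational studies use ML estimation with a likelihood ratio or Wald test, and the cumulative logit extension for polytomous items):

```python
import math

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp for numerical safety
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.05, epochs=5000):
    """Logistic regression via gradient ascent; returns [intercept, coefs...]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grads = [0.0] * len(w)
        for xi, yi in zip(X, y):
            row = [1.0] + xi
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, row)))
            for j, xj in enumerate(row):
                grads[j] += (yi - p) * xj
        w = [wj + lr * g / len(y) for wj, g in zip(w, grads)]
    return w

# Predictors per examinee: [total score, group], with group 1 the focal
# group. Hypothetical data in which the focal group does worse on this
# item even after conditioning on total score -> uniform DIF.
X = [[3, 0], [5, 0], [7, 0], [9, 0], [3, 1], [5, 1], [7, 1], [9, 1]]
y = [0, 1, 1, 1, 0, 0, 1, 1]
intercept, b_score, b_group = fit_logistic(X, y)
print(b_score > 0, b_group < 0)  # True True
```

The negative group coefficient is the uniform-DIF signal; the IRT likelihood ratio approach tests the same hypothesis by comparing nested IRT models with and without group-specific item parameters.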

  10. Post-Hoc IRT Equating of Previously Administered English Tests for Comparison of Test Scores (United States)

    Saida, Chisato; Hattori, Tamaki


    Despite growing concerns about declining scholastic abilities of Japanese students throughout Japan prior to the implementation of the revised Courses of Study in 2002, little empirical evidence was available at that time to support this perceived decline in academic performance. This research describes post-hoc IRT equating of previously…

  11. Theory, modeling, and integrated studies in the Arase (ERG) project (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa


Understanding the underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve this goal, the ERG project consists of three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of the theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, the GEMSIS-RB and RBW models, the CIMI model with the global MHD simulation REPPU, the GEMSIS-RC model, the plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.

  12. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter


In this paper, we deploy complexity theory as the foundation for integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.

  13. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.


This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest

  14. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...

  15. Magnetized cosmological models in bimetric theory of gravitation

    Indian Academy of Sciences (India)

    Abstract. A Bianchi type-III magnetized cosmological model, in which the field of gravitation is governed by either a perfect fluid or a cosmic string, is investigated in Rosen's [1] bimetric theory of gravitation. To obtain a determinate solution, the condition A = (BC)^n, where n is a constant, between the metric potentials is used.

  16. Anisotropic cosmological models in f (R, T) theory of gravitation

    Indian Academy of Sciences (India)

    ... a cosmologically viable f(R) gravity model, which showed the unification of early-time inflation and late-time acceleration. Harko et al [13] developed f(R, T) modified theory of gravity, where the gravitational Lagrangian is given by an arbitrary function of the Ricci scalar R and the trace T of the energy–momentum tensor. It is to be noted ...

  17. Teaching Model Building to High School Students: Theory and Reality. (United States)

    Roberts, Nancy; Barclay, Tim


    Builds on a National Science Foundation (NSF) microcomputer-based laboratory project to introduce system dynamics into the precollege setting. Focuses on providing students with powerful investigatory and theory-building tools. Discusses the hardware, software, and curriculum materials developed to introduce model building and simulations into…

  18. A Model to Demonstrate the Place Theory of Hearing (United States)

    Ganesh, Gnanasenthil; Srinivasan, Venkata Subramanian; Krishnamurthi, Sarayu


    In this brief article, the authors discuss Georg von Békésy's experiments showing the existence of traveling waves in the basilar membrane and that maximal displacement of the traveling wave was determined by the frequency of the sound. The place theory of hearing equates the basilar membrane to a frequency analyzer. The model described in this…

  19. Multilevel Higher-Order Item Response Theory Models (United States)

    Huang, Hung-Yu; Wang, Wen-Chung


    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  20. SIMP model at NNLO in chiral perturbation theory

    DEFF Research Database (Denmark)

    Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.


    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles...

  1. Speech act theory in support of idealised warning models | Carstens ...

    African Journals Online (AJOL)

    ... subsuming lower level speech acts such as POINTING OUT/ALERTING, INFORMING and INSTRUCTING. Secondly, the model is used to analyse and evaluate actual warnings collected from information sheets for hair-dryers, indicating the heuristic value of combined insights from document design and speech act theory ...

  2. A Proposed Model of Jazz Theory Knowledge Acquisition (United States)

    Ciorba, Charles R.; Russell, Brian E.


    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  3. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness (United States)

    Miller, Angie L.


    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  4. Dimensions of Genocide: The Circumplex Model Meets Violentization Theory (United States)

    Winton, Mark A.


    The purpose of this study is to examine the use of Olson's (1995, 2000) family therapy based circumplex model and Athens' (1992, 1997, 2003) violentization theory in explaining genocide. The Rwandan genocide of 1994 is used as a case study. Published texts, including interviews with perpetrators, research reports, human rights reports, and court…

  5. Pilot evaluation in TENCompetence: a theory-driven model


    Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen


    Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong Competence Development Infrastructures' (pp. 43-50). June, 21-22, 2007, Barcelona, Spain.

  6. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen


    Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong

  7. The application of item response theory in developing and validating a shortened version of the Emirate Marital Satisfaction Scale. (United States)

    Dodeen, Hamzeh; Al-Darmaki, Fatima


    The aim of this study was to determine the feasibility of generating a shorter version of the Emirati Marital Satisfaction Scale (EMSS) using item response theory (IRT)-based methodology. The EMSS is the first national scale used to provide an understanding of family function and the level of marital satisfaction within the cultural context of the United Arab Emirates. A sample of 1,049 Emirati married individuals of different ages, genders, places of residence, and monthly incomes participated in this study. IRT calibration was performed using X-Calibre 4.2 under the graded response model. The analysis yielded a short form of the EMSS (7 items), which constitutes a promising alternative to the original scale for practitioners and researchers. This short version is reliable and valid, and it gives results very similar to those of the original scale. The results of this study confirmed the usefulness of IRT-based methodology for developing psychological and counseling scales. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
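As a sketch of the graded response model used in this calibration, the probability of each ordered response category can be computed as the difference of adjacent cumulative logistic curves (the item parameters below are illustrative, not the published EMSS estimates):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima graded response model: probabilities of the ordered
    response categories for a respondent at trait level theta.
    `a` is the item discrimination; `thresholds` are the ordered
    category boundary difficulties b_1 < ... < b_{K-1}."""
    def p_star(b):
        # Cumulative probability of responding in this category or above.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))
    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    # Category probability = difference of adjacent cumulative curves.
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# Hypothetical 5-category Likert-type item:
probs = grm_category_probs(theta=0.5, a=1.8, thresholds=[-1.5, -0.5, 0.4, 1.3])
```

The K category probabilities always sum to one by construction, which is why the GRM only needs K-1 boundary curves per item.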

  8. A Psychometric Analysis of the Italian Version of the eHealth Literacy Scale Using Item Response and Classical Test Theory Methods. (United States)

    Diviani, Nicola; Dima, Alexandra Lelia; Schulz, Peter Johannes


    The eHealth Literacy Scale (eHEALS) is a tool to assess consumers' comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers' eHealth literacy.

  9. Excellence in Physics Education Award: Modeling Theory for Physics Instruction (United States)

    Hestenes, David


    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  10. Alliance: A common factor of psychotherapy modeled by structural theory

    Directory of Open Access Journals (Sweden)

    Wolfgang eTschacher


    Full Text Available There is broad consensus that the therapeutic alliance constitutes a core common factor for all modalities of psychotherapy. Meta-analyses have corroborated that alliance, as it emerges from the therapeutic process, is a significant predictor of therapy outcome. Psychotherapy process is traditionally described and explored using two categorically different approaches, the experiential (first-person) perspective and the behavioral (third-person) perspective. We propose to add to this duality a third, structural approach. Dynamical systems theory and synergetics on the one hand, and enactivist theory on the other, can together provide this structural approach, which contributes in specific ways to a clarification of the alliance factor. Systems theory offers concepts and tools for the modeling of the individual self and, building on this, of alliance processes. In the enactive perspective, the self is conceived as a socially enacted autonomous system that strives to maintain identity by observing a two-fold goal: to exist as an individual self in its own right (distinction) while also being open to others (participation). Using this conceptualization, we formalized the therapeutic alliance as a phase space whose potential minima (attractors) can be shifted by the therapist to approximate therapy goals. This mathematical formalization is derived from probability theory and synergetics. We conclude that structural theory provides powerful tools for modeling how therapeutic change is staged by the formation, utilization, and dissolution of the therapeutic alliance. In addition, we point out novel testable hypotheses and future applications.

  11. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang


    Full Text Available The traffic behaviour of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers, but because this information lacks timeliness and comprehensiveness, it cannot fully satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with actual conditions, this paper proposes a route choice model based on Game Theory that provides reliable route choices to commuters in real situations. The proposed model treats the alternative routes as game players and uses the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route is generated by solving the route choice game for a Nash Equilibrium. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
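The core solution step, finding a Nash Equilibrium of the route choice game, can be illustrated with a generic pure-strategy equilibrium search over a small bimatrix game (the payoff values below are hypothetical, not taken from the paper):

```python
from itertools import product

def pure_nash(payoff_a, payoff_b):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game.
    payoff_a[i][j] / payoff_b[i][j] are the payoffs to players A and B
    when A plays strategy i and B plays strategy j."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i, j in product(range(rows), range(cols)):
        # (i, j) is an equilibrium iff i is a best response to j and vice versa.
        best_i = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
        best_j = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
        if best_i and best_j:
            equilibria.append((i, j))
    return equilibria

# Hypothetical payoffs for two players each choosing between two routes:
A = [[3, 0], [5, 1]]
B = [[3, 5], [0, 1]]
eq = pure_nash(A, B)  # unique equilibrium at strategies (1, 1)
```

Best-response enumeration scales only to small strategy sets, which suits a commuter choosing among a handful of alternative routes.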

  12. Accounting for Errors in Model Analysis Theory: A Numerical Approach (United States)

    Sommer, Steven R.; Lindell, Rebecca S.


    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
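A minimal sketch of the Model Analysis Theory pipeline described above: build the class density matrix from per-student response patterns and extract its dominant eigenvalue by power iteration, assuming the standard square-root state-vector construction (the response counts are invented for illustration):

```python
import math

def density_matrix(counts):
    """Build the class density matrix from each student's counts of
    answers consistent with each candidate mental model. counts[k]
    lists, for student k, how many responses matched model 1, model 2,
    ... The student state vector has components sqrt(n_m / n)."""
    n_models = len(counts[0])
    D = [[0.0] * n_models for _ in range(n_models)]
    for c in counts:
        total = sum(c)
        u = [math.sqrt(x / total) for x in c]
        for i in range(n_models):
            for j in range(n_models):
                D[i][j] += u[i] * u[j] / len(counts)
    return D

def dominant_eigenvalue(D, iters=200):
    """Power iteration for the largest eigenvalue of the (symmetric,
    positive semi-definite) density matrix; a value near 1 indicates
    the class consistently uses a single dominant mental model."""
    n = len(D)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(D[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return sum(v[i] * sum(D[i][j] * v[j] for j in range(n)) for i in range(n))

# Hypothetical class of three students answering 10 questions scored
# against two candidate mental models:
D = density_matrix([[8, 2], [7, 3], [9, 1]])
lam = dominant_eigenvalue(D)
```

Since each student state vector is normalized, the density matrix has unit trace, so its eigenvalues lie in [0, 1] and can be read directly as consistency measures.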

  13. Evaluation of MIMIC-Model Methods for DIF Testing With Comparison to Two-Group Analysis. (United States)

    Woods, Carol M


    Differential item functioning (DIF) occurs when an item on a test or questionnaire has different measurement properties for 1 group of people versus another, irrespective of mean differences on the construct. This study focuses on the use of multiple-indicator multiple-cause (MIMIC) structural equation models for DIF testing, parameterized as item response models. The accuracy of these methods, and the sample size requirements, are not well established. This study examines the accuracy of MIMIC methods for DIF testing when the focal group is small and compares results with those obtained using 2-group item response theory (IRT). Results support the utility of the MIMIC approach. With small focal-group samples, tests of uniform DIF with binary or 5-category ordinal responses were more accurate with MIMIC models than 2-group IRT. Recommendations are offered for the application of MIMIC methods for DIF testing.
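The notion of uniform DIF tested here can be sketched directly from the definition: the same logistic item response curve with a group-specific difficulty shift gives one group a lower success probability at every ability level (the parameter values are illustrative only, not results from the study):

```python
import math

def p_2pl(theta, a, b):
    """Two-parameter logistic item response function."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Uniform DIF: the item is uniformly harder for the focal group, i.e.
# its difficulty b is shifted while the discrimination a stays the same.
a, b_ref, dif_shift = 1.2, 0.0, 0.5
for theta in (-2.0, 0.0, 2.0):
    p_ref = p_2pl(theta, a, b_ref)
    p_focal = p_2pl(theta, a, b_ref + dif_shift)
    assert p_focal < p_ref  # same ability, lower success probability
```

Nonuniform DIF would instead shift the discrimination a, making the group difference change sign across the ability range.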

  14. Cluster variational theory of spin-3/2 Ising models

    CERN Document Server

    Tucker, J W


    A cluster variational method for spin-3/2 Ising models on regular lattices is presented that leads to results that are exact for Bethe lattices of the same coordination number. The method is applied to both the Blume-Capel (BC) and the isotropic Blume-Emery-Griffiths model (BEG). In particular, the first-order phase line separating the two low-temperature ferromagnetic phases in the BC model, and the ferrimagnetic phase boundary in the BEG model are studied. Results are compared with those of other theories whose qualitative predictions have been in conflict.

  15. Educational measurement for applied researchers theory into practice

    CERN Document Server

    Wu, Margaret; Jen, Tsung-Hau


    This book is a valuable read for a diverse group of researchers and practitioners who analyze assessment data and construct test instruments. It focuses on the use of classical test theory (CTT) and item response theory (IRT), which are often required in the fields of psychology (e.g. for measuring psychological traits), health (e.g. for measuring the severity of disorders), and education (e.g. for measuring student performance), and makes these analytical tools accessible to a broader audience. Having taught assessment subjects to students from diverse backgrounds for a number of years, the three authors have a wealth of experience in presenting educational measurement topics, in-depth concepts and applications in an accessible format. As such, the book addresses the needs of readers who use CTT and IRT in their work but do not necessarily have an extensive mathematical background. The book also sheds light on common misconceptions in applying measurement models, and presents an integrated approach to differ...

  16. On ADE quiver models and F-theory compactification

    Energy Technology Data Exchange (ETDEWEB)

    Belhaj, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Rasmussen, J [Department of Mathematics and Statistics, University of Melbourne, Parkville, Victoria 3010 (Australia); Sebbar, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Sedra, M B [Laboratoire de Physique de la Matiere et Rayonnement (LPMR), Faculte des Sciences, Universite Ibn Tofail, Kenitra (Morocco)]


    Based on mirror symmetry, we discuss geometric engineering of N = 1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibred over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.

  17. Should the model for risk-informed regulation be game theory rather than decision theory? (United States)

    Bier, Vicki M; Lin, Shi-Woei


    deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.

  18. Performance of the biological rhythms interview for assessment in neuropsychiatry: An item response theory and actigraphy analysis. (United States)

    Allega, Olivia R; Leng, Xiamin; Vaccarino, Anthony; Skelly, Matthew; Lanzini, Mariana; Hidalgo, Maria Paz; Soares, Claudio N; Kennedy, Sidney H; Frey, Benicio N


    Biological rhythm disturbances are widely associated with the pathophysiology of mood disorders. The Biological Rhythms Interview for Assessment in Neuropsychiatry (BRIAN) is a self-report measure that indexes rhythm disturbance in sleep, activity, social and eating patterns. The aim of this study was to perform an Item Response Theory (IRT) analysis of the BRIAN and to investigate its associations with objective sleep and rhythm disturbance measures. A total of 103 subjects (31 bipolar, 32 major depression and 40 healthy volunteers) wore an actiwatch for fifteen days, provided a first-morning urine sample, and completed the BRIAN on day 15. IRT analysis assessed individual BRIAN items and their relationship to the total score. Individual actiwatch records were processed to produce a sequence of transitions between rest and activity, and the likelihood of transitioning between states was calculated to investigate sleep-wake dynamics. Cosinor analysis produced daily activity rhythms (DARs). Spearman correlations were used to assess the association between sleep/DAR variables and the BRIAN. IRT analyses showed that 11 of 18 BRIAN items displayed a high level of discrimination between item options across a range of BRIAN total scores. Total BRIAN score correlated with wake after sleep onset, total activity count during sleep, and urinary 6-sulphatoxymelatonin. The BRIAN Activity domain correlated with the daytime transition probability from rest to activity. The sample size may have been underpowered for the graded-response model employed in the IRT analysis, and the study lacked an objective comparison for the BRIAN eating and social domains. The present study reveals that the BRIAN displays promising external validity compared to objective parameters of circadian rhythmicity. Copyright © 2017 Elsevier B.V. All rights reserved.
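The cosinor step can be sketched as a single-component fit. For evenly spaced samples covering whole periods, the discrete Fourier coefficients below coincide with the least-squares cosinor solution (the activity series is synthetic, not study data):

```python
import math

def cosinor(times_h, values, period_h=24.0):
    """Single-component cosinor fit y(t) = M + A*cos(w*(t - t_acro)).
    For evenly spaced samples covering whole periods, these discrete
    Fourier coefficients equal the least-squares estimates.
    Returns (MESOR, amplitude, acrophase in hours)."""
    w = 2.0 * math.pi / period_h
    n = len(values)
    mesor = sum(values) / n                      # rhythm-adjusted mean
    a = 2.0 / n * sum(y * math.cos(w * t) for t, y in zip(times_h, values))
    b = 2.0 / n * sum(y * math.sin(w * t) for t, y in zip(times_h, values))
    amplitude = math.hypot(a, b)
    acrophase = (math.atan2(b, a) / w) % period_h  # clock time of the peak
    return mesor, amplitude, acrophase

# Hypothetical actigraphy-like series: 15 days of hourly activity counts
# oscillating around 50 with amplitude 20, peaking at 14:00.
ts = [float(h) for h in range(15 * 24)]
ys = [50 + 20 * math.cos(2 * math.pi / 24 * (t - 14.0)) for t in ts]
m, amp, acro = cosinor(ts, ys)
```

For noisy or unevenly sampled records, the same model is normally fit by general least squares on the cosine and sine regressors instead.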

  19. Finite Element and Plate Theory Modeling of Acoustic Emission Waveforms (United States)

    Prosser, W. H.; Hamstad, M. A.; Gary, J.; O'Gallagher, A.


    A comparison was made between two approaches to predict acoustic emission waveforms in thin plates. A normal mode solution method for Mindlin plate theory was used to predict the response of the flexural plate mode to a point source, step-function load, applied on the plate surface. The second approach used a dynamic finite element method to model the problem using equations of motion based on exact linear elasticity. Calculations were made using properties for both isotropic (aluminum) and anisotropic (unidirectional graphite/epoxy composite) materials. For simulations of anisotropic plates, propagation along multiple directions was evaluated. In general, agreement between the two theoretical approaches was good. Discrepancies in the waveforms at longer times were caused by differences in reflections from the lateral plate boundaries. These differences resulted from the fact that the two methods used different boundary conditions. At shorter times in the signals, before reflections, the slight discrepancies in the waveforms were attributed to limitations of Mindlin plate theory, which is an approximate plate theory. The advantages of the finite element method are that it used the exact linear elasticity solutions, and that it can be used to model real source conditions and complicated, finite specimen geometries as well as thick plates. These advantages come at a cost of increased computational difficulty, requiring lengthy calculations on workstations or supercomputers. The Mindlin plate theory solutions, meanwhile, can be quickly generated on personal computers. Specimens with finite geometry can also be modeled. However, only limited simple geometries such as circular or rectangular plates can easily be accommodated with the normal mode solution technique. Likewise, very limited source configurations can be modeled and plate theory is applicable only to thin plates.

  20. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    Directory of Open Access Journals (Sweden)

    Hugo U. R. Strand


    Full Text Available We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium “phase diagrams” that map out the different dynamical regimes.

  1. Super Yang-Mills theory as a random matrix model

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, W. [Institute for Theoretical Physics, State University of New York, Stony Brook, New York 11794-3840 (United States)


    We generalize the Gervais-Neveu gauge to four-dimensional N=1 superspace. The model describes an N=2 super Yang-Mills theory. All chiral superfields (N=2 matter and ghost multiplets) exactly cancel to all loops. The remaining Hermitian scalar superfield (matrix) has a renormalizable massive propagator and simplified vertices. These properties are associated with N=1 supergraphs describing a superstring theory on a random lattice world sheet. We also consider all possible finite matrix models, and find they have a universal large-color limit. These could describe gravitational strings if the matrix-model coupling is fixed to unity, for exact electric-magnetic self-duality.

  2. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H


    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  3. Localization landscape theory of disorder in semiconductors. I. Theory and modeling (United States)

    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana


    We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called localization landscape. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densities of states. We show how the various outputs of these landscapes can be directly implemented into a drift-diffusion model of carrier transport and into the calculation of absorption/emission transitions. This creates a new computational model which accounts for disorder localization effects while also capturing two major effects of quantum mechanics, namely, the reduction of barrier height (tunneling effect) and the raising of energy ground states (quantum confinement effect), without having to solve the Schrödinger equation. Finally, this model is applied to several one-dimensional structures such as single quantum wells, ordered and disordered superlattices, or multiquantum wells, where comparisons with exact Schrödinger calculations demonstrate the excellent accuracy of the approximation provided by the landscape theory.
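The landscape function itself is inexpensive to obtain: a minimal 1-D sketch solves the landscape equation (-u'' + V u = 1) with zero boundary conditions using a tridiagonal (Thomas-algorithm) solver and reads off the effective potential W = 1/u (the disorder potential below is a toy example, not one of the paper's structures):

```python
def landscape_1d(V, dx):
    """Solve the 1-D localization landscape equation (-u'' + V u = 1)
    with zero boundary conditions, using the Thomas algorithm on the
    second-order finite-difference discretization. The effective
    potential W = 1/u predicts where low-energy carriers localize."""
    n = len(V)
    a = [-1.0 / dx**2] * n            # sub-diagonal
    b = [2.0 / dx**2 + v for v in V]  # diagonal
    c = [-1.0 / dx**2] * n            # super-diagonal
    d = [1.0] * n                     # right-hand side
    # Forward elimination.
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution.
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

# Hypothetical disordered potential: a low well between two high barriers.
V = [30.0] * 20 + [0.0] * 10 + [30.0] * 20
u = landscape_1d(V, dx=0.1)
W = [1.0 / ui for ui in u]  # effective confining potential
```

The landscape peaks (minima of W) mark the localization regions; here the maximum of u falls inside the low-potential well, as expected.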

  4. Integration of mathematical models in marketing theory and practice

    Directory of Open Access Journals (Sweden)

    Ioana Olariu


    Full Text Available This article is a theoretical approach to the main mathematical models used in marketing practice. Applying general systems theory in marketing involves setting behavioral assumptions as models of various processes. These models have, on the one hand, to describe the interactions between environmental and system factors and, on the other, to identify the causal dependencies existing in these interactions. Since models are the means by which consequences can be drawn from possible solutions, they occupy a central role in the design of a system to solve a marketing problem. A model is a simplified representation by which phenomena and real-life situations are described and conceptualized. The purpose of a model is to facilitate understanding of the real system. Models are widely used in marketing, and they take different forms that facilitate understanding the realities of marketing.

  5. Estimation of a four-parameter item response theory model. (United States)

    Loken, Eric; Rulison, Kelly L


    We explore the justification and formulation of a four-parameter item response theory model (4PM) and employ a Bayesian approach to successfully recover parameter estimates for items and respondents. For data generated using a 4PM item response model, overall fit is improved when using the 4PM rather than the 3PM or the 2PM. Furthermore, although estimated trait scores under the various models correlate almost perfectly, inferences at the high and low ends of the trait continuum are compromised, with poorer coverage of the confidence intervals when the wrong model is used. We also show in an empirical example that the 4PM can yield new insights into the properties of a widely used delinquency scale. We discuss the implications for building appropriate measurement models in education and psychology to model the underlying response process more accurately.
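The 4PM response function is the familiar logistic curve with both a lower asymptote c (guessing) and an upper asymptote d < 1 (slipping); setting d = 1 recovers the 3PM, and c = 0 with d = 1 the 2PM. A minimal sketch with illustrative parameters:

```python
import math

def irt_prob(theta, a, b, c=0.0, d=1.0):
    """Four-parameter logistic IRT model (4PM): probability of a correct
    response at trait level theta, with discrimination a, difficulty b,
    lower asymptote c (guessing) and upper asymptote d (slipping)."""
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))

# For a high-ability respondent, the upper asymptote caps the probability:
p3 = irt_prob(theta=4.0, a=1.5, b=0.0, c=0.2, d=1.0)   # 3PM (d = 1)
p4 = irt_prob(theta=4.0, a=1.5, b=0.0, c=0.2, d=0.95)  # 4PM with slipping
```

This cap is exactly why inferences at the high end of the trait continuum change when the 4PM, rather than the 3PM, generated the data.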

  6. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model


    Oliveira, Arnaldo


    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  7. Spatially random models, estimation theory, and robot arm dynamics (United States)

    Rodriguez, G.


    Spatially random models provide an alternative to the more traditional deterministic models used to describe robot arm dynamics. These alternative models can be used to establish a relationship between the methodologies of estimation theory and robot dynamics. A new class of algorithms for many of the fundamental robotics problems of inverse and forward dynamics, inverse kinematics, etc. can be developed that use computations typical in estimation theory. The algorithms make extensive use of the difference equations of Kalman filtering and Bryson-Frazier smoothing to conduct spatial recursions. The spatially random models are very easy to describe and are based on the assumption that all of the inertial (D'Alembert) forces in the system are represented by a spatially distributed white-noise model. The models can also be used to generate numerically the composite multibody system inertia matrix. This is done without resorting to the more common methods of deterministic modeling involving Lagrangian dynamics, Newton-Euler equations, etc. These methods make substantial use of human knowledge in derivation and manipulation of equations of motion for complex mechanical systems.
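The Kalman-filter difference equations that these spatial recursions reuse can be sketched in their simplest scalar form (a generic random-walk example, not the robot-arm algorithm itself):

```python
def kalman_1d(zs, q, r, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter for a random-walk state, showing the
    predict/update difference equations that the abstract's algorithms
    run as recursions along the arm's links instead of along time.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        p = p + q               # predict: variance grows by process noise
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update with the measurement innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy measurements of a roughly constant value near 1.0:
est = kalman_1d([1.1, 0.9, 1.05, 0.98, 1.02], q=1e-4, r=0.04)
```

Bryson-Frazier smoothing adds a matching backward recursion over the same quantities, which is what gives the spatial algorithms their two-sweep structure.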

  8. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter


    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  9. Modeling of active transmembrane transport in a mixture theory framework. (United States)

    Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T


    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.

  10. Molecular Thermodynamic Modeling of Fluctuation Solution Theory Properties

    DEFF Research Database (Denmark)

    O’Connell, John P.; Abildskov, Jens


    Fluctuation Solution Theory provides relationships between integrals of the molecular pair total and direct correlation functions and the pressure derivative of solution density, partial molar volumes, and composition derivatives of activity coefficients. For dense fluids, the integrals follow ... a relatively simple corresponding-states behavior even for complex systems, show well-defined relationships for infinite dilution properties in complex and near-critical systems, allow estimation of mixed-solvent solubilities of gases and pharmaceuticals, and can be expressed by simple perturbation models...

  11. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier


    Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimaginings: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  12. Game Theory Models for Multi-Robot Patrolling of Infrastructures

    Directory of Open Access Journals (Sweden)

    Erik Hernández


    Full Text Available This work is focused on the problem of performing multi-robot patrolling for infrastructure security applications in order to protect a known environment at critical facilities. Thus, given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Existing solutions for these types of applications are predictable and inflexible. Moreover, most of the previous work has tackled the patrolling problem with centralized and deterministic solutions, and only a few efforts have been made to integrate dynamic methods. Therefore, one of the main contributions of this work is the development of new dynamic and decentralized collaborative approaches that solve the aforementioned problem by implementing learning models from Game Theory. The model selected in this work, Experience-Weighted Attraction, includes belief-based and reinforcement models as special cases. The problem has been defined using concepts of Graph Theory to represent the environment in order to work with such Game Theory techniques. Finally, the proposed methods have been evaluated experimentally using a patrolling simulator, and the results obtained have been compared with previously available approaches.
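The Experience-Weighted Attraction (EWA) update the abstract mentions has a standard textbook form (Camerer and Ho's attraction/experience recursions with logit choice). The sketch below is that generic rule, not the patrolling-specific model; the 2x2 payoff matrix, opponent behavior, and parameter values are invented for illustration.

```python
import math

def ewa_update(attractions, n_obs, chosen, opponent, payoff,
               phi=0.9, delta=0.5, rho=0.9):
    """One Experience-Weighted Attraction step for a single learner.
    attractions: list of A_j; n_obs: experience weight N(t-1);
    chosen: index of the action actually played; payoff(j, opp): payoff
    action j would have earned against the opponent's move."""
    n_new = rho * n_obs + 1.0
    new_attr = []
    for j, a_j in enumerate(attractions):
        weight = 1.0 if j == chosen else delta  # delta discounts foregone payoffs
        a = (phi * n_obs * a_j + weight * payoff(j, opponent)) / n_new
        new_attr.append(a)
    return new_attr, n_new

def choice_probs(attractions, lam=2.0):
    """Logit response: P(j) proportional to exp(lam * A_j)."""
    exps = [math.exp(lam * a) for a in attractions]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical 2x2 patrolling-style game: action 0 = "stay", 1 = "move".
payoff_matrix = [[1.0, 0.0],
                 [0.0, 2.0]]
payoff = lambda j, opp: payoff_matrix[j][opp]

A, N = [0.0, 0.0], 1.0
for _ in range(50):                 # opponent keeps playing action 1
    A, N = ewa_update(A, N, chosen=1, opponent=1, payoff=payoff)
probs = choice_probs(A)
```

Setting delta = 1 recovers belief-based (fictitious-play-like) learning and delta = 0 pure reinforcement, which is the sense in which EWA nests both as special cases.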

  13. Quantile hydrologic model selection and model structure deficiency assessment : 1. Theory

    NARCIS (Netherlands)

    Pande, S.


    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  14. Application of the evolution theory in modelling of innovation diffusion

    Directory of Open Access Journals (Sweden)

    Krstić Milan


    Full Text Available The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In that sense, so-called 'memetic evolution' is today widely accepted. Memes represent a complex adaptable system, where one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information which can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area where the theory of evolution can also be successfully applied. In this work the authors start from the assumption that it is also possible to apply the theory of evolution to the modelling of the process of innovation diffusion. Based on the conducted theoretical research, the authors conclude that the process of innovation diffusion, interpreted through 'memes', is actually the process of imitation of the 'meme' of innovation. Since certain 'memes' replicate more successfully than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their manifestations are of key importance in the sense of longevity, fecundity, and faithful replication. The results of the research confirm the assumption that the theory of evolution can be applied to innovation diffusion through innovation 'memes', which opens up perspectives for new research on the subject.

  15. Models of rational decision making in contemporary economic theory

    Directory of Open Access Journals (Sweden)

    Krstić Bojan


    Full Text Available The aim of this paper is to show that economists cannot adequately explain rational behavior if they focus only on the model of full rationality and the model of instrumental rationality; including related models yields a 'larger view' which, as a more representative reflection of rational behavior, provides a solid basis for constructing decision-making models in contemporary economic science. In line with this aim, the paper is structured as follows. In the first part, we define the model of full rationality and its important characteristics. In the second part, we analyze the model of instrumental rationality, starting from the statement, given in economic theory, that the rational actor uses the best means to achieve his or her objectives. In the third part, we consider the basics of the model of value rationality. In the fourth part, we consider the key characteristics of the model of bounded rationality. In the last part, we question the basic assumptions of the models of full and instrumental rationality, analyzing in particular the personal and social goal preferences of high school and university students.

  16. CMB anomalies from an inflationary model in string theory

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhi-Guo; Piao, Yun-Song [University of Chinese Academy of Sciences, School of Physics, Beijing (China); Guo, Zong-Kuan [Chinese Academy of Sciences, State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, P.O. Box 2735, Beijing (China)


    Recent Planck measurements show some CMB anomalies on large angular scales, which confirms the early observations by WMAP. We show that an inflationary model, in which before the slow-roll inflation the Universe is in a superinflationary phase, can generate a large-scale cutoff in the primordial power spectrum, which may account for not only the power suppression on large angular scales, but also a large dipole power asymmetry in the CMB. We discuss an implementation of our model in string theory. (orig.)

  17. Theory and Circuit Model for Lossy Coaxial Transmission Line

    Energy Technology Data Exchange (ETDEWEB)

    Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert


    The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
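For orientation, the classical low-loss skin-effect approximations that such analyses refine have a standard textbook form. The sketch below computes the characteristic impedance and conductor attenuation of a coax from them; it is not the paper's new formulas, and the RG-58-like geometry is illustrative.

```python
import math

MU0 = 4e-7 * math.pi          # vacuum permeability, H/m
ETA0 = 376.730313             # impedance of free space, ohms

def coax_params(f, a, b, eps_r=2.1, sigma=5.8e7):
    """Low-loss skin-effect approximation for a coaxial line:
    characteristic impedance Z0 (ohms) and conductor attenuation alpha (Np/m).
    a, b: inner/outer conductor radii (m); sigma: wall conductivity (S/m)."""
    z0 = ETA0 / (2.0 * math.pi * math.sqrt(eps_r)) * math.log(b / a)
    rs = math.sqrt(math.pi * f * MU0 / sigma)            # surface resistance
    r_dist = rs / (2.0 * math.pi) * (1.0 / a + 1.0 / b)  # series R, ohms/m
    alpha = r_dist / (2.0 * z0)                          # nepers/m (conductor loss only)
    return z0, alpha

# RG-58-like geometry (illustrative numbers, not from the paper)
z0, alpha = coax_params(f=1e9, a=0.45e-3, b=1.47e-3, eps_r=2.1)
```

The sqrt(f) growth of `rs` is the skin-effect loss that a frequency-independent circuit model cannot capture with a single resistor, which motivates the more careful modeling the abstract describes.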

  18. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj


    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society-highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli

  19. Thick brane models in generalized theories of gravity

    Directory of Open Access Journals (Sweden)

    D. Bazeia


    Full Text Available This work deals with thick braneworld models, in an environment where the Ricci scalar is changed to accommodate the addition of two extra terms, one depending on the Ricci scalar itself, and the other, which takes into account the trace of the energy–momentum tensor of the scalar field that sources the braneworld scenario. We suppose that the scalar field engenders standard kinematics, and we show explicitly that the gravity sector of this new braneworld scenario is linearly stable. We illustrate the general results investigating two distinct models, focusing on how the brane profile is changed in the modified theories.

  20. Theories linguistiques, modeles informatiques, experimentation psycholinguistique (Linguistic Theories, Information-Processing Models, Psycholinguistic Experimentation) (United States)

    Dubois, Daniele


    Delineates and elaborates upon the underlying psychological postulates in linguistic and information-processing models, and shows the interdependence of psycholinguistics and linguistic analysis. (Text is in French.) (DB)

  1. Standard Model in multiscale theories and observational constraints (United States)

    Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David


    We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t*28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t*35 MeV. For α0 = 1/2, the Lamb shift alone yields t*450 GeV.

  2. Future development of the research nuclear reactor IRT-2000 in Sofia

    Energy Technology Data Exchange (ETDEWEB)

    Apostolov, T.G. [Institute for Nuclear Research and Nuclear Energy, BAS, Sofia (Bulgaria)


    This paper presents a short description of the research reactor IRT-2000 Sofia, which started operation in 1961 and was operated for 28 years. Several issues are discussed, relating to improvements required by contemporary safety standards and to the unrealized project to modernize the reactor to 5 MW. Proposals are considered for reconstructing the reactor site into a 'reactor of low power' for education purposes and as a basis for the country's nuclear technology development. (author)

  3. Quantifying 'problematic' DIF within an IRT framework: application to a cancer stigma index. (United States)

    Edelen, Maria Orlando; Stucky, Brian D; Chandra, Anita


    DIF detection within an IRT framework is highly powerful, often identifying significant DIF that is of little clinical importance. This paper introduces two metrics for IRT DIF evaluation that can discern potentially problematic DIF among items flagged with statistically significant DIF. Computation of two DIF metrics-(1) a weighted area between the expected score curves (wABC) and (2) a difference in expected a posteriori scores across item response categories (dEAP)-is described. Their use is demonstrated using data from a 27-item cancer stigma index fielded to four adult samples: (1) Arabic (N = 633) and (2) English speakers (N = 324) residing in Jordan and Egypt, and (3) English (N = 500) and (4) Mandarin speakers (N = 500) residing in China. We used IRTPRO's DIF module to calculate IRT-based Wald chi-square DIF statistics according to language within each region. After standard p value adjustments for multiple comparisons, we further evaluated DIF impact with wABC and dEAP. There were a total of twenty statistically significant DIF comparisons after p value adjustment. The wABCs for these items ranged from 0.13 to 0.90. Upon inspection of curves, DIF comparisons with wABCs >0.3 were deemed potentially problematic and were considered further for removal. The dEAP metric was also informative regarding impact of DIF on expected scores, but less consistently useful for narrowing down potentially problematic items. The calculations of wABC and dEAP function as DIF effect size indicators. Use of these metrics can substantially augment IRT DIF evaluation by discerning truly problematic DIF items among those with statistically significant DIF.
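A wABC-style index can be sketched numerically. The computation below, which weights the absolute gap between two groups' expected score curves by the focal group's latent density, is one plausible reading of the metric described, not IRTPRO's implementation; the 2PL item parameters are invented.

```python
import math

def expected_score_2pl(theta, a, b):
    """Expected item score for a dichotomous 2PL item, i.e. P(correct)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def wabc(params_ref, params_foc, mu_foc=0.0, sd_foc=1.0,
         lo=-6.0, hi=6.0, n=1201):
    """Weighted area between expected score curves: integrate the absolute
    gap over theta, weighting by the focal group's (normal) latent density."""
    step = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        theta = lo + i * step
        dens = (math.exp(-0.5 * ((theta - mu_foc) / sd_foc) ** 2)
                / (sd_foc * math.sqrt(2.0 * math.pi)))
        gap = abs(expected_score_2pl(theta, *params_ref)
                  - expected_score_2pl(theta, *params_foc))
        total += gap * dens * step
    return total

no_dif = wabc((1.2, 0.0), (1.2, 0.0))
uniform_dif = wabc((1.2, 0.0), (1.2, 0.5))   # item harder by 0.5 for the focal group
```

An identical item yields zero area, while a modest difficulty shift yields a nonzero area that could then be compared against a flagging threshold such as the 0.3 mentioned in the abstract.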

  4. The linear model and hypothesis a general unifying theory

    CERN Document Server

    Seber, George


    This book provides a concise and integrated overview of hypothesis testing in four important subject areas, namely linear and nonlinear models, multivariate analysis, and large sample theory. The approach used is a geometrical one based on the concept of projections and their associated idempotent matrices, thus largely avoiding the need to involve matrix ranks. It is shown that all the hypotheses encountered are either linear or asymptotically linear, and that all the underlying models used are either exactly or asymptotically linear normal models. This equivalence can be used, for example, to extend the concept of orthogonality in the analysis of variance to other models, and to show that the asymptotic equivalence of the likelihood ratio, Wald, and Score (Lagrange Multiplier) hypothesis tests generally applies.
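The projection-based view can be made concrete in a few lines. This sketch (with a random design matrix, using numpy) verifies the properties of the orthogonal projection onto the column space that the geometric approach relies on; it is an illustration, not an excerpt from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # design matrix, full column rank
P = X @ np.linalg.inv(X.T @ X) @ X.T  # orthogonal projection onto col(X)

idempotent = np.allclose(P @ P, P)    # P^2 = P, the defining property
symmetric = np.allclose(P, P.T)       # orthogonal projections are symmetric
rank = int(round(np.trace(P)))        # trace of a projection equals its rank
```

Fitted values are then y_hat = P y and residuals are (I - P) y, with (I - P) itself an idempotent projection onto the orthogonal complement.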

  5. Visceral obesity and psychosocial stress: a generalised control theory model (United States)

    Wallace, Rodrick


    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  6. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)


    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)

  7. Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model (United States)

    Aktaruzzaman, Md; Plunkett, Margaret


    Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…

  8. Dynamic IRT for the frescoes assessment: the study case of Danza Macabra in Clusone (Italy) (United States)

    Ludwig, Nicola; Rosina, Elisabetta


    IRT technique applications for detecting plaster defects in historic buildings are widely documented in the scientific literature. Previous studies demonstrated the advantages of tomographic techniques for obtaining quantitative results by IRT. With a quantitative approach, dynamic IRT measurements (thermal contrast versus time and its maximum value) make it possible to locate a delamination and calculate its volume inside the thickness of the plaster. Nevertheless, these effects are not prominent compared with the ones caused by the interaction between surface and irradiation, and they are lower than the noise due to the approximation of the spectral coefficient value. The authors have already shown that multispectral evaluation of the reflectance coefficients, in the visible and near-IR range, contributes to a proper evaluation of thermograms shot on surfaces affected by chromatic alterations. In the study case, the evolution of the surface temperature in time makes it possible to quantify the effects of spectral absorption (absorbance) in the thermograms. By comparing the thermograms to the maps of damage and intervention, it has been possible to correlate the materials and their state of conservation to the evolution of the thermal profile corresponding to each analyzed area.

  9. Plane answers to complex questions the theory of linear models

    CERN Document Server

    Christensen, Ronald


    This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...

  10. Comparison of Polytomous Parametric and Nonparametric Item Response Theory Models

    Directory of Open Access Journals (Sweden)



    Full Text Available This research aimed to identify the effects of independent variables such as sample size, sample distribution, the number of items in the test, and the number of response categories of items on the estimates of the Graded Response Model (GRM) under Parametric Item Response Theory (PIRT) and the Monotone Homogeneity Model (MHM) under Non-Parametric Item Response Theory (NIRT) for polytomously scored items. To achieve this aim, the research was performed as a fundamental study in which 192 simulation conditions were designed by combining sample size, sample distribution, the number of items, and the number of categories of items. Estimates by GRM and MHM were examined under different levels of sample size (N = 100, 250, 500, 1000), sample distribution (normal, skewed), the number of items (10, 20, 40, 80), and the number of categories of items (3, 5, 7), by calculating model-data fit, reliability values, and standard errors of parameters, respectively. As a result of the research, it was found that, because the values used to evaluate model-data fit are influenced by the number of variables and cannot be interpreted in isolation, it is difficult to compare and generalize the results. The practical calculation of model-data fit in MHM, which can be interpreted without the need for another value, provides an advantage over GRM. Another result is that the reliability values are similar for both models. The standard errors of the MHM parameter estimates are lower than those of the GRM estimates under small-sample and few-item conditions, and the standard errors of the MHM parameter estimates are close to each other in all conditions.
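The GRM data-generating process behind such simulation studies can be sketched directly: cumulative category probabilities follow a logistic in theta, and category probabilities come from differencing. The item parameters and sample sizes below are invented for illustration.

```python
import math
import random

def grm_probs(theta, a, thresholds):
    """Graded Response Model category probabilities for one item:
    P(X >= k) = logistic(a * (theta - b_k)); categories by differencing."""
    cum = ([1.0]
           + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
           + [0.0])
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

def simulate_response(theta, a, thresholds, rng):
    """Draw one polytomous response by inverting the category CDF."""
    u, acc = rng.random(), 0.0
    for k, p in enumerate(grm_probs(theta, a, thresholds)):
        acc += p
        if u < acc:
            return k
    return len(thresholds)

rng = random.Random(42)
thresholds = [-1.0, 0.0, 1.0]        # 4 ordered categories
low = [simulate_response(-2.0, 1.5, thresholds, rng) for _ in range(2000)]
high = [simulate_response(2.0, 1.5, thresholds, rng) for _ in range(2000)]
```

Generating data this way and fitting both GRM and MHM to the same responses is the basic design that the 192 simulation conditions vary.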

  11. An Investigation of Methods for Reducing Sampling Error in Certain IRT (Item Response Theory) Procedures. (United States)



  12. Theory, modelling and simulation in origins of life studies. (United States)

    Coveney, Peter V; Swadling, Jacob B; Wattis, Jonathan A D; Greenwell, H Christopher


    Origins of life studies represent an exciting and highly multidisciplinary research field. In this review we focus on the contributions made by theory, modelling and simulation to addressing fundamental issues in the domain and the advances these approaches have helped to make in the field. Theoretical approaches will continue to make a major impact at the "systems chemistry" level based on the analysis of the remarkable properties of nonlinear catalytic chemical reaction networks, which arise due to the auto-catalytic and cross-catalytic nature of so many of the putative processes associated with self-replication and self-reproduction. In this way, we describe inter alia nonlinear kinetic models of RNA replication within a primordial Darwinian soup, the origins of homochirality and homochiral polymerization. We then discuss state-of-the-art computationally-based molecular modelling techniques that are currently being deployed to investigate various scenarios relevant to the origins of life.
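As a minimal caricature of the nonlinear replication kinetics mentioned (not any specific model from the review), an autocatalytic replicator with a finite resource pool can be integrated by simple Euler stepping; all parameter values are invented.

```python
def simulate_autocatalysis(k=1.0, K=1.0, x0=1e-3, dt=1e-3, t_end=20.0):
    """Euler integration of dx/dt = k*x*(1 - x/K): an autocatalytic
    replicator whose growth saturates as the resource pool K is consumed."""
    x, t, traj = x0, 0.0, []
    while t < t_end:
        x += dt * k * x * (1.0 - x / K)   # growth rate proportional to x itself
        t += dt
        traj.append(x)
    return traj

traj = simulate_autocatalysis()
```

The sigmoidal trajectory (slow start, explosive middle, saturation) is the signature of autocatalysis; cross-catalytic networks couple several such equations so that each species' growth term depends on the others.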

  13. Structure and asymptotic theory for nonlinear models with GARCH errors

    Directory of Open Access Journals (Sweden)

    Felix Chan


    Full Text Available Nonlinear time series models, especially those with regime-switching and/or conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we derive sufficient conditions for strict stationarity and ergodicity of three different specifications of the first-order smooth transition autoregressions with heteroskedastic errors. This is essential, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. We also provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator for a general nonlinear conditional mean model with first-order GARCH errors.
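A minimal simulation of the process class studied, a first-order logistic smooth transition autoregression with GARCH(1,1) errors, can make the specification concrete. The parameter values are illustrative and chosen inside the standard stationarity region (alpha + beta < 1); this is a sketch, not the paper's specifications.

```python
import math
import random

def simulate_star_garch(n=5000, phi1=0.3, phi2=-0.5, gamma=5.0, c=0.0,
                        omega=0.05, alpha=0.1, beta=0.85, seed=1):
    """Logistic STAR(1) mean equation with GARCH(1,1) errors:
    y_t = phi1*y_{t-1} + phi2*y_{t-1}*G(y_{t-1}) + e_t,  e_t = sigma_t*z_t,
    sigma_t^2 = omega + alpha*e_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = random.Random(seed)
    y, e = 0.0, 0.0
    sig2 = omega / (1.0 - alpha - beta)      # start at unconditional variance
    ys = []
    for _ in range(n):
        sig2 = omega + alpha * e * e + beta * sig2
        G = 1.0 / (1.0 + math.exp(-gamma * (y - c)))   # smooth transition function
        e = math.sqrt(sig2) * rng.gauss(0.0, 1.0)
        y = phi1 * y + phi2 * y * G + e
        ys.append(y)
    return ys

ys = simulate_star_garch()
```

As the transition function G moves between 0 and 1, the effective AR coefficient shifts smoothly between phi1 and phi1 + phi2, which is the regime-switching behavior whose stationarity conditions the paper derives.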

  14. Dynamic statistical models of biological cognition: insights from communications theory (United States)

    Wallace, Rodrick


    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary-conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  15. A queueing theory based model for business continuity in hospitals. (United States)

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R


    Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time which includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used for evaluating the possible interventions and for protecting the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
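The queueing side of such a model can be sketched with the standard M/M/c Erlang-C formula, sizing the device count so that the probability a request must wait stays below a target. The arrival and service rates below are invented for illustration and are not from the case study.

```python
import math

def erlang_c(c, lam, mu):
    """Probability that a request must wait in an M/M/c queue (Erlang-C).
    lam: arrival rate; mu: service rate per device; c: number of devices."""
    a = lam / mu                      # offered load in erlangs
    rho = a / c
    if rho >= 1.0:
        return 1.0                    # unstable: waiting is certain
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * (1.0 / (1.0 - rho))
    return top / (s + top)

def devices_needed(lam, mu, max_wait_prob=0.1):
    """Smallest number of devices keeping P(wait) below the target."""
    c = max(1, math.ceil(lam / mu))
    while erlang_c(c, lam, mu) > max_wait_prob:
        c += 1
    return c

# Illustrative figures (not from the paper): 4 requests/hour, 30-minute mean use.
n = devices_needed(lam=4.0, mu=2.0)
```

This is the sense in which sizing by working load alone is not enough: two erlangs of offered load needs noticeably more than two devices once waiting-time targets, let alone failure risk, are taken into account.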

  16. Modelling apical constriction in epithelia using elastic shell theory. (United States)

    Jones, Gareth Wyn; Chapman, S Jonathan


    Apical constriction is one of the fundamental mechanisms by which embryonic tissue is deformed, giving rise to the shape and form of the fully-developed organism. The mechanism involves a contraction of fibres embedded in the apical side of epithelial tissues, leading to an invagination or folding of the cell sheet. In this article the phenomenon is modelled mechanically by describing the epithelial sheet as an elastic shell, which contains a surface representing the continuous mesh formed from the embedded fibres. Allowing this mesh to contract, an enhanced shell theory is developed in which the stiffness and bending tensors of the shell are modified to include the fibres' stiffness, and in which the active effects of the contraction appear as body forces in the shell equilibrium equations. Numerical examples are presented at the end, including the bending of a plate and a cylindrical shell (modelling neurulation) and the invagination of a spherical shell (modelling simple gastrulation).

  17. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.


    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \to \pi \ell \nu$ ...

  18. Density Functional Theory and Materials Modeling at Atomistic Length Scales

    Directory of Open Access Journals (Sweden)

    Swapan K. Ghosh


    Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT, the central equation is a one-particle Schrödinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.

  19. Criticism of the Classical Theory of Macroeconomic Modeling

    Directory of Open Access Journals (Sweden)

    Konstantin K. Kumehov


    Full Text Available Abstract: Current approaches to modeling macroeconomic systems do not generate results that can be used in applications. This is largely because the dominant economic schools and research directions build their theories on misconceptions about the economic system as an object of modeling and share no common methodological approach to the design of macroeconomic models. All of them focus on models aimed at establishing equilibrium parameters of supply and demand, production and consumption, while the underlying factors, namely the resource potential and society's needs for material and other goods, are not considered. In addition, there is no unity in the choice of model elements or in the mechanisms of interaction between them: no criteria are established for deciding whether the elements of a model should be institutions, industries, the population, banks, classes, and so on. Methodologically, all the best-known authors design their models by extrapolating past states or past events. As a result, by the time a model is ready the situation has changed, the past parameters underlying it have lost relevance, and at best the researcher is left interpreting events and parameters that will not recur. In this paper, an analysis of the works of well-known authors belonging to different schools and directions reveals the weaknesses of their macroeconomic models that prevent their use for applied problems of economic development. Fundamentally new approaches and methods are proposed by which macroeconomic models can be constructed that take into account both the theoretical and the applied aspects of modeling, and the basic methodological requirements for such models are formulated.


    Directory of Open Access Journals (Sweden)

    Fajrianthi Fajrianthi


    Full Text Available This study aimed to produce a valid and reliable measure (test) of critical thinking for use in both educational and work settings in Indonesia. The research stages followed the test-development procedure of Hambleton and Jones (1993). The test blueprint and item writing were based on the concepts of the Watson-Glaser Critical Thinking Appraisal (WGCTA), in which critical thinking comprises five dimensions: Inference, Recognition of Assumptions, Deduction, Interpretation, and Evaluation of Arguments. The test was piloted on 1,453 participants in employee selection tests in Surabaya, Gresik, Tuban, Bojonegoro, and Rembang. The dichotomous data were analyzed with a two-parameter IRT model (item discrimination and item difficulty) using the statistical program Mplus version 6.11. Before the IRT analysis, the assumptions of unidimensionality and local independence were tested and the item characteristic curves (ICC) were examined. Of the 68 items analyzed, 15 showed adequately good discrimination, with item difficulties ranging from -4 to 2.448. The small number of good-quality items is attributable to weaknesses in selecting subject-matter experts in critical thinking and in the choice of scoring method. Keywords: test development, critical thinking, item response theory DEVELOPING CRITICAL THINKING TEST UTILISING ITEM RESPONSE THEORY Abstract The present study was aimed to develop a valid and reliable instrument for assessing critical thinking which can be implemented both in educational and work settings in Indonesia. Following Hambleton and Jones's (1993) procedures for test development, the study developed the instrument by employing the concept of critical thinking from the Watson-Glaser Critical Thinking Appraisal (WGCTA). The study included five dimensions of critical thinking as adopted from the WGCTA: Inference, Recognition
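
The two-parameter model fitted in the study above has a standard closed form. A minimal sketch follows; the discrimination value a = 1.0 is illustrative, while the difficulties b = 2.448 and b = -4 bracket the reported difficulty range.

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that an examinee
    of ability theta answers an item with discrimination a and difficulty b
    correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# For an average examinee (theta = 0), the hardest retained item
# (b = 2.448) is far less likely to be solved than the easiest (b = -4).
hard = p_correct(0.0, 1.0, 2.448)
easy = p_correct(0.0, 1.0, -4.0)
```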

  1. Multiagent model and mean field theory of complex auction dynamics (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng


    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
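
The qualitative shape the authors report, a rise to a low-price mode followed by exponential decay, can be illustrated with a simple functional form. This is only a stand-in with the same inverted-J profile, not the paper's mean-field solution, and the decay constant lam is an assumption.

```python
import math

def bid_density(b, lam=0.05):
    """Illustrative inverted-J shape for a bid-price distribution:
    f(b) proportional to b * exp(-lam * b) rises to a low-price mode and
    then decays exponentially in the large-bid-price region. Qualitative
    stand-in only; lam is an assumed decay constant."""
    return b * math.exp(-lam * b)

# The mode sits at b = 1/lam; large bid prices are exponentially suppressed
mode = max(range(1, 200), key=bid_density)
```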

  2. Effective Field Theory and the Gamow Shell Model


    Rotureau, J.; van Kolck, U.


    We combine Halo/Cluster Effective Field Theory (H/CEFT) and the Gamow Shell Model (GSM) to describe the $0^+$ ground state of $\\rm{^6He}$ as a three-body halo system. We use two-body interactions for the neutron-alpha particle and two-neutron pairs obtained from H/CEFT at leading order, with parameters determined from scattering in the p$_{3/2}$ and s$_0$ channels, respectively. The three-body dynamics of the system is solved using the GSM formalism, where the continuum states are incorporate...

  3. Mean-field theory and self-consistent dynamo modeling

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)


    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  4. Model for urban and indoor cellular propagation using percolation theory (United States)

    Franceschetti, G.; Marano, S.; Pasquino, N.; Pinto, I. M.


    A method for the analysis and statistical characterization of wave propagation in indoor and urban cellular radio channels is presented, based on a percolation model. Pertinent principles of the theory are briefly reviewed, and applied to the problem of interest. Relevant quantities, such as pulsed-signal arrival rate, number of reflections against obstacles, and path lengths are deduced and related to basic environment parameters such as obstacle density and transmitter-receiver separation. Results are found to be in good agreement with alternative simulations and measurements.
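
A toy version of the obstacle model shows how a quantity such as path length relates to obstacle density. The independence assumption below (each lattice cell occupied with probability `density`) is an illustration, not the paper's exact percolation setup.

```python
import random

def mean_free_path(density, trials=10000, rng=random.Random(0)):
    """Toy percolation propagation model: each lattice cell contains an
    obstacle independently with probability `density`, and a ray crosses
    cells until it hits one. Returns the average number of free cells
    crossed; the exact mean is (1 - density) / density (geometric law)."""
    total = 0
    for _ in range(trials):
        while rng.random() > density:
            total += 1
    return total / trials

mfp = mean_free_path(0.5)   # exact mean free path is 1.0 at density 0.5
```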

  5. Theory and Modeling of High-Power Gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Nusinovich, Gregory Semeon [Univ. of Maryland, College Park, MD (United States)


    This report summarizes the work performed at the Institute for Research in Electronics and Applied Physics of the University of Maryland (College Park, MD) in the framework of the DOE Grant “Theory and Modeling of High-Power Gyrotrons” during 2011-2014. The research proceeded in three directions: - possibilities of stable gyrotron operation in very high-order modes offering output power exceeding the 1 MW level in long-pulse/continuous-wave regimes, - the effect of small imperfections in gyrotron fabrication and alignment on gyrotron efficiency and operation, - some issues in the physics of beam-wave interaction in gyrotrons.

  6. The origins of the random walk model in financial theory


    Walter, Christian


    This text constitutes chapter 2 of the book Le modèle de marche au hasard en finance by Christian Walter, to be published by Economica in the "Audit, assurance, actuariat" series in June 2013, and is reproduced here with the publisher's agreement. Three main concerns pave the way for the birth of the random walk model in financial theory: an ethical issue with Jules Regnault (1834-1894), a scientific issue with Louis Bachelier (1870-1946) and a practical issue with Alfred Cowles (1891-1984). Three to...

  7. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model, and Self-Determination Theory, i.e. general causality orientations. Twelve hundred and eighty-seven freshmen (mean age 21.71; 64% women) completed electronic questionnaires of personality traits (NEO-FFI) and causality orientations (GCOS). To test whether covariance between traits and orientations could be attributed to shared or separate latent variables we conducted joint factor analyses. Results reveal that the Autonomy orientation can be distinguished from...

  8. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models. (United States)

    Oizumi, Ryo; Kuniya, Toshikazu; Enatsu, Yoichi


    Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.
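
The carrying capacity K and intrinsic growth rate r that give r/K selection its name come from the textbook logistic model. A minimal numerical sketch, with Euler stepping and illustrative parameter values:

```python
def logistic_step(n, r, K, dt=0.1):
    """One Euler step of the logistic equation dN/dt = r * N * (1 - N / K),
    the classical population model behind the r (intrinsic growth rate)
    and K (carrying capacity) of r/K selection terminology."""
    return n + dt * r * n * (1.0 - n / K)

# Illustrative parameters: a small founding population grows until density
# effects stop it at the carrying capacity.
n = 1.0
for _ in range(5000):
    n = logistic_step(n, r=0.5, K=100.0)
# The population settles at K = 100
```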

  9. Rigorously testing multialternative decision field theory against random utility models. (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg


    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
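
One of the two random utility benchmarks above, the multinomial logit, has a simple closed form for choice probabilities: a softmax of the options' deterministic utilities. A minimal sketch with illustrative utility values:

```python
import math

def logit_choice_probs(utilities):
    """Multinomial logit choice rule: the probability of choosing each
    option is the softmax of the options' deterministic utilities."""
    exps = [math.exp(u) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]

# Illustrative utilities for three consumer products: higher utility
# translates into a higher choice probability.
probs = logit_choice_probs([1.0, 0.5, 0.0])
```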

  10. Corvid re-caching without 'theory of mind': a model.

    Directory of Open Access Journals (Sweden)

    Elske van der Vaart

    Full Text Available Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  11. Measurement Bias across Gender on the Children's Depression Inventory: Evidence for Invariance from Two Latent Variable Models (United States)

    Carle, Adam C.; Millsap, Roger E.; Cole, David A.


    Confirmatory factor analysis for ordered-categorical measures (CFA-OCM) and rating scale item response theory (IRT) analyses explore measurement bias across gender on the Children's Depression Inventory (CDI) in a community sample of 779 children in the third and sixth grades. Given the set of statistical criteria, IRT and CFA-OCM generally…

  12. Fuzzy structure theory modeling of sound-insulation layers in complex vibroacoustic uncertain systems: theory and experimental validation. (United States)

    Fernandez, Charles; Soize, Christian; Gagliardini, Laurent


    The fuzzy structure theory was introduced 20 years ago to model the effects of imprecisely known complex subsystems on a master structure, and was aimed only at structural dynamics. In this paper, an extension of that theory is proposed by developing an elastoacoustic element useful for modeling sound-insulation layers in computational vibroacoustics of complex systems. The simplified model that is constructed reduces computation time and memory use because the number of physical and generalized degrees of freedom in the computational vibroacoustic model is not increased. However, these simplifications introduce model uncertainties, which are taken into account with the recently introduced nonparametric probabilistic approach. A robust simplified model for sound-insulation layers is then obtained, controlled by a small number of physical and dispersion parameters. First, the extension of the fuzzy structure theory to the elastoacoustic element is presented. Second, the computational vibroacoustic model including such an elastoacoustic element to model a sound-insulation layer is given. Then, a design methodology to identify the model parameters from experiments is proposed and experimentally validated. Finally, the theory is applied to an uncertain vibroacoustic system.

  13. Comparing theory-based condom interventions: health belief model versus theory of planned behavior. (United States)

    Montanaro, Erika A; Bryan, Angela D


    This study sought to experimentally manipulate the core constructs of the Health Belief Model (HBM) and the Theory of Planned Behavior (TPB) in order to compare the success of interventions to increase preparatory condom use behavior (i.e., purchasing condoms, talking to a boyfriend or girlfriend about using condoms, and carrying condoms) based on these theories. A total of 258 participants were randomly assigned to one of three computer-based interventions (HBM, TPB, or information-only control). A total of 204 (79.1%) completed follow-up assessments 1 month later. Regression analyses were conducted to determine which set of theoretical constructs accounted for the most variance in behavior at baseline. A series of structural equation models were estimated to determine which constructs were the "active ingredients" of change. The TPB accounted for 32.8% of the variance in risky sexual behavior at baseline, while the HBM only explained 1.6% of the variance. Mediational analyses revealed differential intervention effects on perceived susceptibility, perceived benefits, and attitudes toward condom use. However, it was attitudes toward condom use and condom use self-efficacy that were associated with intentions, which then predicted preparatory condom use behavior at follow-up. Except for attitudes, the mediators that were successfully manipulated by the interventions (i.e., perceived susceptibility, perceived severity, and attitudes) were not the same constructs that predicted intentions (i.e., attitudes and condom use self-efficacy), and subsequently predicted behavior. This suggests that the constructs that explain behavior are not the same as those that produce behavior change.

  14. Theory-guided exploration with structural equation model forests. (United States)

    Brandmaier, Andreas M; Prindle, John J; McArdle, John J; Lindenberger, Ulman


    Structural equation model (SEM) trees, a combination of SEMs and decision trees, have been proposed as a data-analytic tool for theory-guided exploration of empirical data. With respect to a hypothesized model of multivariate outcomes, such trees recursively find subgroups with similar patterns of observed data. SEM trees allow for the automatic selection of variables that predict differences across individuals in specific theoretical models, for instance, differences in latent factor profiles or developmental trajectories. However, SEM trees are unstable when small variations in the data can result in different trees. As a remedy, SEM forests, which are ensembles of SEM trees based on resamplings of the original dataset, provide increased stability. Because large forests are less suitable for visual inspection and interpretation, aggregate measures provide researchers with hints on how to improve their models: (a) variable importance is based on random permutations of the out-of-bag (OOB) samples of the individual trees and quantifies, for each variable, the average reduction of uncertainty about the model-predicted distribution; and (b) case proximity enables researchers to perform clustering and outlier detection. We provide an overview of SEM forests and illustrate their utility in the context of cross-sectional factor models of intelligence and episodic memory. We discuss benefits and limitations, and provide advice on how and when to use SEM trees and forests in future research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
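
The OOB permutation idea behind the forests' variable importance can be sketched generically. The toy quadratic loss below stands in for a SEM's lack-of-fit measure; all names and data are illustrative.

```python
import random

def permutation_importance(loss, X, y, col, rng=random.Random(0)):
    """Permutation-based variable importance in the spirit of SEM forests:
    the increase in (out-of-bag) loss after randomly shuffling one
    predictor column. `loss` stands in for a model's OOB lack-of-fit."""
    base = loss(X, y)
    shuffled = [row[:] for row in X]
    vals = [row[col] for row in shuffled]
    rng.shuffle(vals)
    for row, v in zip(shuffled, vals):
        row[col] = v
    return loss(shuffled, y) - base

# Toy check: y equals predictor 0, so permuting column 0 raises the loss
# while permuting the irrelevant column 1 leaves it unchanged.
X = [[float(i), 0.0] for i in range(50)]
y = [row[0] for row in X]
mse = lambda X_, y_: sum((row[0] - t) ** 2 for row, t in zip(X_, y_)) / len(y_)
imp_relevant = permutation_importance(mse, X, y, 0)
imp_irrelevant = permutation_importance(mse, X, y, 1)
```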

  15. Exploring Bayesian model selection methods for effective field theory expansions (United States)

    Schaffner, Taylor; Yamauchi, Yukari; Furnstahl, Richard


    A fundamental understanding of the microscopic properties and interactions of nuclei has long evaded physicists due to the complex nature of quantum chromodynamics (QCD). One approach to modeling nuclear interactions is known as chiral effective field theory (EFT). Today, the method's greatest limitation lies in the approximation of interaction potentials and their corresponding uncertainties. Computing EFT expansion coefficients, known as Low-Energy Constants (LECs), from experimental data reduces to a problem of statistics and fitting. In the conventional approach, the fitting is done using frequentist methods that fail to evaluate the quality of the model itself (e.g., how many orders to use) in addition to its fit to the data. By utilizing Bayesian statistical methods for model selection, the model's quality can be taken into account, providing a more controlled and robust EFT expansion. My research involves probing different Bayesian model checking techniques to determine the most effective means for use with estimating the values of LECs. In particular, we are using model problems to explore the Bayesian calculation of an EFT expansion's evidence and an approximation to this value known as the WAIC (Widely Applicable Information Criterion). This work was supported in part by the National Science Foundation under Grant No. PHY-1306250.
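
The WAIC mentioned above has a standard estimator built from pointwise log-likelihoods over posterior draws; the data in the usage line are illustrative, not from the EFT fits.

```python
import math

def waic(log_lik):
    """WAIC from pointwise log-likelihoods log_lik[s][i] = log p(y_i | theta_s)
    over posterior draws s: -2 * (lppd - p_waic), where p_waic is the
    variance-based effective-parameter penalty (standard formula)."""
    S, n = len(log_lik), len(log_lik[0])
    lppd = 0.0
    p_waic = 0.0
    for i in range(n):
        col = [log_lik[s][i] for s in range(S)]
        lppd += math.log(sum(math.exp(l) for l in col) / S)
        mean = sum(col) / S
        p_waic += sum((l - mean) ** 2 for l in col) / (S - 1)
    return -2.0 * (lppd - p_waic)

# Illustrative draws: with no posterior variability the penalty vanishes,
# so WAIC reduces to -2 * lppd = -2 * (-1 - 2) = 6.
w = waic([[-1.0, -2.0]] * 4)
```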

  16. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models (United States)

    Funnell, Sue C.; Rogers, Patricia J.


    Between good intentions and great results lies a program theory--not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. "Purposeful Program Theory" shows how to develop,…

  17. An Evolutionary Game Theory Model of Spontaneous Brain Functioning. (United States)

    Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano


    Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
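
EGN builds on the replicator dynamics of evolutionary game theory. The sketch below shows only that classical building block; the paper's model adds coupling between networked brain regions, which is omitted here, and the payoff matrix is an illustrative coordination game.

```python
def replicator_step(x, payoff, dt=0.01):
    """One Euler step of the replicator dynamics x_i' = x_i * (f_i - f_bar),
    the evolutionary game theory core that the EGN model extends with
    coupling between networked nodes (coupling omitted here)."""
    n = len(x)
    f = [sum(payoff[i][j] * x[j] for j in range(n)) for i in range(n)]
    fbar = sum(xi * fi for xi, fi in zip(x, f))
    return [xi + dt * xi * (fi - fbar) for xi, fi in zip(x, f)]

# Illustrative 2x2 coordination game: from this starting mixture the
# population converges to the payoff-dominant pure strategy.
x = [0.6, 0.4]
A = [[2.0, 0.0], [0.0, 1.0]]
for _ in range(5000):
    x = replicator_step(x, A)
```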

  18. Overexpression of ZmIRT1 and ZmZIP3 Enhances Iron and Zinc Accumulation in Transgenic Arabidopsis.

    Directory of Open Access Journals (Sweden)

    Suzhen Li

    Full Text Available Iron and zinc are important micronutrients for both the growth and nutrient availability of crop plants, and their absorption is tightly controlled by a metal uptake system. Zinc-regulated transporters, iron-regulated transporter-like proteins (ZIP, is considered an essential metal transporter for the acquisition of Fe and Zn in graminaceous plants. Several ZIPs have been identified in maize, although their physiological function remains unclear. In this report, ZmIRT1 was shown to be specifically expressed in silk and embryo, whereas ZmZIP3 was a leaf-specific gene. Both ZmIRT1 and ZmZIP3 were shown to be localized to the plasma membrane and endoplasmic reticulum. In addition, transgenic Arabidopsis plants overexpressing ZmIRT1 or ZmZIP3 were generated, and the metal contents in various tissues of transgenic and wild-type plants were examined based on ICP-OES and Zinpyr-1 staining. The Fe and Zn concentration increased in roots and seeds of ZmIRT1-overexpressing plants, while the Fe content in shoots decreased. Overexpressing ZmZIP3 enhanced Zn accumulation in the roots of transgenic plants, while that in shoots was repressed. In addition, the transgenic plants showed altered tolerance to various Fe and Zn conditions compared with wild-type plants. Furthermore, the genes associated with metal uptake were stimulated in ZmIRT1 transgenic plants, while those involved in intra- and inter- cellular translocation were suppressed. In conclusion, ZmIRT1 and ZmZIP3 are functional metal transporters with different ion selectivities. Ectopic overexpression of ZmIRT1 may stimulate endogenous Fe uptake mechanisms, which may facilitate metal uptake and homeostasis. Our results increase our understanding of the functions of ZIP family transporters in maize.

  19. Gibberellic acid alleviates cadmium toxicity by reducing nitric oxide accumulation and expression of IRT1 in Arabidopsis thaliana

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xiao Fang [State Key Laboratory of Plant Physiology and Biochemistry, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China); Jiang, Tao [Key Laboratory of Conservation Biology for Endangered Wildlife of the Ministry of Education, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China); Wang, Zhi Wei [State Key Laboratory of Plant Physiology and Biochemistry, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China); Lei, Gui Jie [Key Laboratory of Conservation Biology for Endangered Wildlife of the Ministry of Education, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China); Shi, Yuan Zhi [The Key Laboratory of Tea Chemical Engineering, Ministry of Agriculture, Yunqi Road 1, Hangzhou 310008 (China); Li, Gui Xin, E-mail: [College of Agronomy and Biotechnology, Zhejiang University, Hangzhou 310058 (China); Zheng, Shao Jian [State Key Laboratory of Plant Physiology and Biochemistry, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China); Key Laboratory of Conservation Biology for Endangered Wildlife of the Ministry of Education, College of Life Sciences, Zhejiang University, Hangzhou 310058 (China)


    Highlights: • Cd reduces endogenous GA levels in Arabidopsis. • Exogenously applied GA decreases Cd accumulation in the plant. • GA suppresses the Cd-induced accumulation of NO. • The decreased NO level downregulates the expression of IRT1. • Suppressed IRT1 expression reduces Cd transport across the plasma membrane. - Abstract: Gibberellic acid (GA) is involved not only in plant growth and development but also in plant responses to abiotic stresses. Here it was found that treating plants with GA at concentrations from 0.1 to 5 µM for 24 h had no obvious effect on root elongation in the absence of cadmium (Cd), whereas in the presence of Cd²⁺, GA at 5 µM improved root growth and reduced Cd content and lipid peroxidation in the roots, indicating that GA can partially alleviate Cd toxicity. Cd²⁺ increased nitric oxide (NO) accumulation in the roots, but GA remarkably reduced it and suppressed the up-regulation of IRT1 expression. In contrast, the beneficial effect of GA on alleviating Cd toxicity was not observed in the IRT1 knock-out mutant irt1, suggesting the involvement of IRT1 in Cd²⁺ absorption. Furthermore, the GA-induced reduction of NO and Cd content could also be partially reversed by the application of a NO donor (S-nitrosoglutathione [GSNO]). Taken together, these results show that GA-alleviated Cd toxicity is mediated through the reduction of Cd-dependent NO accumulation and of the expression of the Cd²⁺ uptake-related gene IRT1 in Arabidopsis.

  20. Stellar rotation periods determined from simultaneously measured Ca II H&K and Ca II IRT lines (United States)

    Mittag, M.; Hempelmann, A.; Schmitt, J. H. M. M.; Fuhrmeister, B.; González-Pérez, J. N.; Schröder, K.-P.


    Aims: Previous studies have shown that, for late-type stars, activity indicators derived from the Ca II infrared-triplet (IRT) lines are correlated with the indicators derived from the Ca II H&K lines. Therefore, the Ca II IRT lines are in principle usable for activity studies, but they may be less sensitive when measuring the rotation period. Our goal is to determine whether the Ca II IRT lines are sufficiently sensitive to measure rotation periods and how any Ca II IRT derived rotation periods compare with periods derived from the "classical" Mount Wilson S-index. Methods: To analyse the Ca II IRT lines' sensitivity and to measure rotation periods, we define an activity index for each of the Ca II IRT lines similar to the Mount Wilson S-index and perform a period analysis for the lines separately and jointly. Results: For eleven late-type stars we can measure the rotation periods using the Ca II IRT indices similar to those found in the Mount Wilson S-index time series and find that a period derived from all four indices gives the most probable rotation period; we find good agreement for stars with already existing literature values. In a few cases the computed periodograms show a complicated structure with multiple peaks, meaning that formally different periods are derived in different indices. We show that in one case, this is due to data sampling effects and argue that denser cadence sampling is necessary to provide credible evidence for differential rotation. However, our TIGRE data for HD 101501 shows good evidence for the presence of differential rotation.
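
Period searches of this kind fit sinusoids to unevenly sampled index time series. Below is a minimal least-squares stand-in, not the authors' periodogram code: cross terms between the sine and cosine basis are neglected for brevity, and the synthetic data are illustrative.

```python
import math

def sine_power(t, y, period):
    """Least-squares power of a sinusoid of the given period fitted to an
    unevenly sampled time series; a minimal stand-in for the periodogram
    analysis of activity indices (sine/cosine cross terms neglected)."""
    w = 2.0 * math.pi / period
    c = [math.cos(w * ti) for ti in t]
    s = [math.sin(w * ti) for ti in t]
    yc = sum(yi * ci for yi, ci in zip(y, c))
    ys = sum(yi * si for yi, si in zip(y, s))
    return yc * yc / sum(ci * ci for ci in c) + ys * ys / sum(si * si for si in s)

# Synthetic activity index with a 25-day rotation period, unevenly sampled:
# the fitted power peaks at the true period rather than at spurious ones.
t = [1.3, 4.1, 9.7, 14.2, 20.5, 26.8, 31.1, 38.6, 44.9, 50.2]
y = [math.sin(2.0 * math.pi * ti / 25.0) for ti in t]
```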

  1. The Effect of Differential Motivation on IRT Linking (United States)

    Mittelhaëuser, Marie-Anne; Béguin, Anton A.; Sijtsma, Klaas


    The purpose of this study was to investigate whether simulated differential motivation between the stakes for operational tests and anchor items produces an invalid linking result if the Rasch model is used to link the operational tests. This was done for an external anchor design and a variation of a pretest design. The study also investigated…
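
Because the Rasch model has only a location parameter, anchor-based linking of two calibrations reduces to a single additive constant. A minimal mean/mean sketch with illustrative anchor difficulties:

```python
def mean_mean_shift(b_anchor_old, b_anchor_new):
    """Mean/mean linking for the Rasch model: the constant that places a new
    calibration on the old scale is the difference between the anchor
    items' mean difficulties in the two calibrations."""
    return (sum(b_anchor_old) / len(b_anchor_old)
            - sum(b_anchor_new) / len(b_anchor_new))

# Illustrative anchors that drifted by a constant +0.3 logits in the new
# calibration; adding the shift maps new difficulties back onto the old scale.
old = [-1.0, 0.0, 1.0]
new = [-0.7, 0.3, 1.3]
shift = mean_mean_shift(old, new)
```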

  2. Defining Deficient Items by IRT Analysis of Calibration Data. (United States)

    Krass, Iosif A.; Thomasson, Gary L.

    New items are being calibrated for the next generation of the computerized adaptive (CAT) version of the Armed Services Vocational Aptitude Battery (ASVAB) (Forms 5 and 6). The requirements that the items be "good" three-parameter logistic (3-PL) model items and typically "like" items in the previous CAT-ASVAB tests have…

  3. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    May R. D.


    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.

  4. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    Cahill R. T.


    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g-anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.

  5. Modeling of plucking piezoelectric energy harvesters with contact theory (United States)

    Fu, Xinlei; Liao, Wei-Hsin


    Non-harmonic excitations are widely available in our daily environment and can be used to pluck piezoelectric energy harvesters. Plucking piezoelectric energy harvesting can overcome the frequency gap and achieve a frequency-up effect. However, there has not been a thorough analysis of plucking piezoelectric energy harvesting, especially one providing a good understanding of the plucking mechanism. This paper aims to develop a model to investigate the plucking mechanism and predict the responses of plucking piezoelectric energy harvesters under different kinds of excitations. In the electromechanical model, Hertzian contact theory is applied to account for the interaction between the plectrum and the piezoelectric beam. The plucking mechanism is clarified as a cantilever beam impacted by an infinitely heavy mass, in which the multi-impact process prematurely terminates at the separation time. We numerically predict the plucking force, which depends on the piezoelectric beam, Hertzian contact stiffness, overlap area and plucking velocity. The energy distribution is investigated with a connected resistor.
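    The Hertzian contact interaction invoked above can be sketched with the classical point-contact law F = (4/3) E* sqrt(R) d^(3/2), where E* is the effective modulus and d the overlap (indentation). The material and geometry values below are arbitrary illustrative inputs, not parameters from the paper:

```python
import math

def effective_modulus(E1, nu1, E2, nu2):
    """Effective contact modulus: 1/E* = (1-nu1^2)/E1 + (1-nu2^2)/E2."""
    return 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)

def hertz_force(delta, R, E_star):
    """Hertzian normal contact force F = (4/3) * E* * sqrt(R) * delta^(3/2)."""
    if delta <= 0:  # no interpenetration -> no contact force
        return 0.0
    return (4.0 / 3.0) * E_star * math.sqrt(R) * delta**1.5

# Illustrative numbers: steel-like plectrum tip (R = 1 mm) on a PZT-like beam
E_star = effective_modulus(200e9, 0.30, 60e9, 0.31)
print(hertz_force(1e-6, 1e-3, E_star))  # contact force (N) at 1 um overlap
```

    In a plucking model this force term couples the plectrum kinematics to the beam's modal equations during the brief contact phase.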

  6. Nature, theory and modelling of geophysical convective planetary boundary layers (United States)

    Zilitinkevich, Sergej


    Geophysical convective planetary boundary layers (CPBLs) are still poorly reproduced in oceanographic, hydrological and meteorological models. Besides the mean flow and the usual shear-generated turbulence, CPBLs involve two types of motion disregarded in conventional theories: 'anarchy turbulence', comprised of buoyancy-driven plumes that merge to form larger plumes instead of breaking down as postulated in conventional theory (Zilitinkevich, 1973), and large-scale organised structures fed by the potential energy of unstable stratification through inverse energy transfer in convective turbulence (performing non-local transports irrespective of the mean gradients of the transported properties). CPBLs are strongly mixed and go on growing as long as the boundary layer remains unstable. Penetration of the mixed layer into the weakly turbulent, stably stratified free flow causes turbulent transports through the CPBL outer boundary. The proposed theory, taking into account the above-listed features of CPBLs, is based on the following recent developments: a prognostic CPBL-depth equation in combination with a diagnostic algorithm for turbulence fluxes at the CPBL inner and outer boundaries (Zilitinkevich, 1991, 2012, 2013; Zilitinkevich et al., 2006, 2012), and a deterministic model of self-organised convective structures combined with a statistical turbulence-closure model of turbulence in the CPBL core (Zilitinkevich, 2013). It is demonstrated that the overall vertical transports are performed mostly by turbulence in the surface layer and entrainment layer (at the CPBL inner and outer boundaries) and mostly by organised structures in the CPBL core (Hellsten and Zilitinkevich, 2013). The principal difference between structural and turbulent mixing plays an important role in a number of practical problems: transport and dispersion of admixtures, microphysics of fogs and clouds, etc. The surface-layer turbulence in atmospheric and marine CPBLs is strongly enhanced by the velocity shears in…

  7. A Proof Theory for Model Checking: An Extended Abstract

    Directory of Open Access Journals (Sweden)

    Quentin Heath


    Full Text Available While model checking has often been considered as a practical alternative to building formal proofs, we argue here that the theory of sequent calculus proofs can be used to provide an appealing foundation for model checking. Since the emphasis of model checking is on establishing the truth of a property in a model, we rely on the proof theoretic notion of additive inference rules, since such rules allow provability to directly describe truth conditions. Unfortunately, the additive treatment of quantifiers requires inference rules to have infinite sets of premises and the additive treatment of model descriptions provides no natural notion of state exploration. By employing a focused proof system, it is possible to construct large scale, synthetic rules that also qualify as additive but contain elements of multiplicative inference. These additive synthetic rules—essentially rules built from the description of a model—allow a direct treatment of state exploration. This proof theoretic framework provides a natural treatment of reachability and non-reachability problems, as well as tabled deduction, bisimulation, and winning strategies.

  8. A Systems Model of Parkinson's Disease Using Biochemical Systems Theory. (United States)

    Sasidharakurup, Hemalatha; Melethadathil, Nidheesh; Nair, Bipin; Diwakar, Shyam


    Parkinson's disease (PD), a neurodegenerative disorder, affects millions of people and has gained attention because of its clinical roles affecting behaviors related to motor and nonmotor symptoms. Although studies on PD from various aspects are becoming popular, few rely on predictive systems modeling approaches. Using Biochemical Systems Theory (BST), this article attempts to model and characterize dopaminergic cell death and to understand the pathophysiology of PD progression. PD pathways were modeled using stochastic differential equations incorporating the law of mass action, and initial concentrations for the modeled proteins were obtained from the literature. Simulations suggest that dopamine levels were reduced significantly due to an increase in dopaminergic quinones and 3,4-dihydroxyphenylacetaldehyde (DOPAL), reflecting imbalances relative to control during PD progression. Relating to clinically observed PD-related cell death, simulations show abnormal parkin and reactive oxygen species levels with an increase in neurofibrillary tangles. While relating molecular mechanistic roles, the BST modeling helps predict the dopaminergic cell death processes involved in the progression of PD and provides a predictive understanding of neuronal dysfunction for translational neuroscience.

  9. Theory and modeling of cylindrical thermo-acoustic transduction

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Lihong, E-mail: [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China); Lim, C.W. [Department of Architecture and Civil Engineering, City University of Hong Kong, Kowloon, Hong Kong SAR (China); Zhao, Xiushao; Geng, Daxing [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China)


    Models for both solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed and the corresponding acoustic pressure solutions are obtained. The acoustic pressure for an individual carbon nanotube (CNT) as a function of input power is investigated analytically and verified by comparison with published experimental data. Further numerical analyses of the acoustic pressure response and characteristics for varying input frequency and distance are also carried out for both solid and thinfilm-solid cylindrical thermo-acoustic transductions. Through detailed theoretical and numerical studies of the acoustic pressure solution for thinfilm-solid cylindrical transduction, it is concluded that a solid with smaller thermal conductivity favors improved acoustic performance. In general, the proposed models are applicable to a variety of cylindrical thermo-acoustic devices operating in different gaseous media. - Highlights: • Theory and modeling for both solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed. • The modeling is verified by comparison with published experimental data. • Acoustic response characteristics of cylindrical thermo-acoustic transductions are predicted by the proposed model.

  10. A mixed-binomial model for Likert-type personality measures

    Directory of Open Access Journals (Sweden)

    Jüri Allik


    Full Text Available Personality measurement is based on the idea that values on an unobservable latent variable determine the distribution of answers on a manifest response scale. Typically, it is assumed in the Item Response Theory (IRT that latent variables are related to the observed responses through continuous normal or logistic functions, determining the probability with which one of the ordered response alternatives on a Likert-scale item is chosen. Based on an analysis of 1,731 self- and other-rated responses on the 240 NEO PI-3 questionnaire items, it was proposed that a viable alternative is a finite number of latent events which are related to manifest responses through a binomial function which has only one parameter – the probability with which a given statement is approved. For the majority of items, the best fit was obtained with a mixed-binomial distribution, which assumes two different subpopulations who endorse items with two different probabilities. It was shown that the fit of the binomial IRT model can be improved by assuming that about 10% of random noise is contained in the answers and by taking into account response biases towards one of the response categories. It was concluded that the binomial response model for the measurement of personality traits may be a workable alternative to the more habitual normal and logistic IRT models.
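    The mixed-binomial response model described above can be sketched as a two-component mixture of binomial distributions over the ordered response categories. The mixing weight and the two endorsement probabilities below are invented for illustration, not estimates from the NEO PI-3 data:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of k approvals out of n latent events."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def mixed_binom_pmf(k, n, p1, p2, w):
    """Two-component mixture: one subpopulation endorses with probability p1
    (mixing weight w), the other with probability p2 (weight 1 - w)."""
    return w * binom_pmf(k, n, p1) + (1 - w) * binom_pmf(k, n, p2)

# A 5-point Likert item treated as n = 4 latent endorsement events
dist = [mixed_binom_pmf(k, 4, 0.2, 0.7, 0.6) for k in range(5)]
print([round(p, 3) for p in dist])
```

    Fitting such a model to item data amounts to estimating (p1, p2, w) per item, e.g. by maximum likelihood or an EM-style algorithm.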

  11. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal


    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  12. Lorentz Violation of the Photon Sector in Field Theory Models

    Directory of Open Access Journals (Sweden)

    Lingli Zhou


    Full Text Available We compare the Lorentz violation terms of the pure photon sector between two field theory models, namely, the minimal standard model extension (SME) and the standard model supplement (SMS). From the requirement of the identity of the intersection for the two models, we find that the free photon sector of the SMS can be a subset of the photon sector of the minimal SME. We not only obtain some relations between the SME parameters but also get some constraints on the SMS parameters from the SME parameters. The CPT-odd coefficients (k_AF)^α of the SME are predicted to be zero. There are 15 degrees of freedom in the Lorentz violation matrix Δ^{αβ} of free photons of the SMS, related to the same number of degrees of freedom in the tensor coefficients (k_F)^{αβμν}, which are independent of each other in the minimal SME but are interrelated in the intersection of the SMS and the minimal SME. With the related degrees of freedom, we obtain conservative constraints (2σ) on the elements of the photon Lorentz violation matrix. The detailed structure of the photon Lorentz violation matrix suggests some applications to Lorentz violation experiments for photons.

  13. Towards viable cosmological models of disformal theories of gravity (United States)

    Sakstein, Jeremy


    The late-time cosmological dynamics of disformal gravity are investigated using dynamical systems methods. It is shown that in the general case there are no stable attractors that screen fifth forces locally and simultaneously describe a dark energy dominated universe. Viable scenarios have late-time properties that are independent of the disformal parameters and are identical to those of the equivalent conformal quintessence model. Our analysis reveals that configurations where the Jordan frame metric becomes singular are only reached in the infinite future, thus explaining the natural resistance to this pathology observed numerically in several previous works. The viability of models where this can happen is discussed in terms of both the cosmological dynamics and local phenomena. We identify a special parameter tuning such that there is a new fixed point that can match the presently observed dark energy density and equation of state. This model is unviable when the scalar couples to the visible sector but may provide a good candidate model for theories where only dark matter is disformally coupled.


  14. Contemporary Economic Growth Models and Theories: A Literature Review

    Directory of Open Access Journals (Sweden)

    Ilkhom SHARIPOV


    Full Text Available One of the most important aspects of human development is the ability to have a decent standard of living. The secret of the "economic miracle" of many countries with a high standard of living is, in fact, simple and quite obvious. All these countries are characterized by high and sustained growth of the national economy, a low unemployment rate, and growth of income and consumption. There is no doubt that economic growth leads to an increase in the wealth of the country as a whole, extending its potential in the fight against poverty, unemployment and other social problems. That is why a high rate of economic growth is one of the main targets of economic policy in many countries around the world. This brief literature review discusses the main existing theories and models of economic growth, including their endogenous and exogenous aspects. The main purpose of this paper is to determine the current state of development of economic growth theories and their likely future directions.

  15. Further Simplification of the Simple Erosion Narrowing Score With Item Response Theory Methodology. (United States)

    Oude Voshaar, Martijn A H; Schenk, Olga; Ten Klooster, Peter M; Vonkeman, Harald E; Bernelot Moens, Hein J; Boers, Maarten; van de Laar, Mart A F J


    To further simplify the simple erosion narrowing score (SENS) by removing scored areas that contribute the least to its measurement precision according to analysis based on item response theory (IRT) and to compare the measurement performance of the simplified version to the original. Baseline and 18-month data of the Combinatietherapie Bij Reumatoide Artritis (COBRA) trial were modeled using longitudinal IRT methodology. Measurement precision was evaluated across different levels of structural damage. SENS was further simplified by omitting the least reliably scored areas. Discriminant validity of SENS and its simplification were studied by comparing their ability to differentiate between the COBRA and sulfasalazine arms. Responsiveness was studied by comparing standardized change scores between versions. SENS data showed good fit to the IRT model. Carpal and feet joints contributed the least statistical information to both erosion and joint space narrowing scores. Omitting the joints of the foot reduced measurement precision for the erosion score in cases with below-average levels of structural damage (relative efficiency compared with the original version ranged 35-59%). Omitting the carpal joints had minimal effect on precision (relative efficiency range 77-88%). Responsiveness of a simplified SENS without carpal joints closely approximated the original version (i.e., all Δ standardized change scores were ≤0.06). Discriminant validity was also similar between versions for both the erosion score (relative efficiency = 97%) and the SENS total score (relative efficiency = 84%). Our results show that the carpal joints may be omitted from the SENS without notable repercussion for its measurement performance. © 2016, American College of Rheumatology.

  16. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection (United States)

    Reason, Robert D.; Kimball, Ezekiel W.


    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  17. Behavioral and Social Sciences Theories and Models: Are They Used in Unintentional Injury Prevention Research? (United States)

    Trifiletti, L. B.; Gielen, A. C.; Sleet, D. A.; Hopkins, K.


    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury…

  18. A brief history of string theory from dual models to M-theory

    CERN Document Server

    Rickles, Dean


    During its forty year lifespan, string theory has always had the power to divide, being called both a 'theory of everything' and a 'theory of nothing'. Critics have even questioned whether it qualifies as a scientific theory at all. This book adopts an objective stance, standing back from the question of the truth or falsity of string theory and instead focusing on how it came to be and how it came to occupy its present position in physics. An unexpectedly rich history is revealed, with deep connections to our most well-established physical theories. Fully self-contained and written in a lively fashion, the book will appeal to a wide variety of readers from novice to specialist.

  19. A study of the dimensionality and measurement precision of the SCL-90-R using item response theory. (United States)

    Paap, Muirne C S; Meijer, Rob R; Van Bebber, Jan; Pedersen, Geir; Karterud, Sigmund; Hellem, Frøydis M; Haraldsen, Ira R


    We used item response theory (IRT) to (a) investigate the dimensionality of the Symptom Checklist-90-Revised (SCL-90-R) in a severely disturbed patient group, (b) improve the subscales in a meaningful way and (c) investigate the measurement precision of the improved scales. The total sample comprised 3078 patients (72% women, mean age=35±9) admitted to 14 different day hospitals participating in the Norwegian Network of Personality-focused Treatment Programmes. Mokken Scale Analysis was used to investigate the dimensionality of the SCL-90-R and improve the subscales. This analysis was theory-driven: the scales were built on two start items that reflected the content of the disorder that corresponds with the specific scale. The Graded Response Model was employed to determine measurement precision. Our theory-driven IRT approach resulted in a new seven-factor solution including 60 of the 90 items clustered in seven scales: depression, agoraphobia, physical complaints, obsessive-compulsive, hostility (unchanged), distrust and psychoticism. Most of the new scales discriminated reliably between patients with moderately low scores to moderately high scores. In conclusion, we found support for the multidimensionality of the SCL-90-R in a large sample of severely disturbed patients. Copyright © 2011 John Wiley & Sons, Ltd.
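    The Graded Response Model used here for measurement precision can be sketched as follows: the probability of responding in category k or higher is a logistic function of the latent trait, and category probabilities are differences of adjacent boundary curves. The discrimination and threshold values below are made up for illustration, not SCL-90-R estimates:

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model.

    Boundary curves P(X >= k) = 1 / (1 + exp(-a * (theta - b_k)));
    category probabilities are differences of adjacent boundaries.
    `thresholds` must be strictly increasing.
    """
    boundary = [1.0] + [1.0 / (1.0 + math.exp(-a * (theta - b)))
                        for b in thresholds] + [0.0]
    return [boundary[k] - boundary[k + 1] for k in range(len(thresholds) + 1)]

# Illustrative item: discrimination a = 1.5, four thresholds -> 5 categories
probs = grm_category_probs(theta=0.0, a=1.5, thresholds=[-2.0, -0.5, 0.5, 2.0])
print([round(p, 3) for p in probs])
```

    Item information, and hence measurement precision across trait levels, follows from these category probabilities, which is how scales can be compared for reliability at different severity levels.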

  20. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector Auto-Regressive (CVAR) model. By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its … It is also demonstrated how other controversial hypotheses, such as Rational Expectations, can be formulated directly as restrictions on the CVAR parameters, and a simple example of a "Neoclassical synthetic" AS-AD model is formulated. Finally, the partial-/general-equilibrium distinction is related to the CVAR as well. Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers…

  1. Generalized Potts-Models and their Relevance for Gauge Theories

    Directory of Open Access Journals (Sweden)

    Andreas Wipf


    Full Text Available We study the Polyakov loop dynamics originating from finite-temperature Yang-Mills theory. The effective actions contain center-symmetric terms involving powers of the Polyakov loop, each with its own coupling. For a subclass with two couplings we perform a detailed analysis of the statistical mechanics involved. To this end we employ a modified mean field approximation and Monte Carlo simulations based on a novel cluster algorithm. We find excellent agreement of both approaches. The phase diagram exhibits both first and second order transitions between symmetric, ferromagnetic and antiferromagnetic phases with phase boundaries merging at three tricritical points. The critical exponents ν and γ at the continuous transition between symmetric and antiferromagnetic phases are the same as for the 3-state spin Potts model.
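    As a toy illustration of the statistical mechanics involved, the ferromagnetic 3-state Potts model can be sampled with a plain Metropolis update. This is a minimal sketch only: the paper's effective actions involve Polyakov-loop couplings, and its numerics use a modified mean-field approximation and a novel cluster algorithm, not this nearest-neighbour single-site update:

```python
import math
import random

def potts_metropolis(L=8, q=3, beta=2.0, sweeps=200, seed=1):
    """Metropolis sampling of the ferromagnetic q-state Potts model
    H = -sum_<ij> delta(s_i, s_j) on an L x L periodic lattice.
    Returns the fraction of sites in the majority state."""
    rng = random.Random(seed)
    spins = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]

    def local_energy(i, j, s):
        nbrs = (spins[(i - 1) % L][j], spins[(i + 1) % L][j],
                spins[i][(j - 1) % L], spins[i][(j + 1) % L])
        return -sum(1 for n in nbrs if n == s)

    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        new = rng.randrange(q)
        dE = local_energy(i, j, new) - local_energy(i, j, spins[i][j])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = new

    counts = [sum(row.count(s) for row in spins) for s in range(q)]
    return max(counts) / (L * L)

# Deep in the ordered (ferromagnetic) phase the majority-state fraction
# tends well above the disordered value of about 1/q
print(potts_metropolis(beta=2.0))
```

    Single-site Metropolis suffers from critical slowing down near the transition, which is exactly why cluster algorithms of the kind the paper develops are preferred there.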

  2. Basic first-order model theory in Mizar

    Directory of Open Access Journals (Sweden)

    Marco Bright Caminati


    Full Text Available The author has submitted to the Mizar Mathematical Library a series of five articles introducing a framework for the formalization of classical first-order model theory. In them, Gödel's completeness and Löwenheim-Skolem theorems have also been formalized for the countable case, to offer a first application of the framework and to showcase its utility. This is an overview and commentary on some key aspects of this setup. It features exposition and discussion of a new encoding of basic definitions and theoretical gears needed for the task, remarks about the design strategies and approaches adopted in their implementation, and more general reflections about proof checking induced by the work done.

  3. Modeling of tethered satellite formations using graph theory

    DEFF Research Database (Denmark)

    Larsen, Martin Birkelund; Smith, Roy S; Blanke, Mogens


    Tethered satellite formations have recently gained increasing attention due to future mission proposals. Several different formations have been investigated for their dynamic properties, and control schemes have been suggested. Formulating the equations of motion and investigating which geometries could form stable formations in space are cumbersome when done on a case-by-case basis, and a common framework providing a basic model of the dynamics of tethered satellite formations can therefore be advantageous. This paper suggests the use of graph-theoretical quantities to describe a tethered satellite formation and proposes a method to deduce the equations of motion for the attitude dynamics of the formation in a compact form. The use of graph theory and Lagrange mechanics together allows a broad class of formations to be described using the same framework. A method is stated for finding...

  4. Partial differential equations in action from modelling to theory

    CERN Document Server

    Salsa, Sandro


    The book is intended as an advanced undergraduate or first-year graduate course for students from various disciplines, including applied mathematics, physics and engineering. It has evolved from courses offered on partial differential equations (PDEs) over the last several years at the Politecnico di Milano. These courses had a twofold purpose: on the one hand, to teach students to appreciate the interplay between theory and modeling in problems arising in the applied sciences, and on the other to provide them with a solid theoretical background in numerical methods, such as finite elements. Accordingly, this textbook is divided into two parts. The first part, chapters 2 to 5, is more elementary in nature and focuses on developing and studying basic problems from the macro-areas of diffusion, propagation and transport, waves and vibrations. In turn the second part, chapters 6 to 11, concentrates on the development of Hilbert spaces methods for the variational formulation and the analysis of (mainly) linear bo...

  6. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C


    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  7. Multimodal Transport System Coevolution Model Based on Synergetic Theory

    Directory of Open Access Journals (Sweden)

    Fenling Feng


    Full Text Available This study investigates the evolution law of a multimodal transport system in light of synergetic theory. Compared with previous studies, this paper focuses on understanding the influencing factors of collaborative system development. In particular, we apply a multimodal system order parameter model to obtain the order parameters. Based on the order parameters, the coevolution equations of the multimodal transport system are constructed with consideration of the cooperative and competitive relationships between the subsystems. We find that the multimodal system follows the coevolution law of the freight system and is dominated by the combined effects of the order parameters line length and freight density. The results show that the coordination effects between the railway, road, and water subsystems are stronger than those with the aviation subsystem; the railway system is the short plank of the system. Some functional implications from this study are also discussed. Finally, the results indicate that expansion of railway system capacity and mutual cooperation within the subsystems are required to reach an optimal multimodal transport system.

  8. Theories and models on the biology of cells in space (United States)

    Todd, P.; Klaus, D. M.


    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/s^2 and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  9. Probing flame chemistry with MBMS, theory, and modeling

    Energy Technology Data Exchange (ETDEWEB)

    Westmoreland, P.R. [Univ. of Massachusetts, Amherst (United States)


    The objective is to establish kinetics of combustion and molecular-weight growth in C{sub 3} hydrocarbon flames as part of an ongoing study of flame chemistry. Specific reactions being studied are (1) the growth reactions of C{sub 3}H{sub 5} and C{sub 3}H{sub 3} with themselves and with unsaturated hydrocarbons and (2) the oxidation reactions of O and OH with C{sub 3} species. This approach combines molecular-beam mass spectrometry (MBMS) experiments on low-pressure flat flames; theoretical predictions of rate constants by thermochemical kinetics, Bimolecular Quantum-RRK, RRKM, and master-equation theory; and whole-flame modeling using full mechanisms of elementary reactions.
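
    Theoretical rate-constant predictions of the kind mentioned above are commonly reported as modified-Arrhenius fits. The sketch below evaluates that functional form; the pre-exponential factor, temperature exponent, and activation energy are placeholder numbers, not fitted values from this study.

```python
import math

R_GAS = 8.314  # gas constant, J/(mol K)

def k_mod_arrhenius(T, A, n, Ea):
    """Modified Arrhenius form k(T) = A * T**n * exp(-Ea / (R*T)).
    A, n, and Ea here are illustrative placeholders."""
    return A * T ** n * math.exp(-Ea / (R_GAS * T))

# Hypothetical parameters for an illustrative radical growth step
k_lo = k_mod_arrhenius(1000.0, A=1.0e13, n=0.0, Ea=50_000.0)
k_hi = k_mod_arrhenius(2000.0, A=1.0e13, n=0.0, Ea=50_000.0)
print(k_lo, k_hi)  # an activated reaction speeds up with temperature
```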

  10. Dynamic density functional theory of solid tumor growth: Preliminary models

    Directory of Open Access Journals (Sweden)

    Arnaud Chauviere


    Cancer is a disease that can be seen as a complex system whose dynamics and growth result from nonlinear processes coupled across wide ranges of spatio-temporal scales. The current mathematical modeling literature addresses issues at various scales, but the development of theoretical methodologies capable of bridging gaps across scales needs further study. We present a new theoretical framework based on Dynamic Density Functional Theory (DDFT), extended for the first time to the dynamics of living tissues by accounting for cell density correlations, different cell types, phenotypes, and cell birth/death processes, in order to provide a biophysically consistent description of processes across the scales. We present an application of this approach to tumor growth.
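
    As a point of orientation only, the sketch below evolves a 1-D cell density with diffusion plus logistic birth/death. This is a drastic caricature, not the paper's DDFT formalism (no density correlations, no multiple cell types); the grid and the constants D, r, K are arbitrary illustration values.

```python
import numpy as np

# Simplified 1-D density evolution: diffusive spreading plus logistic
# birth/death toward carrying capacity K. All parameters are invented.
nx, dx, dt = 100, 1.0, 0.1
D, r, K = 1.0, 0.2, 1.0

rho = np.zeros(nx)
rho[45:55] = 0.1               # small initial tumour seed

for _ in range(500):           # explicit Euler in time, periodic boundaries
    lap = (np.roll(rho, 1) - 2 * rho + np.roll(rho, -1)) / dx ** 2
    rho += dt * (D * lap + r * rho * (1 - rho / K))

print(rho.max(), rho.sum())    # the core saturates toward K as total mass grows
```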

  11. A Novel One-Dimensional Electronic State at IrTe2 Surface (United States)

    Ootsuki, Daiki; Ishii, Hiroyuki; Kudo, Kazutaka; Nohara, Minoru; Takahashi, Masaya; Horio, Masafumi; Fujimori, Atsushi; Yoshida, Teppei; Arita, Masashi; Anzai, Hiroaki; Namatame, Hirofumi; Taniguchi, Masaki; Saini, Naurang L.; Mizokawa, Takashi


    Highly one-dimensional (1D) Fermi sheets are realized at the surface of a layered Ir telluride IrTe2 which exhibits a stripe-type charge and orbital order below ˜280 K. The 1D Fermi sheets appear in the low temperature range where the stripe order is well established. The 1D Fermi sheets are truncated by the bulk Fermi surfaces, and the spectral weight suppression at the Fermi level deviates from the typical Tomonaga-Luttinger behavior. The 1D band runs along the stripe and is accompanied by several branches which can be derived from the quantization in the perpendicular direction.

  12. Development of a dynamic computational model of social cognitive theory

    National Research Council Canada - National Science Library

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C


    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors...

  13. SUSY Breaking in Local String/F-Theory Models

    CERN Document Server

    Blumenhagen, R; Krippendorf, S; Moster, S; Quevedo, F


    We investigate bulk moduli stabilisation and supersymmetry breaking in local string/F-theory models where the Standard Model is supported on a del Pezzo surface or singularity. Computing the gravity mediated soft terms on the Standard Model brane induced by bulk supersymmetry breaking in the LARGE volume scenario, we explicitly find suppressions by M_s/M_P ~ V^{-1/2} compared to M_{3/2}. This gives rise to several phenomenological scenarios, depending on the strength of perturbative corrections to the effective action and the source of de Sitter lifting, in which the soft terms are suppressed by at least M_P/V^{3/2} and may be as small as M_P/V^2. Since the gravitino mass is of order M_{3/2} ~ M_P/V, for TeV soft terms all these scenarios give a very heavy gravitino (M_{3/2} >= 10^8 GeV) and generically the lightest moduli field is also heavy enough (m >= 10 TeV) to avoid the cosmological moduli problem. For TeV soft terms, these scenarios predict a minimal value of the volume to be V ~ 10^{6-7} in string uni...

  14. Theory, Modeling and Simulation: Research progress report 1994--1995

    Energy Technology Data Exchange (ETDEWEB)

    Garrett, B.C.; Dixon, D.A.; Dunning, T.H.


    The Pacific Northwest National Laboratory (PNNL) has established the Environmental Molecular Sciences Laboratory (EMSL). In April 1994, construction began on the new EMSL, a collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TM and S) program will play a critical role in understanding molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and ground water, developing processes for isolation and processing of pollutants, developing improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TM and S program are fivefold: to apply available electronic structure and dynamics techniques to study fundamental molecular processes involved in the chemistry of natural and contaminated systems; to extend current electronic structure and dynamics techniques to treat molecular systems of future importance and to develop new techniques for addressing problems that are computationally intractable at present; to apply available molecular modeling techniques to simulate molecular processes occurring in the multi-species, multi-phase systems characteristic of natural and polluted environments; to extend current molecular modeling techniques to treat ever more complex molecular systems and to improve the reliability and accuracy of such simulations; and to develop technologies for advanced parallel architectural computer systems. Research highlights of 82 projects are given.

  15. Optimization models using fuzzy sets and possibility theory

    CERN Document Server

    Orlovski, S


    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...

  16. Local structural aspects of metal-metal transition in IrTe2 from x-ray PDF (United States)

    Yu, Runze; Abeykoon, Milinda; Zhou, Haidong; Yin, Weiguo; Bozin, Emil S.

    Evolution of the local atomic structure across the metal-metal transition in IrTe2 is explored by pair distribution function (PDF) analysis of x-ray total scattering data over a temperature range down to 80 K. The modeling displays hysteretic behavior across the transition, in agreement with electronic transport measurements, indicative of a strong tie between the lattice and electronic configurations. Bond valence methodology applied to the structural parameters further indicates significant bond charge disproportionation in association with the transition. Work at Brookhaven National Laboratory was supported by US DOE, Office of Science, Office of Basic Energy Sciences (DOE-BES) under Contract No. DE-SC0012704.
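
    The bond valence methodology mentioned above reduces to a simple sum over bond lengths in the Brown-Altermatt form. The sketch below shows the arithmetic only; the bond-valence parameter R0 and the two sets of Ir-Te bond lengths are hypothetical, not refined IrTe2 results.

```python
import math

def bond_valence_sum(bond_lengths, R0, b=0.37):
    """Bond valence sum V = sum_i exp((R0 - R_i) / b) (Brown-Altermatt form).
    R0 is the tabulated bond-valence parameter for the ion pair; the value
    used below is an illustrative placeholder."""
    return sum(math.exp((R0 - R) / b) for R in bond_lengths)

# Hypothetical Ir-Te bond lengths (Angstrom) for two inequivalent Ir sites
site_A = [2.65, 2.65, 2.65, 2.67, 2.67, 2.67]
site_B = [2.60, 2.60, 2.60, 2.72, 2.72, 2.72]
R0_IrTe = 2.53  # placeholder parameter, not a tabulated value

vA = bond_valence_sum(site_A, R0_IrTe)
vB = bond_valence_sum(site_B, R0_IrTe)
print(vA, vB)  # differing sums would signal bond charge disproportionation
```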

  17. Effects of Initial Values and Convergence Criterion in the Two-Parameter Logistic Model When Estimating the Latent Distribution in BILOG-MG 3 (United States)

    Nader, Ingo W.; Tran, Ulrich S.; Voracek, Martin


    Parameters of the two-parameter logistic model are generally estimated via the expectation-maximization algorithm, which improves initial values for all parameters iteratively until convergence is reached. Effects of initial values are rarely discussed in item response theory (IRT), but initial values were recently found to affect item parameters when estimating the latent distribution with full non-parametric maximum likelihood. However, this method is rarely used in practice. Hence, the present study investigated effects of initial values on item parameter bias and on recovery of item characteristic curves in BILOG-MG 3, a widely used IRT software package. Results showed notable effects of initial values on item parameters. For tighter convergence criteria, effects of initial values decreased, but item parameter bias increased, and the recovery of the latent distribution worsened. For practical application, it is advised to use the BILOG default convergence criterion with appropriate initial values when estimating the latent distribution from data. PMID:26452264
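
    The two-parameter logistic (2PL) model the abstract refers to has a simple closed-form item response function. A minimal sketch (the item parameters a and b are made-up examples, not BILOG-MG output):

```python
import math

def p_2pl(theta, a, b):
    """Two-parameter logistic item response function:
    P(correct | theta) = 1 / (1 + exp(-a * (theta - b))),
    where a is item discrimination and b is item difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta equal to the difficulty b the probability is exactly 0.5;
# the discrimination a controls the slope of the curve at that point.
print(p_2pl(0.0, a=1.5, b=0.0))   # exactly 0.5 when theta == b
print(p_2pl(1.0, a=1.5, b=0.0))   # above 0.5 for a more able examinee
```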

  18. Attachment Theory and Theory of Planned Behavior: An Integrative Model Predicting Underage Drinking (United States)

    Lac, Andrew; Crano, William D.; Berger, Dale E.; Alvaro, Eusebio M.


    Research indicates that peer and maternal bonds play important but sometimes contrasting roles in the outcomes of children. Less is known about attachment bonds to these 2 reference groups in young adults. Using a sample of 351 participants (18 to 20 years of age), the research integrated two theoretical traditions: attachment theory and theory of…

  19. Eye growth and myopia development: Unifying theory and Matlab model. (United States)

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal


    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. 
These model simulation programs…
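
    The core idea of simulation (2) above, blur-feedback regulation of eye growth (emmetropization), can be caricatured as a one-line feedback loop. This is not the authors' Matlab/Simulink model; the target length, gains, and step count are all invented for illustration.

```python
# Toy emmetropization loop: axial length grows from a genetically
# pre-programmed baseline rate plus a blur-feedback correction that
# drives the defocus toward zero. All numbers are hypothetical.
target = 24.0          # axial length (mm) at which the image is in focus (assumed)
length = 22.0          # initial (hyperopic) axial length
genetic_rate = 0.001   # pre-programmed growth per step
feedback_gain = 0.05   # blur-driven correction per step

for _ in range(500):
    blur = target - length                     # signed defocus proxy
    length += genetic_rate + feedback_gain * blur

print(round(length, 3))  # settles just past the focal target
```

    Setting feedback_gain to zero mimics form deprivation (simulation 1): growth then continues at the genetic rate regardless of defocus, producing myopia.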

  20. Theory and Low-Order Modeling of Unsteady Airfoil Flows (United States)

    Ramesh, Kiran

    Unsteady flow phenomena are prevalent in a wide range of problems in nature and engineering. These include, but are not limited to, aerodynamics of insect flight, dynamic stall in rotorcraft and wind turbines, leading-edge vortices in delta wings, micro-air vehicle (MAV) design, gust handling and flow control. The most significant characteristics of unsteady flows are rapid changes in the circulation of the airfoil, apparent-mass effects, flow separation and the leading-edge vortex (LEV) phenomenon. Although experimental techniques and computational fluid dynamics (CFD) methods have enabled the detailed study of unsteady flows and their underlying features, a reliable and inexpensive low-order method for fast prediction and for use in control and design is still required. In this research, a low-order methodology based on physical principles rather than empirical fitting is proposed. The objective of such an approach is to enable insights into unsteady phenomena while developing approaches to model them. The basis of the low-order model developed here is unsteady thin-airfoil theory. A time-stepping approach is used to solve for the vorticity on an airfoil camberline, allowing for large amplitudes and nonplanar wakes. On comparing lift coefficients from this method against data from CFD and experiments for some unsteady test cases, it is seen that the method predicts well so long as LEV formation does not occur and flow over the airfoil is attached. The formation of leading-edge vortices (LEVs) in unsteady flows is initiated by flow separation and the formation of a shear layer at the airfoil's leading edge. This phenomenon has been observed to have both detrimental (dynamic stall in helicopters) and beneficial (high-lift flight in insects) effects. To predict the formation of LEVs in unsteady flows, a Leading Edge Suction Parameter (LESP) is proposed. 
This parameter is calculated from inviscid theory and is a measure of the suction at the airfoil's leading edge…
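
    The LESP criterion can be caricatured in a quasi-steady way: monitor a leading-edge suction proxy over time and flag likely LEV formation whenever it exceeds a critical value. This is not the paper's unsteady thin-airfoil formulation; here the instantaneous angle of attack stands in for the suction parameter, and the critical value is an arbitrary illustrative number.

```python
import math

LESP_CRIT = 0.2  # hypothetical critical suction value (airfoil-dependent)

def lev_flags(suction_history, crit=LESP_CRIT):
    """Return the time indices at which the suction proxy exceeds the
    critical value, i.e. where LEV shedding would be initiated."""
    return [i for i, s in enumerate(suction_history) if abs(s) > crit]

# Sinusoidal pitching motion: suction proxy = 0.3 * sin(0.1 * i)
history = [0.3 * math.sin(0.1 * i) for i in range(63)]
print(lev_flags(history)[:3])  # -> [8, 9, 10]: first instants past the threshold
```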

  1. Building dynamic models and theories to advance the science of symptom management research. (United States)

    Brant, Jeannine M; Beck, Susan; Miaskowski, Christine


    This paper is a description, comparison, and critique of two models and two theories used to guide symptom management research, and a proposal of directions for new theory or model development. Symptom management research has undergone a paradigmatic shift to include symptom clusters, longitudinal studies that examine symptom trajectories, and the effects of interventions on patient outcomes. Models and theories are used to guide descriptive and intervention research. Over the past 15 years, four conceptual models or theories (i.e. Theory of Symptom Management, the Theory of Unpleasant Symptoms, the Symptoms Experience Model and the Symptoms Experience in Time Model) were used in a variety of symptom management studies. Literature searches were performed in Medline and the Cumulative Index of Nursing and Allied Health Literature between 1990 and 2008 for models and theories that guide symptom management research. Related papers and book chapters were used as supporting documentation. Comparison and critique of the models and theories revealed important gaps including lack of consideration of symptom clusters, failure to incorporate temporal aspects of the symptom experience and failure to incorporate the impact of interventions on patient outcomes. New models and theories should incorporate current trends in symptom management research, capture the dynamic nature of symptoms and incorporate concepts that will facilitate transdisciplinary research in symptom management. Researchers and clinicians need to build more expansive and dynamic symptom management models and theories that parallel advances in symptom research and practice.

  2. Theory development for HIV behavioral health: empirical validation of behavior health models specific to HIV risk. (United States)

    Traube, Dorian E; Holloway, Ian W; Smith, Lana


    In the presence of numerous health behavior theories, it is difficult to determine which of the many theories is most precise in explaining health-related behavior. New models continue to be introduced to the field, despite already existing disparity, overlap, and lack of unification among health promotion theories. This paper will provide an overview of current arguments and frameworks for testing and developing a comprehensive set of health behavior theories. In addition, the authors make a unique contribution to the HIV health behavior theory literature by moving beyond current health behavior theory critiques to argue that one of the field's preexisting, but less popular theories, Social Action Theory (SAT), offers a pragmatic and broad framework to address many of the accuracy issues within HIV health behavior theory. The authors conclude this article by offering a comprehensive plan for validating model accuracy, variable influence, and behavioral applicability of SAT.

  3. Integrating Beck's cognitive model and the response style theory in an adolescent sample. (United States)

    Winkeljohn Black, Stephanie; Pössel, Patrick


    Depression becomes more prevalent as individuals progress from childhood to adulthood. Thus, empirically supported and popular cognitive vulnerability theories to explain depression in adulthood have begun to be tested in younger age groups, particularly adolescence, a time of significant cognitive development. Beck's cognitive theory and the response style theory are well known, empirically supported theories of depression. The current, two-wave longitudinal study (N = 462; mean age = 16.01 years; SD = 0.69; 63.9% female) tested various proposed integrative models of Beck's cognitive theory and the response style theory, as well as the original theories themselves, to determine if and how these cognitive vulnerabilities begin to intertwine in adolescence. Of the integrative models tested-all with structural equation modeling in AMOS 21-the best-fitting integrative model was a moderation model wherein schemata influenced rumination, and rumination then influenced other cognitive variables in Beck's model. Findings revealed that this integrated model fit the data better than the response style theory and explained 1.2% more variance in depressive symptoms. Additionally, multigroup analyses comparing the fit of the best-fitting integrated model across adolescents with clinical and subclinical depressive symptoms revealed that the model was not stable between these two subsamples. However, of the hypotheses relevant to the integrative model, only 1 of the 18 associations was significantly different between the clinical and subclinical samples. Regardless, the integrated model was not superior to the more parsimonious model from Beck's cognitive theory. Implications and limitations are discussed.

  4. A practitioner's guide to persuasion: an overview of 15 selected persuasion theories, models and frameworks. (United States)

    Cameron, Kenzie A


    To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.

  5. Making sense of theory of mind and paranoia: the psychometric properties and reasoning requirements of a false belief sequencing task. (United States)

    Corcoran, Rhiannon; Bentall, Richard P; Rowse, Georgina; Moore, Rosanne; Cummins, Sinead; Blackwood, Nigel; Howard, Robert; Shryane, Nick M


    INTRODUCTION. This study used Item-Response Theory (IRT) to model the psychometric properties of a false belief picture sequencing task. Consistent with the mental time travel hypothesis of paranoia, we anticipated that performance on this deductive theory of mind (ToM) task would not be associated with the presence of persecutory delusions but would be related to other clinical, cognitive, and demographic factors. METHOD. A large (N=237) and diverse clinical and nonclinical sample differing in levels of depression and paranoid ideation performed 2 ToM tasks: the false belief sequencing task and a ToM stories task that was used to assess the validity of the false belief sequencing task as a measure of ToM. RESULTS. A unidimensional IRT model was found to fit the data well. Latent ToM ability as measured by the false belief sequencing task was negatively related with age and positively with IQ. In contrast to the ToM stories measure, there was no association between clinical diagnosis or symptoms and false belief picture sequencing after controlling for age and IQ. CONCLUSIONS. In line with mental time travel hypothesis of paranoia (Corcoran, 2010 ), performance on this deductive nonverbal ToM task is not related to the presence of paranoid symptoms. This measure is best suited for assessing ToM functioning where participants' performance falls just short of the average latent ToM ability. Furthermore, it is sensitive to the effects of increasing age and decreasing IQ.

  6. Cognitive performance modeling based on general systems performance theory. (United States)

    Kondraske, George V


    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  7. The physical theory and propagation model of THz atmospheric propagation (United States)

    Wang, R.; Yao, J. Q.; Xu, D. G.; Wang, J. L.; Wang, P.


    Terahertz (THz) radiation is extensively applied in diverse fields, such as space communication, Earth environment observation, atmospheric science, remote sensing, and so on. Research on the propagation features of THz waves in the atmosphere is therefore becoming increasingly important. This paper first outlines the advantages and outlook of THz in space technology. It then introduces the theoretical framework of THz atmospheric propagation, including some fundamental physical concepts and processes. The attenuation effect (especially absorption by water vapor), scattering by aerosol particles, and the effect of turbulence mainly influence THz atmospheric propagation. Fundamental physical laws are presented as well, such as the Lambert-Beer law, Mie scattering theory, and the radiative transfer equation. The last part comprises the demonstration and comparison of THz atmospheric propagation models such as Moliere(V5), SARTre, and AMATERASU. The essential problems are the deep analysis of the physical mechanism of this process, the construction of atmospheric propagation models and databases of every kind of material in the atmosphere, and the standardization of measurement procedures.
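
    The Lambert-Beer attenuation law mentioned above is easy to sketch numerically. The attenuation coefficient below is a made-up placeholder, not a measured THz value:

```python
import math

def transmitted_intensity(I0, alpha, L):
    """Lambert-Beer law: I = I0 * exp(-alpha * L), with alpha the
    attenuation coefficient (here 1/km) and L the path length (km).
    The coefficient used below is illustrative only."""
    return I0 * math.exp(-alpha * L)

# With a hypothetical alpha = 0.5 /km (water vapour dominates THz
# attenuation), intensity over a 2 km path drops to exp(-1) ~ 37%.
print(transmitted_intensity(1.0, 0.5, 2.0))
```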

  8. Inspection of radiant heating floor applying non-destructive testing techniques: GPR and IRT

    Directory of Open Access Journals (Sweden)

    Susana Lagüela-López


    The inspection of radiant heating floors requires the use of non-destructive techniques, seeking to minimize the impact of the inspection as well as its time and cost, while maximizing the information acquired so as to reach the best possible diagnosis. With this objective, the application of infrared thermography (IRT) and ground-penetrating radar (GPR) is proposed for the inspection of radiant floors covered with different materials, in order to evaluate the capabilities of, and the information obtainable with, each technique. The results show that each technique provides a different type of information: the state of the pipes (IRT) versus their geometry and configuration (GPR); it is concluded that the optimal inspection combines both techniques.

  9. Measuring Student Involvement: A Comparison of Classical Test Theory and Item Response Theory in the Construction of Scales from Student Surveys (United States)

    Sharkness, Jessica; DeAngelo, Linda


    This study compares the psychometric utility of Classical Test Theory (CTT) and Item Response Theory (IRT) for scale construction with data from higher education student surveys. Using 2008 Your First College Year (YFCY) survey data from the Cooperative Institutional Research Program at the Higher Education Research Institute at UCLA, two scales…
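
    On the CTT side of the comparison above, scale reliability is typically summarized with Cronbach's alpha. A minimal sketch on a toy response matrix (the survey data below are invented, not YFCY responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_persons, n_items) response matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy 5-person, 4-item Likert-style data (hypothetical)
X = [[4, 5, 4, 5],
     [2, 2, 3, 2],
     [3, 3, 3, 4],
     [5, 5, 4, 5],
     [1, 2, 2, 1]]
print(round(cronbach_alpha(X), 3))  # high internal consistency for this toy set
```

    Unlike IRT, alpha is a single sample-dependent number; IRT instead yields item- and trait-level information functions, which is one motivation for the comparison in the study.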

  10. A theory and a computational model of spatial reasoning with preferred mental models. (United States)

    Ragni, Marco; Knauff, Markus


    Inferences about spatial arrangements and relations like "The Porsche is parked to the left of the Dodge and the Ferrari is parked to the right of the Dodge, thus, the Porsche is parked to the left of the Ferrari," are ubiquitous. However, spatial descriptions are often interpretable in many different ways and compatible with several alternative mental models. This article suggests that individuals tackle such indeterminate multiple-model problems by constructing a single, simple, and typical mental model but neglect other possible models. The model that first comes to reasoners' minds is the preferred mental model. It helps save cognitive resources but also leads to reasoning errors and illusory inferences. The article presents a preferred model theory and an instantiation of this theory in the form of a computational model, preferred inferences in reasoning with spatial mental models (PRISM). PRISM can be used to simulate and explain how preferred models are constructed, inspected, and varied in a spatial array that functions as if it were a spatial working memory. A spatial focus inserts tokens into the array, inspects the array to find new spatial relations, and relocates tokens in the array to generate alternative models of the problem description, if necessary. The article also introduces a general measure of difficulty based on the number of necessary focus operations (rather than the number of models). A comparison with results from psychological experiments shows that the theory can explain preferences, errors, and the difficulty of spatial reasoning problems. PsycINFO Database Record (c) 2013 APA, all rights reserved.
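
    The construct-and-inspect cycle described above can be sketched with a tiny token array. This is a simplification inspired by PRISM, not the published model: the placement strategy (append new right-hand tokens at the end of the array) is one simple choice among several.

```python
def build_preferred_model(premises):
    """Build a single 'preferred' spatial model from premises of the
    form (a, b), each read as 'a is to the left of b'."""
    model = []
    for a, b in premises:
        if a in model and b in model:
            assert model.index(a) < model.index(b), "inconsistent premises"
        elif a in model:
            model.append(b)                    # place b at the right end
        elif b in model:
            model.insert(model.index(b), a)    # place a just left of b
        else:
            model.extend([a, b])
    return model

def left_of(model, x, y):
    """Inspect the array: is x left of y in the preferred model?"""
    return model.index(x) < model.index(y)

# "Porsche left of Dodge; Dodge left of Ferrari" -> infer Porsche left of Ferrari
m = build_preferred_model([("Porsche", "Dodge"), ("Dodge", "Ferrari")])
print(m)                                  # ['Porsche', 'Dodge', 'Ferrari']
print(left_of(m, "Porsche", "Ferrari"))   # True
```

    Indeterminate descriptions admit several arrays; reasoning errors arise when only this single preferred array is inspected and the alternatives are neglected.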

  11. Cold and hot cognition: quantum probability theory and realistic psychological modeling. (United States)

    Corr, Philip J


    Typically, human decision making is emotionally "hot" and does not conform to "cold" classical probability (CP) theory. As quantum probability (QP) theory emphasises order, context, superposition states, and nonlinear dynamic effects, one of its major strengths may be its power to unify formal modeling and realistic psychological theory (e.g., information uncertainty, anxiety, and indecision, as seen in the Prisoner's Dilemma).
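
    The order effects QP theory emphasises can be demonstrated with two non-commuting projectors in a two-dimensional state space. The state and projection angles below are arbitrary illustration choices, not fitted to any experiment:

```python
import numpy as np

def projector(angle):
    """Rank-1 projector onto the direction (cos angle, sin angle)."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])      # initial belief state (assumed)
PA = projector(np.pi / 6)       # "question A" answer subspace
PB = projector(np.pi / 3)       # "question B" answer subspace

# Probability of answering yes to A then yes to B, and in reverse order.
p_ab = np.linalg.norm(PB @ PA @ psi) ** 2
p_ba = np.linalg.norm(PA @ PB @ psi) ** 2
print(p_ab, p_ba)  # unequal: question order changes the joint probability
```

    In CP theory the analogous joint probabilities are order-invariant, which is why such order effects are taken as a signature of QP-style cognition.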

  12. Validating the 28-tender joint count using item response theory

    NARCIS (Netherlands)

    Tjin-Kam-Jet-Siemons, Liseth; ten Klooster, Peter M.; Taal, Erik; Kuper, I.H.; van Riel, Piet L.C.M.; van de Laar, Mart A F J; Glas, Cornelis A.W.


    Objective: To examine the construct validity of the 28-tender joint count (TJC-28) using item response theory (IRT)-based methods. Methods: A total of 457 patients with early stage rheumatoid arthritis (RA) were included. Internal construct validity of the TJC-28 was evaluated by determining whether

  13. Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients (United States)

    Andersson, Björn; Xin, Tao


    In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…
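
    One common point estimator of the reliability discussed above is the marginal (empirical) reliability computed from ability estimates and their standard errors. The sketch below uses simulated values, not real test data, and shows only the point estimate whose standard error the study concerns:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_hat = rng.normal(0.0, 1.0, size=1000)  # simulated ability estimates
se = np.full(1000, 0.4)                      # flat standard errors (assumed)

# Marginal reliability: share of observed-score variance not due to
# estimation error, rel = (var(theta_hat) - mean(SE^2)) / var(theta_hat).
var_hat = theta_hat.var(ddof=1)
rel = (var_hat - np.mean(se ** 2)) / var_hat
print(round(rel, 3))  # close to 1 - 0.16/1.0 = 0.84 for these inputs
```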

  14. Bad Questions: An Essay Involving Item Response Theory (United States)

    Thissen, David


    David Thissen, a professor in the Department of Psychology and Neuroscience, Quantitative Program at the University of North Carolina, has consulted and served on technical advisory committees for assessment programs that use item response theory (IRT) over the past couple decades. He has come to the conclusion that there are usually two purposes…

  15. Dynamic chemical process modelling and validation : Theory and application to industrial and literature case study

    NARCIS (Netherlands)

    Schmal, J.P.


    Dynamic chemical process modelling is still largely considered an art. In this thesis the theory of large-scale chemical process modelling and validation is discussed and initial steps to extend the theory are explored. In particular we pay attention to the effect of the level of detail on the model

  16. Density functional theory and dynamical mean-field theory. A way to model strongly correlated systems

    Energy Technology Data Exchange (ETDEWEB)

    Backes, Steffen


    The study of the electronic properties of correlated systems is a very diverse field and has led to valuable insight into the physics of real materials. In these systems, the decisive factor that governs the physical properties is the ratio between the electronic kinetic energy, which promotes delocalization over the lattice, and the Coulomb interaction, which instead favours localized electronic states. Due to this competition, correlated electronic systems can show unique and interesting properties like the Metal-Insulator transition, diverse phase diagrams, strong temperature dependence and in general a high sensitivity to the environmental conditions. A theoretical description of these systems is not an easy task, since perturbative approaches that do not preserve the competition between the kinetic and interaction terms can only be applied in special limiting cases. One of the most famous approaches to obtain the electronic properties of a real material is the ab initio density functional theory (DFT) method. It allows one to obtain the ground state density of the system under investigation by mapping onto an effective non-interacting system that has to be found self-consistently. While being an exact theory, in practical implementations certain approximations have to be made to the exchange-correlation potential. The local density approximation (LDA), which approximates the exchange-correlation contribution to the total energy by that of a homogeneous electron gas with the corresponding density, has proven quite successful in many cases. However, this approximation in general leads to an underestimation of electronic correlations and is not able to describe a metal-insulator transition due to electronic localization in the presence of strong Coulomb interaction. A different approach to the interacting electronic problem is the dynamical mean-field theory (DMFT), which is non-perturbative in the kinetic and interaction term but neglects all non…

  17. Towards theory integration: Threshold model as a link between signal detection theory, fast-and-frugal trees and evidence accumulation theory. (United States)

    Hozo, Iztok; Djulbegovic, Benjamin; Luan, Shenghua; Tsalatsanis, Athanasios; Gigerenzer, Gerd


    Theories of decision making are divided between those aiming to help decision makers in the real, 'large' world and those who study decisions in idealized 'small' world settings. For the most part, these large- and small-world decision theories remain disconnected. We linked the small-world decision theoretic concepts of signal detection theory (SDT) and evidence accumulation theory (EAT) to the threshold model and to the large world of heuristic decision making that relies on fast-and-frugal decision trees (FFT). We connected these large- and small-world theories by demonstrating that seemingly different decision-making concepts are actually equivalent. In doing so, we were able (1) to link the threshold model to EAT and FFT, thereby creating decision criteria that take into account both the classification accuracy of FFT and the consequences built into the threshold model; (2) to demonstrate how threshold criteria can be used as a strategy for optimal selection of cues when constructing FFT; and (3) to show that the compensatory strategy expressed in the threshold model can be linked to a non-compensatory FFT approach to decision making. We also showed how construction and performance of FFT depend on having reliable information - the results were highly sensitive to the estimates of benefits and harms of health interventions. We illustrate the practical usefulness of our analysis by describing an FFT we developed for prescribing statins for primary prevention of cardiovascular disease. By linking SDT and EAT to the compensatory threshold model and to non-compensatory heuristic decision making (FFT), we showed how these two decision strategies are ultimately linked within a broader theoretical framework, thereby responding to calls for integrating decision theory paradigms. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  18. Queuing theory models used for port equipment sizing (United States)

    Dragu, V.; Dinu, O.; Ruscă, A.; Burciu, Ş.; Roman, E. A.


    The significant growth of volumes and distances in road transportation has led to the need for solutions that increase the market share of water transportation, together with the handling and transfer technologies within its terminals. It is widely known that the largest share of time is consumed within transport terminals (loading/unloading/transfer), hence the need to constantly develop handling techniques and technologies in step with the size of the goods flows so that the total waiting time of ships within ports is reduced. Port development should be achieved by harmonizing the contradictory interests of port administration and users: port administrators aim to increase profit, whereas users want savings through an increased consumers' surplus. The difficulty is that the transport demand-supply equilibrium must be realised at costs and goods quantities transiting the port that satisfy the interests of both parties involved. This paper presents a port equipment sizing model using queueing theory, such that the sum of the costs of ships waiting for operations and of equipment usage is minimal. Ship operation within the port is modelled as a queueing (mass service) system whose parameters are then used to determine the main costs for ships and port equipment.
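
    The cost trade-off in this abstract - ship waiting cost versus equipment cost - can be sketched with a standard M/M/c queue and the Erlang C formula; the arrival rate, service rate, and unit costs below are illustrative assumptions, not the paper's data.

```python
import math

def erlang_c(c, a):
    """Erlang C: probability that an arriving ship must wait, offered load a = lam/mu."""
    s = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / math.factorial(c) * (c / (c - a))
    return top / (s + top)

def total_cost(c, lam, mu, wait_cost, berth_cost):
    """Expected hourly cost: waiting cost of queued ships plus cost of c equipment units."""
    a = lam / mu
    if c <= a:                                  # unstable queue: unbounded waiting time
        return math.inf
    wq = erlang_c(c, a) / (c * mu - lam)        # mean waiting time in queue (M/M/c)
    return lam * wq * wait_cost + c * berth_cost

lam, mu = 4.0, 1.0                    # ships arriving per hour, service rate per unit
wait_cost, berth_cost = 500.0, 120.0  # hypothetical unit costs per hour
best = min(range(1, 15), key=lambda c: total_cost(c, lam, mu, wait_cost, berth_cost))
print(best)   # -> 7 units minimize the summed cost for these assumed parameters
```

    Sweeping c and taking the argmin mirrors the sizing logic described: too few units make ships queue expensively, too many leave equipment idle.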

  19. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis. (United States)

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul


    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs about these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data were compared to an expert 'map' of risk perception to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  20. Multiscale modeling of lymphatic drainage from tissues using homogenization theory. (United States)

    Roose, Tiina; Swartz, Melody A


    Lymphatic capillary drainage of interstitial fluid under both steady-state and inflammatory conditions is important for tissue fluid balance, cancer metastasis, and immunity. Lymphatic drainage function is critically coupled to the fluid mechanical properties of the interstitium, yet this coupling is poorly understood. Here we sought to effectively model the lymphatic-interstitial fluid coupling and ask why the lymphatic capillary network often appears with a roughly hexagonal architecture. We use the homogenization method, which allows tissue-scale lymph flow to be integrated with the microstructural details of the lymphatic capillaries, thus gaining insight into the functionality of lymphatic anatomy. We first describe flow in lymphatic capillaries using the Navier-Stokes equations and flow through the interstitium using Darcy's law. We then use multiscale homogenization to derive macroscale equations describing lymphatic drainage, with the mouse tail skin as a basis. We find that the limiting resistance for fluid drainage is that from the interstitium into the capillaries rather than within the capillaries. We also find that, among hexagonal, square, and parallel tube configurations of lymphatic capillary networks, the hexagonal structure is the most efficient architecture for coupled interstitial and capillary fluid transport; that is, it clears the most interstitial fluid for a given network density and baseline interstitial fluid pressure. Thus, using homogenization theory, one can assess how vessel microstructure influences macroscale fluid drainage by the lymphatics and demonstrate why the hexagonal network of dermal lymphatic capillaries is optimal for interstitial tissue fluid clearance. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Sustainable theory of a logistic model - Fisher information approach. (United States)

    Al-Saffar, Avan; Kim, Eun-Jin


    Information theory provides a useful tool to understand the evolution of complex nonlinear systems and their sustainability. In particular, Fisher information has been invoked as a useful measure of sustainability and the variability of dynamical systems, including self-organising systems. By utilising Fisher information, we investigate the sustainability of the logistic model for different perturbations in the positive and/or negative feedback. Specifically, we consider different oscillatory modulations in the parameters for positive and negative feedback and investigate their effect on the evolution of the system and its Probability Density Functions (PDFs). Depending on the relative time scale of the perturbation to the response time of the system (the linear growth rate), we demonstrate the maintenance of the initial condition for a long time, manifested by a broad bimodal PDF. We present the analysis of Fisher information in different cases and elucidate its implications for the sustainability of population dynamics. We also show that a purely oscillatory growth rate can lead to a finite amplitude solution, while self-organisation of these systems can break down with an exponentially growing solution due to periodic fluctuations in negative feedback. Copyright © 2017 Elsevier Inc. All rights reserved.
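
    The Fisher-information analysis can be illustrated numerically: estimate the PDF of a trajectory with a histogram and evaluate I = integral (dp/dx)^2 / p dx. The logistic equation below uses an oscillatory carrying capacity with made-up parameters, not those of the study.

```python
import numpy as np

def fisher_information(samples, bins=60):
    """Fisher information of a 1-D PDF estimated from samples,
    I = integral (dp/dx)^2 / p dx, via a histogram density and finite differences."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    dp = np.gradient(p, dx)
    mask = p > 0
    return float(np.sum(dp[mask] ** 2 / p[mask]) * dx)

# Logistic model with an oscillatory modulation (illustrative parameters):
# dx/dt = r x (1 - x / K(t)),  K(t) = 1 + 0.3 sin(2 pi f t)
t = np.linspace(0.0, 100.0, 50001)
dt = t[1] - t[0]
x = np.empty_like(t)
x[0] = 0.5
for i in range(len(t) - 1):
    K = 1.0 + 0.3 * np.sin(2.0 * np.pi * 0.1 * t[i])
    x[i + 1] = x[i] + dt * x[i] * (1.0 - x[i] / K)
I = fisher_information(x[len(t) // 2:])   # PDF over the late-time oscillatory state
print(I > 0)   # a broad PDF yields a small I, a sharply peaked PDF a large one
```

    As a sanity check, for Gaussian samples this estimator approaches the analytic value 1/sigma^2, so narrower distributions carry more Fisher information.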

  2. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz


    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), “University of Defence” of Brno (Czech Republic), and “Pablo de Olavide” University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections as follows. The first section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second section focuses on the social and public sphere; it is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  3. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game theory, a branch of operations research, has been successfully applied to solve various categories of problems arising from human decision making, characterized by the complexity of situations and the limits of individual processing abilities. This paper shows that game theory and the linear programming problem are ...

  4. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

    B Mishra, The ICFAI Institute of Science and Technology, Fortune Towers A, Bhubaneswar 751 023, India. A non-static plane symmetric cosmological model in Wesson's scale invariant theory of gravitation with a time-dependent gauge function is investigated. ...

  5. Social Construction Theory and the Satir Model: Toward a Synthesis. (United States)

    Cheung, Maria


    Synthesizes social construction theory and the Satir approach to family therapy as a process of cocreation of reality, the use of language and narrative, and the therapist's role as a participant-facilitator. Presents a theory-building process of the Satir approach to family therapy. (Author/MKA)

  6. Relevance Theory as model for analysing visual and multimodal communication

    NARCIS (Netherlands)

    Forceville, C.; Machin, D.


    Elaborating on my earlier work (Forceville 1996: chapter 5, 2005, 2009; see also Yus 2008), I will here sketch how discussions of visual and multimodal discourse can be embedded in a more general theory of communication and cognition: Sperber and Wilson’s Relevance Theory/RT (Sperber and Wilson

  7. Naive Probability: A Mental Model Theory of Extensional Reasoning. (United States)

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul


    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  8. Applying Learning Theories and Instructional Design Models for Effective Instruction (United States)

    Khalil, Mohammed K.; Elkhider, Ihsan A.


    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…

  9. Modeling and Performing Relational Theories in the Classroom (United States)

    Suter, Elizabeth A.; West, Carrie L.


    Although directly related to students' everyday lives, the abstract and even intimidating nature of relational theories often bars students from recognizing the immediate relevance to their relationships. The theories of symbolic interactionism, social exchange, relational dialectics, social penetration, and uncertainty reduction offer students…

  10. Models and theories of prescribing decisions: A review and suggested a new model (United States)

    Mohaidin, Zurina


    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli-response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and ‘social power theory’, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This model has the potential for use in further research. PMID:28690701

  11. Nonparametric IRT analysis of Quality-of-Life Scales and its application to the World Health Organization Quality-of-Life Scale (WHOQOL-Bref). (United States)

    Sijtsma, Klaas; Emons, Wilco H M; Bouwmeester, Samantha; Nyklícek, Ivan; Roorda, Leo D


    This study investigates the usefulness of the nonparametric monotone homogeneity model for evaluating and constructing Health-Related Quality-of-Life Scales consisting of polytomous items, and compares it to the often-used parametric graded response model. The nonparametric monotone homogeneity model is a general model of which all known parametric models for polytomous items are special cases. Merits, drawbacks, and possibilities of nonparametric and parametric models and available software are discussed. Particular attention is given to the monotone homogeneity model (also known as the Mokken model) and the often-used parametric graded response model. Data from the WHOQOL-Bref were analyzed using both the monotone homogeneity model and the graded response model. The monotone homogeneity model analysis yielded unidimensional scales for each content domain. Scalability coefficients further showed that some items have limited scalability with respect to the other items in the same scale. The parametric IRT analyses led to the rejection of some of the items. The nonparametric monotone homogeneity model is highly suited for data analysis in a health-related quality-of-life context, and the parametric graded response model may add interesting features to measurement provided the model fits the data well.
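
    The scalability coefficients mentioned here are usually Loevinger's H from Mokken scale analysis. A minimal sketch for the dichotomous case follows (the WHOQOL-Bref items are polytomous, for which H generalizes); the Rasch-style simulation parameters are invented for illustration.

```python
import numpy as np

def mokken_H(X):
    """Loevinger's scalability coefficient H for dichotomous item scores X (n x k)."""
    n, k = X.shape
    p = X.mean(axis=0)
    F = E = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            easy, hard = (i, j) if p[i] >= p[j] else (j, i)
            F += np.mean((X[:, easy] == 0) & (X[:, hard] == 1))  # observed Guttman errors
            E += (1.0 - p[easy]) * p[hard]                       # expected under independence
    return float(1.0 - F / E)

# Simulate Rasch-type data: a well-behaved scale should clear the 0.3 rule of thumb.
rng = np.random.default_rng(1)
theta = rng.normal(0.0, 1.5, size=2000)
b = np.linspace(-1.5, 1.5, 6)                    # hypothetical item difficulties
P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
X = (rng.random(P.shape) < P).astype(int)
print(round(mokken_H(X), 2))
```

    For items answered independently of any latent trait, H is near zero; higher values indicate a stronger scale.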

  12. Fitting measurement models to vocational interest data: are dominance models ideal? (United States)

    Tay, Louis; Drasgow, Fritz; Rounds, James; Williams, Bruce A


    In this study, the authors examined the item response process underlying 3 vocational interest inventories: the Occupational Preference Inventory (C.-P. Deng, P. I. Armstrong, & J. Rounds, 2007), the Interest Profiler (J. Rounds, T. Smith, L. Hubert, P. Lewis, & D. Rivkin, 1999; J. Rounds, C. M. Walker, et al., 1999), and the Interest Finder (J. E. Wall & H. E. Baker, 1997; J. E. Wall, L. L. Wise, & H. E. Baker, 1996). Item response theory (IRT) dominance models, such as the 2-parameter and 3-parameter logistic models, assume that item response functions (IRFs) are monotonically increasing as the latent trait increases. In contrast, IRT ideal point models, such as the generalized graded unfolding model, have IRFs that peak where the latent trait matches the item. Ideal point models are expected to fit better because vocational interest inventories ask about typical behavior, as opposed to requiring maximal performance. Results show that across all 3 interest inventories, the ideal point model provided better descriptions of the response process. The importance of specifying the correct item response model for precise measurement is discussed. In particular, scores computed by a dominance model were shown to be sometimes illogical: individuals endorsing mostly realistic or mostly social items were given similar scores, whereas scores based on an ideal point model were sensitive to which type of items respondents endorsed.
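
    The monotone-versus-single-peaked contrast at the heart of this comparison is easy to make concrete. The 2PL IRF below is standard; the ideal point IRF is a simplified squared-distance stand-in for unfolding models such as the GGUM, not the GGUM itself.

```python
import numpy as np

def irf_2pl(theta, a, b):
    """Dominance (2PL) IRF: monotonically increasing in theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def irf_ideal_point(theta, delta, tau=1.0):
    """Single-peaked IRF in the spirit of unfolding models (a simplified
    squared-distance form, not the full GGUM): maximal where theta == delta."""
    return np.exp(-tau * (theta - delta) ** 2)

theta = np.linspace(-3.0, 3.0, 601)
dom = irf_2pl(theta, a=1.5, b=0.0)
ideal = irf_ideal_point(theta, delta=0.0)
print(bool(np.all(np.diff(dom) > 0)))   # dominance IRF keeps rising with theta
print(float(theta[np.argmax(ideal)]))   # ideal point IRF peaks at delta
```

    Under a dominance model, endorsement keeps growing with the trait; under an ideal point model, respondents far above the item's location stop endorsing it, which is the behaviour the study found for interest items.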

  13. Atmospheric boundary layers in storms: advanced theory and modelling applications

    Directory of Open Access Journals (Sweden)

    S. S. Zilitinkevich


    Turbulent planetary boundary layers (PBLs) control the exchange processes between the atmosphere and the ocean/land. The key problems of PBL physics are to determine the PBL height, the momentum, energy and matter fluxes at the surface, and the mean wind and scalar profiles throughout the layer in a range of regimes from stable and neutral to convective. Until present, the PBLs typical of stormy weather were always considered as neutrally stratified. Recent works have disclosed that such PBLs are in fact very strongly affected by the static stability of the free atmosphere and must be treated as factually stable (we call this type of PBL "conventionally neutral", in contrast to the "truly neutral" PBLs developed against a neutrally stratified free flow). It is common knowledge that basic features of PBLs exhibit a noticeable dependence on the free-flow static stability and baroclinicity. However, the concern of the traditional theory of neutral and stable PBLs was almost without exception the barotropic nocturnal PBL, which develops at mid latitudes during a few hours in the night, on the background of a neutral or slightly stable residual layer. The latter separates this type of PBL from the free atmosphere. It is not surprising that the nature of turbulence in such regimes is basically local and does not depend on the properties of the free atmosphere. Alternatively, long-lived neutral (in fact only conventionally neutral) or stable PBLs, which have much more time to grow up, are placed immediately below the stably stratified free flow. Under these conditions, the turbulent transports of momentum and scalars even in the surface layer - far away from the PBL outer boundary - depend on the free-flow Brunt-Väisälä frequency, N. Furthermore, integral measures of the long-lived PBLs (their depths and the resistance law functions) depend on N and also on the baroclinic shear, S. In the traditional PBL models both non-local parameters N and S

  14. Confirmatory factor analysis and item response theory analysis of the Whiteley Index. Results from a large population based study in Norway. The Hordaland Health Study (HUSK). (United States)

    Veddegjærde, Kari-Elise Frøystad; Sivertsen, Børge; Wilhelmsen, Ingvard; Skogen, Jens Christoffer


    The Whiteley Index (WI) is a widely used screening instrument for health anxiety/hypochondriasis. Several studies have previously explored the psychometric properties of the WI, but with mixed findings concerning both item composition and factor structure. The main aim of the current study was to examine different factor structures identified in previous studies using data from a large general population-based study. We also wanted to provide gender-specific norms. Data were taken from a large population-based study in Norway, the Hordaland Health Study (HUSK; N=7274). Confirmatory factor analysis (CFA) of several models of the WI was conducted. Item response theory (IRT) analysis was performed on the model with the best goodness-of-fit. CFA of all previously proposed factor models of the WI revealed clearly inadequate model fits. The IRT analysis suggested that a six-item model best described the data, and CFA confirmed an adequate goodness-of-fit across indices. The current study found evidence for a six-item, single-factor model of the WI. Our findings suggest that this abbreviated version has the best factor structure compared to previously proposed factor models. We recommend that the factor structure identified in this study be investigated further in independent samples. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Supersymmetry and string theory beyond the standard model

    CERN Document Server

    Dine, Michael


    The past decade has witnessed dramatic developments in the fields of experimental and theoretical particle physics and cosmology. This fully updated second edition is a comprehensive introduction to these recent developments and brings this self-contained textbook right up to date. Brand new material for this edition includes the groundbreaking Higgs discovery and results of the WMAP and Planck experiments. Extensive discussion of theories of dynamical electroweak symmetry breaking, a new chapter on the landscape, and a completely rewritten coda on future directions give readers a modern perspective on this developing field. A focus on three principal areas (supersymmetry, string theory, and astrophysics and cosmology) provides the structure for this book, which will be of great interest to graduates and researchers in the fields of particle theory, string theory, astrophysics and cosmology. The book contains several problems, and password-protected solutions will be available to lecturers at www.cambrid...

  16. Construction of a memory battery for computerized administration, using item response theory. (United States)

    Ferreira, Aristides I; Almeida, Leandro S; Prieto, Gerardo


    In accordance with item response theory, a memory battery with six tests was constructed for computerized administration in the Portuguese adult population. A factor analysis was conducted to assess the internal structure of the tests (N = 547 undergraduate students). Following the literature, several confirmatory factor models were evaluated. Results showed a better fit for a model with two independent latent variables corresponding to verbal and non-verbal factors, reproducing the initial battery organization. Internal consistency reliabilities for the six tests ranged from alpha = .72 to .89. IRT analyses (Rasch and partial credit models) yielded good infit and outfit measures and high precision for parameter estimation. The potential utility of these memory tasks for psychological research and practice will be discussed.

  17. GPR and IRT tests in two historical buildings in Gravina in Puglia (United States)

    Matera, Loredana; Persico, Raffaele; Geraldi, Edoardo; Sileo, Maria; Piro, Salvatore


    This paper describes a geophysical investigation conducted into two important churches, namely the Cathedral of Santa Maria Assunta and the Church of Santa Croce, both in Gravina in Puglia (close to Bari, southern Italy). The Church of Santa Croce, now deconsecrated, lies below the cathedral. Therefore, the two churches constitute a unique building body. Moreover, below the Church of Santa Croce there are several crypts, which are only partially known. The prospecting was performed both with a pulsed commercial ground penetrating radar (GPR) system and with a prototypal reconfigurable stepped frequency system. The aim was twofold, namely to gather information about the monument and to test the prototypal system. The GPR measurements have also been integrated with an infrared thermography (IRT) investigation performed on part of the vaulted ceiling in the Church of Santa Croce, in order to confirm or deny a possible interpretation of certain GPR results.

  18. The EXIST optical and infra-red telescope (IRT) and imager-spectrometer (United States)

    Kutyrev, A. S.; Moseley, S. H.; Golisano, C.; Gong, Q.; Allen, B. T.; Gehrels, N.; Grindlay, J. E.; Hong, J. S.; Woodgate, B. E.


    The EXIST (Energetic X-ray Imaging Survey Telescope) mission includes the 1.1 m optical and Infra-Red Telescope (IRT), which provides the capability to locate, identify, and obtain spectra of transient events, in particular GRB afterglows at redshifts up to the epoch of reionization. The instrument includes a high spatial resolution imager, a low spectral resolution spectrometer (R ~ 30), and a high resolution slit spectrometer (R ~ 3000). This instrument, together with the observatory's rapid reaction response, will quickly identify GRB afterglows, measure their brightness curves and redshifts, measure the spectral characteristics of the afterglows, and measure absorption spectra of the intervening intergalactic medium. With this instrument, high redshift GRBs become important tools for studying the growth of structure and observing the processes through which the universe is reionized.

  19. Developing a short form of Benton's Judgment of Line Orientation Test: an item response theory approach. (United States)

    Calamia, Matthew; Markon, Kristian; Denburg, Natalie L; Tranel, Daniel


    The Judgment of Line Orientation (JLO) test was developed to be, in Arthur Benton's words, "as pure a measure of one aspect of spatial thinking, as could be conceived" (Benton, 1994, p. 53). The JLO test has been widely used in neuropsychological practice for decades. The test has high test-retest reliability (Franzen, 2000), as well as good neuropsychological construct validity, as shown through neuroanatomical localization studies (Tranel, Vianna, Manzel, Damasio, & Grabowski, 2009). Despite its popularity and strong psychometric properties, the full-length version of the test (30 items) has been criticized as being unnecessarily long (Strauss, Sherman, & Spreen, 2006). There have been many attempts at developing short forms; however, these forms have been limited in their ability to estimate scores accurately. Taking advantage of a large sample of JLO performances from 524 neurological patients with focal brain lesions, we used techniques from item response theory (IRT) to estimate each item's difficulty and power to discriminate among various levels of ability. A random item IRT model was used to estimate the influence of item stimulus properties as predictors of item difficulty. These results were used to optimize the selection of items for a shorter method of administration that maintained comparability with the full form using significantly fewer items. The effectiveness of this method was replicated in a second sample of 82 healthy elderly participants. The findings should help broaden the clinical utility of the JLO and enhance its diagnostic applications.
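
    IRT-guided short-form construction of this kind can be sketched by ranking items on their 2PL Fisher information over the ability range of interest; all item parameters below are synthetic, not the JLO estimates.

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * P * (1 - P)."""
    P = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a ** 2 * P * (1.0 - P)

def select_short_form(a, b, k, theta_grid):
    """Keep the k items with the largest information averaged over the grid."""
    avg_info = np.array([item_information(theta_grid, ai, bi).mean()
                         for ai, bi in zip(a, b)])
    return np.argsort(avg_info)[::-1][:k]

rng = np.random.default_rng(2)
a = rng.uniform(0.5, 2.5, size=30)     # hypothetical discriminations for 30 items
b = rng.uniform(-2.0, 2.0, size=30)    # hypothetical difficulties
grid = np.linspace(-2.0, 2.0, 81)
keep = select_short_form(a, b, 15, grid)
full = sum(item_information(grid, ai, bi).mean() for ai, bi in zip(a, b))
short = sum(item_information(grid, a[i], b[i]).mean() for i in keep)
print(len(keep), round(short / full, 2))  # half the items retain most of the information
```

    Because the most informative items are kept, the short form preserves far more than half of the total test information, which is why IRT-based short forms lose less precision than arbitrary item subsets.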

  20. Maximum Likelihood Item Easiness Models for Test Theory without an Answer Key (United States)

    France, Stephen L.; Batchelder, William H.


    Cultural consensus theory (CCT) is a data aggregation technique with many applications in the social and behavioral sciences. We describe the intuition and theory behind a set of CCT models for continuous type data using maximum likelihood inference methodology. We describe how bias parameters can be incorporated into these models. We introduce…

  1. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet


    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. It also aims to introduce simulation and live CAT software to interested researchers. The computerized adaptive test algorithm, assumptions of item response theory models, nominal response…
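
    The core CAT loop such frameworks describe can be sketched for dichotomous 2PL items with EAP scoring (a polytomous CAT swaps in polytomous IRFs); the item bank and examinee below are simulated, not taken from the article.

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def eap(responses, items, grid=np.linspace(-4, 4, 161)):
    """Expected a posteriori ability estimate under a standard normal prior."""
    post = np.exp(-grid ** 2 / 2.0)
    for (a, b), x in zip(items, responses):
        p = p_2pl(grid, a, b)
        post *= p if x == 1 else 1.0 - p
    post /= post.sum()
    return float((grid * post).sum())

def run_cat(bank, true_theta, n_items, rng):
    """Minimal CAT loop: administer the most informative unused item at the
    current ability estimate, then re-estimate ability by EAP."""
    theta, used, responses, items = 0.0, set(), [], []
    for _ in range(n_items):
        info = [a ** 2 * p_2pl(theta, a, b) * (1.0 - p_2pl(theta, a, b))
                if i not in used else -1.0
                for i, (a, b) in enumerate(bank)]
        i = int(np.argmax(info))
        used.add(i)
        a, b = bank[i]
        responses.append(int(rng.random() < p_2pl(true_theta, a, b)))
        items.append((a, b))
        theta = eap(responses, items)
    return theta

rng = np.random.default_rng(3)
bank = list(zip(rng.uniform(0.8, 2.0, 200), rng.uniform(-3.0, 3.0, 200)))
est = run_cat(bank, true_theta=1.0, n_items=30, rng=rng)
print(round(est, 2))   # should land reasonably close to the true ability of 1.0
```

    Because each item is chosen where the current estimate sits, the adaptive test converges with far fewer items than a fixed-form test of comparable precision.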

  2. Design of Learning Model of Logic and Algorithms Based on APOS Theory (United States)

    Hartati, Sulis Janu


    The research questions were "what are the characteristics of a learning model of logic and algorithms according to APOS theory" and "can this learning model improve students' learning outcomes". The research was conducted using an exploratory and quantitative approach. Exploration was used in constructing a theory about the…

  3. Short-Range Correlation Models in Electronic Structure Theory (United States)

    Goldey, Matthew Bryant

    fraction of MP2/CBS computational cost. Second, attenuated MP2 is developed within the larger aug-cc-pVTZ (aTZ) basis set for inter- and intramolecular non-bonded interactions. A single attenuation parameter is optimized on the S66 database of 66 intermolecular interactions, leading to a very large RMS error reduction by a factor of greater than 5 relative to standard MP2/aTZ. Attenuation introduces an error of opposite sign to basis set superposition error (BSSE) and overestimation of dispersion interactions in finite basis MP2. A variety of tests including the S22 set, conformer energies of peptides, alkanes, sugars, sulfate-water clusters, and the coronene dimer establish the transferability of the MP2(terfc, aTZ) model to other inter and intra-molecular interactions. Direct comparisons against attenuation in the smaller aug-cc-pVDZ basis shows that MP2(terfc, aTZ) often significantly outperforms MP2(terfc, aDZ), although at higher computational cost. MP2(terfc, aDZ) and MP2(terfc, aTZ) often outperform MP2 at the complete basis set limit. Comparison of the two attenuated MP2 models against each other and against attenuation using non-augmented basis sets gives insight into the error cancellation responsible for their remarkable success. Third, I present an improved algorithm for single-node multi-threaded computation of the correlation energy using resolution of the identity second-order Moller-Plesset perturbation theory (RI-MP2). This algorithm is based on shared memory parallelization of the rate-limiting steps and an overall reduction in the number of disk reads. The requisite fifth-order computation in RI-MP2 calculations is efficiently parallelized within this algorithm, with improvements in overall parallel efficiency as the system size increases. Fourth-order steps are also parallelized. 
As an application, I present energies and timings for several large, noncovalently interacting systems with this algorithm, and demonstrate that the RI-MP2 cost is still

  4. An anthology of theories and models of design philosophy, approaches and empirical explorations

    CERN Document Server

    Blessing, Lucienne


    While investigations into both theories and models have remained a major strand of engineering design research, the current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models, and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; current models of design, from a function behavior structure model to an integrated model; important empirical research findings from studies into design; and philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...

  5. Fuzzy Field Theory as a Random Matrix Model (United States)

    Tekel, Juraj

    This dissertation considers the theory of scalar fields on fuzzy spaces from the point of view of random matrices. First we define random matrix ensembles that are a natural description of such a theory. These ensembles are new, and their novel feature is the presence of a kinetic term in the probability measure, which couples the random matrix to a set of external matrices and thus breaks the original symmetry. Considering the case of a free field ensemble, which is a generalization of a Gaussian matrix ensemble, we develop a technique to compute expectation values of the observables of the theory based on explicit Wick contractions, and we write down recursion rules for these. We show that the eigenvalue distribution of the random matrix follows the Wigner semicircle distribution with a rescaled radius. We also compute distributions of the matrix Laplacian of the random matrix given by the new term and demonstrate that the eigenvalues of these two matrices are correlated. We demonstrate the robustness of the method by computing expectation values and distributions for more complicated observables. We then consider the ensemble corresponding to an interacting field theory with a quartic interaction. We use the same method to compute the distribution of the eigenvalues and show that the presence of the kinetic term rescales the distribution given by the original theory, which is a polynomially deformed Wigner semicircle. We compute the eigenvalue distribution of the matrix Laplacian and the joint distribution up to second order in the correlation, and we show that the correlation between the two changes from the free field case. Finally, as an application of these results, we compute the phase diagram of the fuzzy scalar field theory, find a multiscaling that stabilizes this diagram in the limit of large matrices, and compare it with results obtained numerically and by treating the kinetic part as a perturbation.
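
    The Wigner semicircle behaviour referred to above can be illustrated numerically for a plain Gaussian (GOE-type) ensemble, i.e. without the kinetic term the dissertation adds; a minimal sketch with assumed matrix size and seed:

```python
import numpy as np

# Draw one GOE-type random matrix and check its spectrum against the
# Wigner semicircle law (support [-2, 2] after scaling by 1/sqrt(n)).
rng = np.random.default_rng(1)
n = 400
a = rng.normal(size=(n, n))
h = (a + a.T) / np.sqrt(2.0)            # symmetric, GOE-type statistics
lam = np.linalg.eigvalsh(h) / np.sqrt(n)

# Semicircle density rho(x) = sqrt(4 - x^2) / (2*pi) on [-2, 2]: the
# eigenvalues should stay (almost) inside the radius-2 support, and about
# 61% of the mass should fall in [-1, 1].
print(lam.min(), lam.max())
frac_inner = np.mean(np.abs(lam) < 1.0)
print(frac_inner)
```

    The kinetic term studied in the dissertation deforms this picture by rescaling the radius, which is what the recursion rules for Wick contractions quantify.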

  6. Performance of the Family Satisfaction with the End-of-Life Care (FAMCARE) measure in an ethnically diverse cohort: psychometric analyses using item response theory. (United States)

    Teresi, Jeanne A; Ornstein, Katherine; Ocepek-Welikson, Katja; Ramirez, Mildred; Siu, Albert


    The Family Satisfaction with End-of-Life Care (FAMCARE) has been used widely among caregivers of individuals with cancer. The aim of this study was to evaluate the psychometric properties of this measure using item response theory (IRT). The analytic sample comprised caregivers of 1,983 patients with advanced cancer. Among the patients, 56% were female, with a mean age of 59.9 years (SD = 11.8), and 20% were non-Hispanic Black. The majority of caregivers were family members either living with (44%) or not living with (35%) the patient. Factor analyses and IRT were used to examine the dimensionality, information, and reliability of the FAMCARE. Although a bi-factor model fit the data slightly better than did a unidimensional model, the loadings on the group factors were very low; thus, a unidimensional model appears to provide an adequate representation of the item set. The reliability estimates, calculated along the satisfaction (theta) continuum, were adequate (>0.80) for all levels of theta at which subjects had scores. Examination of the category response functions from IRT showed overlap in the lower categories, with little unique information provided; moreover, the categories were not observed to be interval. Based on these analyses, a three-response-category format was recommended: very satisfied, satisfied, and not satisfied. Most information was provided in the range indicative of either dissatisfaction or high satisfaction. These analyses support the use of fewer response categories and provide item parameters that form a basis for developing shorter-form scales. Such a revision has the potential to reduce respondent burden.
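
    The overlap of lower response categories that motivates collapsing response options can be illustrated with Samejima's graded response model; a minimal sketch with hypothetical item parameters (not the FAMCARE estimates):

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Samejima graded-response model: P(X = k | theta) for one item.

    a: discrimination; b: sorted category thresholds (K-1 values for K categories).
    """
    theta = np.asarray(theta, dtype=float)
    # Boundary curves P(X >= k), with P(X >= 0) = 1 and P(X >= K) = 0.
    star = [np.ones_like(theta)]
    for bk in b:
        star.append(1.0 / (1.0 + np.exp(-a * (theta - bk))))
    star.append(np.zeros_like(theta))
    star = np.array(star)
    return star[:-1] - star[1:]   # rows: P(X = 0 .. K-1 | theta)

theta = np.linspace(-3, 3, 121)
probs = grm_category_probs(theta, a=1.5, b=[-2.0, -1.5, 0.0, 1.5])
# Closely spaced lower thresholds (-2.0 and -1.5) make the curves for the two
# lowest categories overlap heavily and carry little unique information --
# the pattern behind recommendations to use fewer response categories.
print(probs.sum(axis=0).round(6))  # category probabilities sum to 1
```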

  7. A Policy-Capturing Investigation of Expectancy Theory Models of Valence and Force. (United States)


    Maslow's (1954) need hierarchy, Herzberg's (1959) two-factor theory, and Alderfer's (1972) ERG theory are three of the most widely publicized and researched... [Report AD-A083 714, Air Force Institute of Technology, Wright-Patterson AFB, OH. A Policy-Capturing Investigation of Expectancy Theory Models of Valence and Force. Thesis, Norbert C. Wagner, Jr., Captain, USAF, AFIT/GSM/SM/79D-21. Approved for public release.]
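
    The valence and force models of expectancy theory named in the title combine outcome valences, instrumentalities, and expectancies multiplicatively; a minimal sketch with hypothetical numbers (not data from the thesis):

```python
# Vroom-style expectancy theory: valence model and force model.
def valence(instrumentalities, outcome_valences):
    # V_j = sum_k I_jk * V_k : valence of a first-level outcome j
    return sum(i * v for i, v in zip(instrumentalities, outcome_valences))

def force(expectancies, valences):
    # F = sum_j E_j * V_j : motivational force toward an act
    return sum(e * v for e, v in zip(expectancies, valences))

outcome_valences = [0.8, 0.5, -0.3]   # hypothetical second-level valences
instr_promotion = [0.9, 0.4, 0.2]     # instrumentalities for "promotion"
v_promotion = valence(instr_promotion, outcome_valences)
print(round(v_promotion, 2))  # 0.9*0.8 + 0.4*0.5 + 0.2*(-0.3) = 0.86
f = force([0.7], [v_promotion])
print(round(f, 3))            # 0.7 * 0.86 = 0.602
```

    A policy-capturing study regresses such judgments on the cue values to recover the weights each respondent implicitly uses.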

  8. The accuracy of the Life Orientation Test-Revised (LOT-R) in measuring dispositional optimism: evidence from item response theory analyses. (United States)

    Chiesi, Francesca; Galli, Silvia; Primi, Caterina; Innocenti Borgi, Paolo; Bonacchi, Andrea


    The accuracy of the Life Orientation Test-Revised (LOT-R) in measuring dispositional optimism was investigated by applying item response theory (IRT). The study was conducted on a sample of 484 university students (62% male, M age = 22.79 years, SD = 5.63). After testing the 1-factor structure of the scale, IRT was applied to evaluate the functioning of the LOT-R along the pessimism-optimism continuum. Item parameter estimates and the test information function showed that each item and the global scale satisfactorily measured the latent trait. Referring to the IRT-estimated trait levels, the validity of the LOT-R was studied by examining the relationships between dispositional optimism and psychological well-being, sense of mastery, and sense of coherence. Overall, findings based on IRT analyses provide evidence of the accuracy of the LOT-R and suggest possible modifications of the scale to improve the assessment of dispositional optimism.

  9. Quantitative Robust Control Engineering: Theory and Applications (United States)


    ...the D tank is shown as a dashed line [mg/l] and the control input IR(t) as a solid line [per unit of the influent flow rate], with the G22(z... (1992). Discrete quantitative feedback technique, Chapter 16 in the book Digital Control Systems: Theory, Hardware, Software, 2nd edition. McGraw

  10. Canonical quantization of the WZW model with defects and Chern-Simons theory

    DEFF Research Database (Denmark)

    Sarkissian, Gor


    We perform canonical quantization of the WZW model with defects and permutation branes. We establish a symplectomorphism between the phase space of the WZW model with $N$ defects on the cylinder and the phase space of Chern-Simons theory on the annulus times $R$ with $N$ Wilson lines, and between the phase space of the WZW model with $N$ defects on the strip and Chern-Simons theory on the disc times $R$ with $N+2$ Wilson lines. We also obtain a symplectomorphism between the phase space of the $N$-fold product of the WZW model with boundary conditions specified by permutation branes, and the phase space of Chern-Simons theory on the sphere...

  11. I can do that: the impact of implicit theories on leadership role model effectiveness. (United States)

    Hoyt, Crystal L; Burnette, Jeni L; Innella, Audrey N


    This research investigates the role of implicit theories in influencing the effectiveness of successful role models in the leadership domain. Across two studies, the authors test the prediction that incremental theorists ("leaders are made") compared to entity theorists ("leaders are born") will respond more positively to being presented with a role model before undertaking a leadership task. In Study 1, measuring people's naturally occurring implicit theories of leadership, the authors showed that after being primed with a role model, incremental theorists reported greater leadership confidence and less anxious-depressed affect than entity theorists following the leadership task. In Study 2, the authors demonstrated the causal role of implicit theories by manipulating participants' theory of leadership ability. They replicated the findings from Study 1 and demonstrated that identification with the role model mediated the relationship between implicit theories and both confidence and affect. In addition, incremental theorists outperformed entity theorists on the leadership task.

  12. Testing static tradeoff theory against pecking order models of capital ...

    African Journals Online (AJOL)

    By using ordinary least squares multiple regression methods, we aim to establish which of the two theories has the better explanatory power for Nigerian firms. The analysis of the outcomes led to the conclusion that both appear to be good descriptions of the financing policies of those firms for the period under ...

  13. Speech act theory in support of idealised warning models | Carstens ...

    African Journals Online (AJOL)

    In applied communication studies warnings (as components of instructional texts) are often characterised in terms of criteria for effectiveness. ... and evaluate actual warnings collected from information sheets for hair-dryers, indicating the heuristic value of combined insights from document design and speech act theory.

  14. Towards a Theory of University Entrepreneurship: Developing a Theoretical Model (United States)

    Woollard, David


    This paper sets out to develop a robust theory in a largely atheoretical field of study. The increasing importance of entrepreneurship in delivering the "Third Mission" calls for an enhanced understanding of the university entrepreneurship phenomenon, not solely as a subject of academic interest but also to guide the work of practitioners in the…

  15. Buckled graphene: A model study based on density functional theory (United States)

    Khan, M. A.; Mukaddam, M. A.; Schwingenschlögl, U.


    We make use of ab initio calculations within density functional theory to investigate the influence of buckling on the electronic structure of single layer graphene. Our systematic study addresses a wide range of bond length and bond angle variations in order to obtain insights into the energy scale associated with the formation of ripples in a graphene sheet.

  16. An Energy Model for Viewing Embodied Human Capital Theory (United States)

    Kaufman, Neil A.; Geroy, Gary D.


    Human capital development is one of the emerging areas of study with regard to social science theory, practice, and research. A relatively new concept, human capital is described in terms of individual knowledge skills and experience. It is currently expressed as a function of education as well as a measure of economic activity. Little theory…

  17. Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples. (United States)

    Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D


    To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development: classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT), in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
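
    The contrast above between a scale-level CTT summary and item-level IRT diagnostics can be sketched side by side; a minimal illustration on simulated dichotomous 2PL data, with invented parameters rather than VFQ-25 estimates:

```python
import numpy as np

def cronbach_alpha(x):
    """CTT internal consistency for a persons-by-items 0/1 data matrix x."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def info_2pl(theta, a, b):
    """IRT: Fisher information of a 2PL item, I(theta) = a^2 * P * (1 - P)."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

# Simulate 500 persons x 5 items from a 2PL model (illustrative only).
rng = np.random.default_rng(2)
theta = rng.normal(size=500)
a_par = np.array([1.2, 0.8, 1.5, 1.0, 0.9])
b_par = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
p = 1.0 / (1.0 + np.exp(-a_par * (theta[:, None] - b_par)))
x = (rng.random(p.shape) < p).astype(float)

print(round(cronbach_alpha(x), 3))          # one scale-level CTT number
print(info_2pl(np.array([0.0]), 1.5, 0.0))  # IRT info at theta = 0 for item 3
```

    CTT yields a single reliability figure for the whole scale, while the IRT information function shows where on the trait continuum each item measures well, which is why IRT and RMT give more detailed diagnostics.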

  18. Average vs item response theory scores: an illustration using neighbourhood measures in relation to physical activity in adults with arthritis. (United States)

    Mielenz, T J; Callahan, L F; Edwards, M C


    Our study had two main objectives: 1) to determine whether perceived neighbourhood physical features are associated with physical activity levels in adults with arthritis; and 2) to determine whether the conclusions are more precise when item response theory (IRT) scores are used instead of average scores for the perceived neighbourhood physical features scales. Information on health outcomes, neighbourhood characteristics, and physical activity levels was collected using a telephone survey of 937 participants with self-reported arthritis. Neighbourhood walkability and aesthetic features and physical activity levels were measured by self-report. Adjusted proportional odds models were constructed separately for each neighbourhood physical features scale. We found that among adults with arthritis, poorer perceived neighbourhood physical features (both walkability and aesthetics) are associated with decreased physical activity levels compared to better perceived neighbourhood features. This association was observed in our adjusted models only when IRT scoring was employed with the neighbourhood physical feature scales (walkability scale: odds ratio [OR] 1.20, 95% confidence interval [CI] 1.02, 1.41; aesthetics scale: OR 1.32, 95% CI 1.09, 1.62), not when average scoring was used (walkability scale: OR 1.14, 95% CI 1.00, 1.30; aesthetics scale: OR 1.16, 95% CI 1.00, 1.36). In adults with arthritis, those reporting poorer walking and aesthetics features were found to have decreased physical activity levels compared to those reporting better features when IRT scores were used, but not when using average scores. This study may inform public health physical environmental interventions implemented to increase physical activity, especially since arthritis prevalence is expected to be close to 20% of the population in 2020. Based on NIH initiatives, future health research will utilize IRT scores. 
The differences found in this study may be a precursor for research on how past
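
    The difference between average scoring and IRT scoring can be sketched with an EAP (expected a posteriori) estimate under a hypothetical 2PL model, not the neighbourhood scales' actual items: two response patterns with identical average scores receive different IRT scores because IRT weights items by their parameters.

```python
import numpy as np

def eap_score(responses, a, b, grid=np.linspace(-4, 4, 161)):
    """EAP trait estimate under a 2PL model with a standard normal prior.

    responses: 0/1 vector; a, b: item discriminations and difficulties.
    """
    p = 1.0 / (1.0 + np.exp(-a * (grid[:, None] - b)))     # grid x items
    like = np.prod(np.where(responses == 1, p, 1.0 - p), axis=1)
    prior = np.exp(-0.5 * grid**2)
    post = like * prior
    return np.sum(grid * post) / np.sum(post)              # posterior mean

a = np.array([1.5, 0.7, 1.2, 0.9])   # hypothetical item parameters
b = np.array([-1.0, 0.0, 0.5, 1.0])
r1 = np.array([1, 1, 0, 0])          # endorses the two easier items
r2 = np.array([0, 0, 1, 1])          # endorses the two harder items
# Same average score (0.5), different IRT scores:
print(r1.mean(), r2.mean())
print(round(eap_score(r1, a, b), 3), round(eap_score(r2, a, b), 3))
```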

  19. Growing up and Role Modeling: A Theory in Iranian Nursing Students' Education


    Nouri, Jamileh Mokhtari; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid


    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed using the Glaserian Grounded Theory methodology, through semi-structured interviews with 7 faculty members and 2 nursing students; the three focus group discussions ...

  20. Lyapunov stability in an evolutionary game theory model of the labour market

    Directory of Open Access Journals (Sweden)

    Ricardo Azevedo Araujo


    In this paper the existence and stability of equilibria in an evolutionary game theory model of the labour market are studied using the Lyapunov method. The model displays multiple equilibria, and it is shown that the Nash equilibria of the static game are evolutionarily stable equilibria in the evolutionary game-theoretic setup. A complete characterization of the dynamics of an evolutionary model of the labour market is provided.
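
    The Lyapunov approach can be sketched in the simplest setting: two-strategy replicator dynamics with the distance to the mixed equilibrium as a candidate Lyapunov function. A minimal illustration on a Hawk-Dove game (payoffs assumed for demonstration, not the paper's labour-market model):

```python
import numpy as np

# Replicator dynamics for a 2-strategy Hawk-Dove game (V=2, C=4), whose
# mixed Nash equilibrium x* = 1/2 is evolutionarily stable. We monitor
# V_L(x) = |x - x*| as a Lyapunov-style function along the orbit.
A = np.array([[-1.0, 2.0],   # payoff matrix: rows = Hawk, Dove
              [0.0, 1.0]])

def step(x, dt=0.01):
    """One Euler step of dx/dt = x (1 - x) (f_hawk - f_dove)."""
    f = A @ np.array([x, 1.0 - x])        # fitnesses of the two strategies
    return x + dt * x * (1.0 - x) * (f[0] - f[1])

x, traj = 0.9, []
for _ in range(2000):
    traj.append(abs(x - 0.5))             # Lyapunov function value
    x = step(x)

print(round(x, 4))                                          # converges to 0.5
print(all(b <= a + 1e-12 for a, b in zip(traj, traj[1:])))  # non-increasing
```

    That the Lyapunov function decreases monotonically along trajectories is exactly the stability certificate the Lyapunov method provides for the evolutionarily stable equilibrium.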