WorldWideScience

Sample records for theory instrumentation model

  1. [Instrument to measure adherence in hypertensive patients: contribution of Item Response Theory].

    Science.gov (United States)

    Rodrigues, Malvina Thaís Pacheco; Moreira, Thereza Maria Magalhaes; Vasconcelos, Alexandre Meira de; Andrade, Dalton Francisco de; Silva, Daniele Braz da; Barbetta, Pedro Alberto

    2013-06-01

    To analyze, by means of Item Response Theory, an instrument to measure adherence to treatment for hypertension. Analytical study with 406 hypertensive patients with associated complications seen in primary care in Fortaleza, CE, Northeastern Brazil, in 2011, using Item Response Theory. The stages were: dimensionality testing, item calibration, data processing and creation of a scale, analyzed using the graded response model. The dimensionality of the instrument was studied by analyzing the polychoric correlation matrix and by full-information factor analysis. Multilog software was used to calibrate items and estimate the scores. Items relating to drug therapy are the most directly related to adherence, while those relating to drug-free therapy need to be reworked because they carry less psychometric information and have low discrimination. The independence of the items, the small number of levels in the scale and the low explained variance in the model fit are the main weaknesses of the instrument analyzed. Item Response Theory proved to be a relevant analysis technique because it evaluated respondents for adherence to treatment for hypertension, the level of difficulty of the items and their ability to discriminate between individuals with different levels of adherence, which yields a greater amount of information. The instrument analyzed is limited in measuring adherence to hypertension treatment, according to the item-level Item Response Theory analysis, and needs adjustment. Proper formulation of the items is important in order to accurately measure the desired latent trait.
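
    As a concrete illustration of the graded response model used above, the following is a minimal numpy sketch of Samejima's model: category probabilities from cumulative logistic curves, plus the item (Fisher) information the abstract invokes when calling low-discrimination items less informative. The discrimination and threshold values are invented for contrast, not taken from the study's Multilog calibration.

```python
import numpy as np

def grm_probs_and_info(theta, a, b):
    """Samejima graded response model for a single polytomous item.
    theta: ability; a: discrimination; b: increasing thresholds (length K-1).
    Returns (category probabilities, Fisher information) at theta."""
    b = np.asarray(b, dtype=float)
    # Cumulative curves P(X >= k), padded with P(X >= 0) = 1 and P(X >= K) = 0.
    p_star = np.concatenate(([1.0], 1 / (1 + np.exp(-a * (theta - b))), [0.0]))
    p = p_star[:-1] - p_star[1:]          # category probabilities
    d = a * p_star * (1 - p_star)         # derivatives of the cumulative curves
    dp = d[:-1] - d[1:]                   # derivatives of category probabilities
    return p, float(np.sum(dp**2 / p))    # polytomous Fisher information

# Invented parameters: a discriminating "drug therapy" item versus a flat
# "drug-free therapy" item sharing the same thresholds.
grid = np.linspace(-3, 3, 61)
for label, a in [("high discrimination", 2.0), ("low discrimination", 0.6)]:
    info = [grm_probs_and_info(t, a, [-1.0, 0.0, 1.0])[1] for t in grid]
    print(f"{label}: peak item information = {max(info):.2f}")
```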

  2. Discrete-time modelling of musical instruments

    International Nuclear Information System (INIS)

    Vaelimaeki, Vesa; Pakarinen, Jyri; Erkut, Cumhur; Karjalainen, Matti

    2006-01-01

    This article describes physical modelling techniques that can be used for simulating musical instruments. The methods are closely related to digital signal processing. They discretize the system with respect to time, because the aim is to run the simulation using a computer. The physics-based modelling methods can be classified as mass-spring, modal, wave digital, finite difference, digital waveguide and source-filter models. We present the basic theory and a discussion on possible extensions for each modelling technique. For some methods, a simple model example is chosen from the existing literature, demonstrating a typical use of the method. For instance, in the case of the digital waveguide modelling technique a vibrating string model is discussed, and in the case of the wave digital filter technique we present a classical piano hammer model. We tackle some nonlinear and time-varying models and include new results on the digital waveguide modelling of a nonlinear string. Current trends and future directions in physical modelling of musical instruments are discussed.
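
    As a flavor of the digital waveguide family surveyed in the article, the sketch below implements a Karplus-Strong plucked string: a delay line whose length sets the pitch, with a two-point averaging loop filter standing in for frequency-dependent losses at the terminations. This is the simplest special case of a waveguide string, not the article's more general nonlinear models, and all parameter values are illustrative.

```python
from collections import deque
import numpy as np

def pluck_string(f0, fs=44100, dur=1.0, decay=0.996, seed=0):
    """Karplus-Strong pluck: the delay-line length fs/f0 sets the pitch."""
    n = int(fs / f0)
    rng = np.random.default_rng(seed)
    line = deque(rng.uniform(-1.0, 1.0, n))   # noise burst = initial excitation
    out = np.empty(int(fs * dur))
    for i in range(out.size):
        first = line.popleft()
        out[i] = first
        # Two-point average: a crude low-pass loop filter modelling losses.
        line.append(decay * 0.5 * (first + line[0]))
    return out

samples = pluck_string(220.0)   # one second of a decaying A3 pluck at 44.1 kHz
```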

  3. Data, instruments, and theory: a dialectical approach to understanding science

    CERN Document Server

    Ackermann, Robert John

    1985-01-01

    Robert John Ackermann deals decisively with the problem of relativism that has plagued post-empiricist philosophy of science. Recognizing that theory and data are mediated by data domains (bordered data sets produced by scientific instruments), he argues that the use of instruments breaks the dependency of observation on theory and thus creates a reasoned basis for scientific objectivity.

  4. Understanding practice change in community pharmacy: a qualitative research instrument based on organisational theory.

    Science.gov (United States)

    Roberts, Alison S; Hopp, Trine; Sørensen, Ellen Westh; Benrimoj, Shalom I; Chen, Timothy F; Herborg, Hanne; Williams, Kylie; Aslani, Parisa

    2003-10-01

    The past decade has seen a notable shift in the practice of pharmacy, with a strong focus on the provision of cognitive pharmaceutical services (CPS) by community pharmacists. The benefits of these services have been well documented, yet their uptake appears to be slow. Various strategies have been developed to overcome barriers to the implementation of CPS, with varying degrees of success, and little is known about the sustainability of the practice changes they produce. Furthermore, the strategies developed are often specific to individual programs or services, and their applicability to other CPS has not been explored. There seems to be a need for a flexible change management model for the implementation and dissemination of a range of CPS, but before it can be developed, a better understanding of the change process is required. This paper describes the development of a qualitative research instrument that may be utilised to investigate practice change in community pharmacy. Specific objectives included gaining knowledge about the circumstances surrounding attempts to implement CPS, and understanding relationships that are important to the change process. Organisational theory provided the conceptual framework for development of the qualitative research instrument, within which two theories were used to give insight into the change process: Borum's theory of organisational change, which categorizes change strategies as rational, natural, political or open; and Social Network Theory, which helps identify and explain the relationships between key people involved in the change process. A semi-structured interview guide was developed, incorporating factors affecting practice change found in the literature that warranted further investigation from the theoretical perspectives of organisational change and social networks. To address the research objectives, the instrument covered four broad themes: roles, experiences, strategies and networks. The qualitative research instrument developed in this study provides a

  5. Instrumental traditions and theories of light: the uses of instruments in the optical revolution

    CERN Document Server

    Chen, Xiang

    2000-01-01

    An analysis of the optical revolution in the context of early 19th century Britain. Far from merely involving the replacement of one optical theory by another, the revolution also involved substantial changes in instruments and the practices that surrounded them. People's judgements about classification, explanation and evaluation were affected by the way they used such optical instruments as spectroscopes, telescopes, polarisers, photometers, gratings, prisms and apertures. There were two instrumental traditions in this historical period, each of which nurtured a body of practice that exemplified how optical instruments should be operated, and especially how the eye should be used. These traditions functioned just like paradigms, shaping perspectives and even world views. Readership: Scholars and graduate students in the history of science, history of instruments, philosophy of science and science studies. Can also be used as a textbook in graduate courses on 19th century physics.

  6. [Health promotion. Instrument development for the application of the theory of planned behavior].

    Science.gov (United States)

    Lee, Y O

    1993-01-01

    The purpose of this article is to describe the operationalization of the Theory of Planned Behavior (TPB). The quest to understand determinants of health behaviors has intensified as evidence accumulates concerning the impact of personal behavior on health. The majority of theory-based research has used the Health Belief Model (HBM). The HBM components have had limited success in explaining health-related behaviors. There are several advantages of the TPB over the HBM. The TPB is an expansion of the Theory of Reasoned Action (TRA) with the addition of the construct of perceived behavioral control. The revised model has been shown to yield greater explanatory power than the original TRA for goal-directed behaviors. The process of TPB instrument development was described, using examples from the study of smoking cessation behavior in military smokers. It was followed by a discussion of reliability and validity issues in operationalizing the TPB. The TPB is a useful model for understanding and predicting health-related behaviors when carefully operationalized. The model holds promise in the development of prescriptive nursing approaches.

  7. A hierarchical instrumental decision theory of nicotine dependence.

    Science.gov (United States)

    Hogarth, Lee; Troisi, Joseph R

    2015-01-01

    It is important to characterize the learning processes governing tobacco-seeking in order to understand how best to treat this behavior. Most drug learning theories have adopted a Pavlovian framework wherein the conditioned response is the main motivational process. We favor instead a hierarchical instrumental decision account, wherein expectations about the instrumental contingency between voluntary tobacco-seeking and the receipt of nicotine reward determine the probability of executing this behavior. To support this view, we review titration and nicotine discrimination research showing that internal signals for deprivation/satiation modulate expectations about the current incentive value of smoking, thereby modulating the propensity of this behavior. We also review research on cue-reactivity which has shown that external smoking cues modulate expectations about the probability of the tobacco-seeking response being effective, thereby modulating the propensity of this behavior. Economic decision theory is then considered to elucidate how expectations about the value and probability of the response-nicotine contingency are integrated to form an overall utility estimate for that option, for comparison with qualitatively different, nonsubstitute reinforcers, to determine response selection. As an applied test of this hierarchical instrumental decision framework, we consider how well it accounts for individual liability to smoking uptake and perseveration, pharmacotherapy, cue-extinction therapies, and plain packaging. We conclude that the hierarchical instrumental account is successful in reconciling this broad range of phenomena precisely because it accepts that multiple diverse sources of internal and external information must be integrated to shape the decision to smoke.
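
    On one simple reading of this hierarchical account (our sketch, not the authors' formal model), the utility of each available response is its expected outcome value weighted by the expected probability that the response is effective, with selection made by a softmax over options. The numbers below are invented to mimic a deprived smoker amid smoking cues versus a satiated smoker without cues.

```python
import numpy as np

def choice_probs(values, probs, beta=2.0):
    """Softmax response selection over expected utilities (value x probability)."""
    utility = np.asarray(values) * np.asarray(probs)
    z = np.exp(beta * (utility - utility.max()))   # stabilized softmax
    return z / z.sum()

# Options: [smoke, nonsubstitute alternative]; all numbers illustrative.
print(choice_probs(values=[0.9, 0.4], probs=[0.8, 0.9]))  # deprived, cued
print(choice_probs(values=[0.2, 0.4], probs=[0.3, 0.9]))  # satiated, uncued
```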

  8. Developing a theory-based instrument to assess the impact of continuing professional development activities on clinical practice: a study protocol

    Directory of Open Access Journals (Sweden)

    Rousseau Michel

    2011-03-01

    Background: Continuing professional development (CPD) is one of the principal means by which health professionals (i.e. primary care physicians and specialists) maintain, improve, and broaden the knowledge and skills required for optimal patient care and safety. However, the lack of a widely accepted instrument to assess the impact of CPD activities on clinical practice thwarts researchers' comparisons of the effectiveness of CPD activities. Using an integrated model for the study of healthcare professionals' behaviour, our objective is to develop a theory-based, valid, reliable global instrument to assess the impact of accredited CPD activities on clinical practice. Methods: Phase 1: We will analyze the instruments identified in a systematic review of factors influencing health professionals' behaviours, using criteria that reflect the literature on measurement development and CPD decision makers' priorities. The outcome of this phase will be an inventory of instruments based on social cognitive theories. Phase 2: Working from this inventory, the most relevant instruments and their related items for assessing the concepts listed in the integrated model will be selected. Through an e-Delphi process, we will verify whether these instruments are acceptable, what aspects need revision, and whether important items are missing and should be added. The outcome of this phase will be a new global instrument integrating the most relevant tools to fit our integrated model of healthcare professionals' behaviour. Phase 3: Two data collections are planned: (1) a test-retest of the new instrument, including item analysis, to assess its reliability; and (2) a study using the instrument before and after CPD activities with a randomly selected control group to explore the instrument's mere-measurement effect. Phase 4: We will conduct individual interviews and focus groups with key stakeholders to identify anticipated barriers and enablers for implementing the

  9. Theory, modeling and instrumentation for materials by design: Proceedings of workshop

    Energy Technology Data Exchange (ETDEWEB)

    Allen, R.E.; Cocke, D.L.; Eberhardt, J.J.; Wilson, A. (eds.)

    1984-01-01

    The following topics are contained in this volume: how can materials theory benefit from supercomputers and vice versa; the materials of xerography; relationship between ab initio and semiempirical theories of electronic structure and renormalization group and the statistical mechanics of polymer systems; ab initio calculations of materials properties; metals in intimate contact; lateral interaction in adsorption: revelations from phase transitions; quantum model of thermal desorption and laser stimulated desorption; extended fine structure in appearance potential spectroscopy as a probe of solid surfaces; structural aspects of band offsets at heterojunction interfaces; multiconfigurational Green's function approach to quantum chemistry; wavefunctions and charge densities for defects in solids: a success for semiempirical theory; empirical methods for predicting the phase diagrams of intermetallic alloys; theoretical considerations regarding impurities in silicon and the chemisorption of simple molecules on Ni; improved Kohn-Sham exchange potential; structural stability calculations for films and crystals; semiempirical molecular orbital modeling of catalytic reactions including promoter effects; theoretical studies of chemical reactions: hydrolysis of formaldehyde; electronic structure calculations for low coverage adlayers; present status of the many-body problem; atomic scattering as a probe of physical adsorption; and a discussion of theoretical techniques in quantum chemistry and solid state physics.

  10. 2PI effective action for the SYK model and tensor field theories

    Science.gov (United States)

    Benedetti, Dario; Gurau, Razvan

    2018-05-01

    We discuss the two-particle irreducible (2PI) effective action for the SYK model and for tensor field theories. For the SYK model the 2PI effective action reproduces the bilocal reformulation of the model without using replicas. In general tensor field theories the 2PI formalism is the only way to obtain a bilocal reformulation of the theory, and as such is a precious instrument for the identification of soft modes and for possible holographic interpretations. We compute the 2PI action for several models, and push it up to fourth order in the 1/N expansion for the model proposed by Witten in [1], uncovering a one-loop structure in terms of an auxiliary bilocal action.
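
    For orientation, the bilocal (2PI) action of the Majorana SYK model with q-fermion interactions is commonly written as below; this is a standard convention from the SYK literature, and the paper's normalization and signs may differ.

```latex
\frac{S_{2\mathrm{PI}}[G,\Sigma]}{N}
  = -\frac{1}{2}\,\log\det\!\left(\partial_t - \Sigma\right)
  + \frac{1}{2}\int \mathrm{d}t\,\mathrm{d}t'
    \left[\,\Sigma(t,t')\,G(t,t') - \frac{J^{2}}{q}\,G(t,t')^{q}\right]
```

    Varying with respect to Sigma and G yields the familiar large-N Schwinger-Dyson equations G = (partial_t - Sigma)^{-1} and Sigma = J^2 G^{q-1}.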

  11. Netherlands Hydrological Modeling Instrument

    Science.gov (United States)

    Hoogewoud, J. C.; de Lange, W. J.; Veldhuizen, A.; Prinsen, G.

    2012-04-01

    The Netherlands Hydrological Modeling Instrument (NHI) is the center point of a framework of models used to coherently model the hydrological system and the multitude of functions it supports. A decision support system for water basin management. Dutch hydrological institutes Deltares, Alterra, the Netherlands Environmental Assessment Agency, RWS Waterdienst, STOWA and Vewin are cooperating in enhancing the NHI for adequate decision support. The instrument is used by three different ministries involved in national water policy matters, for instance the WFD, drought management, manure policy and climate change issues. The basis of the modeling instrument is a state-of-the-art on-line coupling of the groundwater system (MODFLOW), the unsaturated zone (metaSWAP) and the surface water system (MOZART-DM). It brings together hydro(geo)logical processes from the column to the basin scale, ranging from 250x250 m plots to the river Rhine, and includes salt water flow. The NHI is validated with an eight-year run (1998-2006) covering dry and wet periods. For this run, different parts of the hydrology have been compared with measurements, for instance water demands in dry periods (e.g. for irrigation), discharges at outlets, groundwater levels and evaporation. A validation alone is not enough to get support from stakeholders; involvement of stakeholders in the modeling process is needed. Therefore, to gain sufficient support and trust in the instrument on different (policy) levels, a couple of actions have been taken: 1. a transparent evaluation of modeling results has been set up; 2. an extensive program is running to cooperate with regional waterboards and suppliers of drinking water in improving the NHI; 3. (hydrological) data are shared via a newly set up modeling database for local and national models; 4. the NHI is enhanced with "local" information. The NHI is and has been used for many

  12. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  13. The social conditions of instrumental action: Problems in the sociological understanding of rational choice theory

    Directory of Open Access Journals (Sweden)

    Bruno Sciberras de Carvalho

    2008-01-01

    This article critically analyzes new sociological approaches to rational choice theory which - beyond examining political or economic practices - link the notion of instrumental rationality to social issues and themes. The article begins by highlighting the issue of trust, indicating the functionality of certain social arrangements in collective problem-solving. The paper goes on to demonstrate that problems emerge with the theory when it attempts to explain the feasibility of social norms in impersonal, comprehensive contexts. Thus, the fundamental point that appears to be missing from rational choice theory is the perception that individual decisions and instrumental conduct itself incorporate dispositions that in a sense are beyond the actors' control.

  14. Developing and testing a positive theory of instrument choice: Renewable energy policy in the fifty American states

    Science.gov (United States)

    Ciocirlan, Cristina E.

    The environmental economics literature consistently suggests that properly designed and implemented economic incentives are superior to command-and-control regulation in reducing pollution. Economic incentives, such as green taxes, cap-and-trade programs, and tax incentives, are able to reduce pollution in a cost-effective manner, provide flexibility to industry and stimulate innovation in cleaner technologies. In the past few decades, both federal and state governments have shown increased use of economic incentives in environmental policy. Some states have embraced them in an active manner, while others have failed to do so. This research uses a three-step analysis. First, it asks why some states employ more economic incentives than others to stimulate consumption of renewable energy by the residential, commercial and industrial sectors. Second, it asks why some states employ stronger incentives than others. And third, it asks why certain states employ certain instruments, such as electricity surcharges, cap-and-trade programs, tax incentives or grants, while others do not. The first two analyses were conducted using factor analysis and multiple regression analysis, while the third analysis employed logistic regression models to analyze the data. Data for all three analyses were obtained from a combination of primary and secondary sources. To address these questions, a theory of instrument choice at the state level, which includes both internal and external determinants of policy-making, was developed and tested. The state level of analysis was chosen because states have proven to be pioneers in designing policies to address greenhouse gases (see, for instance, the recent cap-and-trade legislation passed in California). The theory was operationalized with the help of four models: needs/responsiveness, interest group influence, professionalism/capacity and innovation-and-diffusion. The needs/responsiveness model suggests that states tend to choose more and stronger economic

  15. Practical aspects of trapped ion mass spectrometry, 4: theory and instrumentation

    CERN Document Server

    March, Raymond E

    2010-01-01

    The expansion of the use of ion trapping in different areas of mass spectrometry and different areas of application indicates the value of a single source of information drawing together diverse inputs. This book provides an account of the theory and instrumentation of mass spectrometric applications and an introduction to ion trapping devices.

  16. Modeling of Zakat in the capital structure theory

    African Journals Online (AJOL)

    Islamic financial instruments are subject to taxes and zakat for Muslim shareholders and debt holders. Therefore, it is important to investigate the implementation of corporate taxes and corporate zakat in capital structure compositions. In order to model corporate zakat in terms of conventional capital structure theories, this ...

  17. Realism, instrumentalism, and scientific symbiosis: psychological theory as a search for truth and the discovery of solutions.

    Science.gov (United States)

    Cacioppo, John T; Semin, Gün R; Berntson, Gary G

    2004-01-01

    Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions and solving problems in a given domain. These philosophical perspectives have different strengths and weaknesses and have been regarded as incommensurate: Scientific realism fosters theoretical rigor, verifiability, parsimony, and debate, whereas scientific instrumentalism fosters theoretical innovation, synthesis, generativeness, and scope. The authors review the evolution of scientific realism and instrumentalism in psychology and propose that the categorical distinction between the 2 is overstated as a prescription for scientific practice. The authors propose that the iterative deployment of these 2 perspectives, just as the iterative application of inductive and deductive reasoning in science, may promote more rigorous, integrative, cumulative, and useful scientific theories.

  18. Evaluation of multivariate calibration models transferred between spectroscopic instruments

    DEFF Research Database (Denmark)

    Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas

    2016-01-01

    In a setting where multiple spectroscopic instruments are used for the same measurements it may be convenient to develop the calibration model on a single instrument and then transfer this model to the other instruments. In the ideal scenario, all instruments provide the same predictions for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two
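
    A minimal sketch of the proposed evaluation as we read it: apply the same transferred model to the same samples on every instrument and quantify the pairwise disagreement between instruments, with no reference values involved. The data and the bias on the third instrument are simulated purely for illustration.

```python
import numpy as np

def transfer_consistency(preds):
    """preds: (n_instruments, n_samples) predictions of the same samples by the
    same transferred model on different instruments. Returns the matrix of
    RMS pairwise disagreements; reference values never enter."""
    preds = np.asarray(preds, dtype=float)
    diff = preds[:, None, :] - preds[None, :, :]
    return np.sqrt(np.mean(diff**2, axis=2))

rng = np.random.default_rng(1)
truth = rng.normal(12.0, 1.0, 75)                 # e.g. protein content of flour
preds = np.stack([truth + rng.normal(0, 0.1, 75),
                  truth + rng.normal(0, 0.1, 75),
                  truth + 0.5 + rng.normal(0, 0.1, 75)])   # poorly transferred
print(transfer_consistency(preds).round(2))       # third row/column stands out
```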

  19. The Standard, Power, and Color Model of Instrument Combination in Romantic-Era Symphonic Works

    Directory of Open Access Journals (Sweden)

    Randolph Johnson

    2011-08-01

    The Standard, Power, and Color (SPC) model describes the nexus between musical instrument combination patterns and expressive goals in music. Instruments within each SPC group tend to attract each other and work as a functional unit to create orchestral gestures. Standard instruments establish a timbral groundwork; Power instruments create contrast through loud dynamic climaxes; and Color instruments catch listeners' attention by means of their sparing use. Examples within these three groups include violin (Standard), piccolo (Power), and harp (Color). The SPC theory emerges from analyses of nineteenth-century symphonic works. Multidimensional scaling analysis of instrument combination frequencies maps instrument relationships; hierarchical clustering analysis indicates three SPC groups within the map. The SPC characterization is found to be moderately robust through the results of hypothesis testing: (1) Color instruments are included less often in symphonic works; (2) when Color instruments are included, they perform less often than the average instrument; and (3) Color and non-Color instruments have equal numbers of solo occurrences. Additionally, (4) Power instruments are positively associated with louder dynamic levels; and (5) when Power instruments are present in the musical texture, the pitch range spanned by the entire orchestra does not become more extreme.
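
    The two-step pipeline described, multidimensional scaling of combination frequencies followed by hierarchical clustering, can be sketched as follows. The instrument list and co-occurrence counts are invented, and sklearn/scipy stand in for whatever software the author used.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

names = ["violin", "flute", "horn", "piccolo", "harp"]
co = np.array([[0, 9, 8, 3, 1],        # toy co-occurrence counts; a real
               [9, 0, 7, 4, 1],        # study would tally instrument pairs
               [8, 7, 0, 3, 1],        # across passages of symphonic scores
               [3, 4, 3, 0, 1],
               [1, 1, 1, 1, 0]], dtype=float)

dist = co.max() - co                   # similarity -> dissimilarity
np.fill_diagonal(dist, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)    # 2-D instrument map

groups = fcluster(linkage(squareform(dist), method="average"),
                  t=3, criterion="maxclust")        # three SPC-like groups
print(dict(zip(names, groups)))
```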

  20. Development of a simple 12-item theory-based instrument to assess the impact of continuing professional development on clinical behavioral intentions.

    Directory of Open Access Journals (Sweden)

    France Légaré

    Decision-makers in organizations providing continuing professional development (CPD) have identified the need for routine assessment of its impact on practice. We sought to develop a theory-based instrument for evaluating the impact of CPD activities on health professionals' clinical behavioral intentions. Our multipronged study had four phases. (1) We systematically reviewed the literature for instruments that used socio-cognitive theories to assess healthcare professionals' clinically-oriented behavioral intentions and/or behaviors; we extracted items relating to the theoretical constructs of an integrated model of healthcare professionals' behaviors and removed duplicates. (2) A committee of researchers and CPD decision-makers selected a pool of items relevant to CPD. (3) An international group of experts (n = 70) reached consensus on the most relevant items using electronic Delphi surveys. (4) We created a preliminary instrument with the items found most relevant and assessed its factorial validity, internal consistency and reliability (weighted kappa) over a two-week period among 138 physicians attending a CPD activity. Out of 72 potentially relevant instruments, 47 were analyzed. Of the 1218 items extracted from these, 16% were discarded as improperly phrased and 70% discarded as duplicates. Mapping the remaining items onto the constructs of the integrated model of healthcare professionals' behaviors yielded a minimum of 18 and a maximum of 275 items per construct. The partnership committee retained 61 items covering all seven constructs. Two iterations of the Delphi process produced consensus on a provisional 40-item questionnaire. Exploratory factorial analysis following test-retest resulted in a 12-item questionnaire. Cronbach's coefficients for the constructs varied from 0.77 to 0.85. A 12-item theory-based instrument for assessing the impact of CPD activities on health professionals' clinical behavioral intentions showed adequate validity and
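
    Of the psychometric steps above, internal consistency is the easiest to make concrete. Below is a minimal Cronbach's alpha on simulated data shaped like the validation sample (138 respondents); the item-generation model is invented purely to exercise the formula.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items). alpha = k/(k-1) * (1 - sum of item
    variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

rng = np.random.default_rng(0)
trait = rng.normal(size=138)                           # one latent construct
items = trait[:, None] + rng.normal(0, 1.0, (138, 4))  # four noisy indicators
print(round(cronbach_alpha(items), 2))                 # ~0.8 by construction
```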

  1. Realism, Instrumentalism, and Scientific Symbiosis: Psychological Theory as a search for truth and the discovery of solutions

    NARCIS (Netherlands)

    Cacioppo, J.T.; Semin, G.R.; Berntson, G.G.

    2004-01-01

    Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions

  2. Towards a Transcultural Theory of Democracy for Instrumental Music Education

    Science.gov (United States)

    Tan, Leonard

    2014-01-01

    At present, instrumental music education, defined in this paper as the teaching and learning of music through wind bands and symphony orchestras of Western origin, appears embattled. Among the many criticisms made against instrumental music education, critics claim that bands and orchestras exemplify an authoritarian model of teaching that does…

  3. Application of a model of instrumental conditioning to mobile robot control

    Science.gov (United States)

    Saksida, Lisa M.; Touretzky, D. S.

    1997-09-01

    Instrumental conditioning is a psychological process whereby an animal learns to associate its actions with their consequences. This type of learning is exploited in animal training techniques such as 'shaping by successive approximations,' which enables trainers to gradually adjust the animal's behavior by giving strategically timed reinforcements. While this is similar in principle to reinforcement learning, the real phenomenon includes many subtle effects not considered in the machine learning literature. In addition, a good deal of domain information is utilized by an animal learning a new task; it does not start from scratch every time it learns a new behavior. For these reasons, it is not surprising that mobile robot learning algorithms have yet to approach the sophistication and robustness of animal learning. A serious attempt to model instrumental learning could prove fruitful for improving machine learning techniques. In the present paper, we develop a computational theory of shaping at a level appropriate for controlling mobile robots. The theory is based on a series of mechanisms for 'behavior editing,' in which pre-existing behaviors, either innate or previously learned, can be dramatically changed in magnitude, shifted in direction, or otherwise manipulated so as to produce new behavioral routines. We have implemented our theory on Amelia, an RWI B21 mobile robot equipped with a gripper and color video camera. We provide results from training Amelia on several tasks, all of which were constructed as variations of one innate behavior, object-pursuit.

  4. Out- and insourcing, an analysis model for use of instrumented techniques

    DEFF Research Database (Denmark)

    Bang, Henrik Peter; Grønbæk, Niels; Larsen, Claus Richard

    2017-01-01

    We sketch an outline of a model for analyzing the use of ICT-tools, in particular CAS, in teaching designs employed by 'generic' teachers. Our model uses the business economics concepts out- and insourcing as metaphors within the dialectics of tool and content in planning of teaching. Outsourcing is done in order to enhance outcome through external partners. The converse concept of insourcing refers to internal sourcing. We shall adhere to the framework of the anthropological theory of the didactic, viewing out- and insourcing primarily as decisions about the technology component of praxeologies. We use the model on a concrete example from Danish upper secondary mathematics to uncover what underlies teachers' decisions (deliberate or colloquial) on incorporating instrumented approaches.

  5. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale, TARS) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes

  6. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  7. From theory based policy evaluation to SMART Policy Design: Lessons learned from 20 ex-post evaluations of energy efficiency instruments

    International Nuclear Information System (INIS)

    Harmelink, Mirjam; Harmsen, Robert; Nilsson, Lars

    2007-01-01

    This article presents the results of an in-depth ex-post analysis of 20 energy efficiency policy instruments applied across different sectors and countries. Within the AID-EE project, we reconstructed and analysed the implementation process of energy efficiency policy instruments with the aim of identifying key factors behind successes and failures. The analysis was performed using a uniform methodology called 'theory-based policy evaluation', with which the whole implementation process is assessed in order to identify: (i) the main hurdles in each step of the implementation process, (ii) key success factors for different types of instruments and (iii) the key indicators that need to be monitored to enable a sound evaluation of the energy efficiency instruments. Our analysis shows that: energy efficiency policies often lack quantitative targets and clear timeframes; policy instruments often have multiple and/or unclear objectives; the need for monitoring information often does not have priority in the design phase; for most instruments, monitoring information is collected on a regular basis, but this information is often insufficient to determine the impact on energy saving, cost-effectiveness and target achievement of an instrument; and monitoring and verification of actual energy savings have a relatively low priority for most of the analyzed instruments. There is no such thing as the 'best' policy instrument; however, typical circumstances in which to apply different types of instruments, and generic characteristics that determine success or failure, can be identified. Based on the assessments and the experience from applying theory-based policy evaluation ex post, we suggest that it should already be used in the policy formulation and design phase of instruments. We conclude that making policy theory an integral and mandated part of the policy process would facilitate more efficient and effective energy efficiency instruments.

  8. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  9. Theory, Instrumentation and Applications of Magnetoelastic Resonance Sensors: A Review

    Science.gov (United States)

    Grimes, Craig A.; Roy, Somnath C.; Rani, Sanju; Cai, Qingyun

    2011-01-01

    Thick-film magnetoelastic sensors vibrate mechanically in response to a time varying magnetic excitation field. The mechanical vibrations of the magnetostrictive magnetoelastic material launch, in turn, a magnetic field by which the sensor can be monitored. Magnetic field telemetry enables contact-less, remote-query operation that has enabled many practical uses of the sensor platform. This paper builds upon a review paper we published in Sensors in 2002 (Grimes, C.A.; et al. Sensors 2002, 2, 294–313), presenting a comprehensive review on the theory, operating principles, instrumentation and key applications of magnetoelastic sensing technology. PMID:22163768

  10. Theory, Instrumentation and Applications of Magnetoelastic Resonance Sensors: A Review

    Directory of Open Access Journals (Sweden)

    Craig A. Grimes

    2011-03-01

    Thick-film magnetoelastic sensors vibrate mechanically in response to a time varying magnetic excitation field. The mechanical vibrations of the magnetostrictive magnetoelastic material launch, in turn, a magnetic field by which the sensor can be monitored. Magnetic field telemetry enables contact-less, remote-query operation that has enabled many practical uses of the sensor platform. This paper builds upon a review paper we published in Sensors in 2002 (Grimes, C.A.; et al. Sensors 2002, 2, 294-313), presenting a comprehensive review on the theory, operating principles, instrumentation and key applications of magnetoelastic sensing technology.

  11. An Educational Theory Model (SIGGS): An Integration of Set Theory, Information Theory, and Graph Theory with General Systems Theory.

    Science.gov (United States)

    Maccia, Elizabeth S.; and others

    An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…

  12. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    Science.gov (United States)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and must ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored

  13. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  14. A multi-species exchange model for fully fluctuating polymer field theory simulations.

    Science.gov (United States)

    Düchs, Dominik; Delaney, Kris T; Fredrickson, Glenn H

    2014-11-07

    Field-theoretic models have been used extensively to study the phase behavior of inhomogeneous polymer melts and solutions, both in self-consistent mean-field calculations and in numerical simulations of the full theory capturing composition fluctuations. The models commonly used can be grouped into two categories, namely, species models and exchange models. Species models involve integrations of functionals that explicitly depend on fields originating both from species density operators and their conjugate chemical potential fields. In contrast, exchange models retain only linear combinations of the chemical potential fields. In the two-component case, development of exchange models has been instrumental in enabling stable complex Langevin (CL) simulations of the full complex-valued theory. No comparable stable CL approach has yet been established for field theories of the species type. Here, we introduce an extension of the exchange model to an arbitrary number of components, namely, the multi-species exchange (MSE) model, which greatly expands the classes of soft material systems that can be accessed by the complex Langevin simulation technique. We demonstrate the stability and accuracy of the MSE-CL sampling approach using numerical simulations of triblock and tetrablock terpolymer melts, and tetrablock quaterpolymer melts. This method should enable studies of a wide range of fluctuation phenomena in multiblock/multi-species polymer blends and composites.
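
    The complex Langevin idea that motivates the MSE construction can be demonstrated on a one-variable toy with action S(phi) = sigma*phi^2/2 for complex sigma: the weight exp(-S) is complex, so ordinary importance sampling fails, yet the CL flow with real noise still reproduces <phi^2> = 1/sigma. The toy is ours for illustration and is not the polymer field theory of the paper.

```python
import numpy as np

# Complex Langevin for S(phi) = 0.5 * sigma * phi^2 with complex sigma.
# Complexify phi and evolve d(phi) = -S'(phi) dt + dW with REAL noise dW.
sigma = 1.0 + 1.0j
dt, n_steps, n_therm = 1e-3, 500_000, 10_000
rng = np.random.default_rng(0)

phi = 0.0 + 0.0j
acc, count = 0.0 + 0.0j, 0
for step in range(n_steps):
    phi += -sigma * phi * dt + np.sqrt(2 * dt) * rng.normal()  # drift + noise
    if step >= n_therm:
        acc += phi * phi
        count += 1

print("CL    <phi^2> =", acc / count)   # converges (noisily) to 1/sigma
print("exact <phi^2> =", 1 / sigma)     # (0.5 - 0.5j)
```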

  15. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    Directory of Open Access Journals (Sweden)

    Paul Branscum

    2016-06-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of children's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability, and two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis) and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables, and SSB.

  16. Expectancy Theory Modeling

    Science.gov (United States)

    1982-08-01

    accomplish the task, (2) the instrumentality of task performance for job outcomes, and (3) the instrumentality of outcomes for need satisfaction. We ... in this discussion: effort, performance, outcomes, and needs. In order to present briefly the conventional approach to the Vroom models, another ... Presumably, this is the final event in the sequence of effort, performance, outcome, and need satisfaction. The actual research reported in expectancy

  17. Confucian "Creatio in Situ"--Philosophical Resource for a Theory of Creativity in Instrumental Music Education

    Science.gov (United States)

    Tan, Leonard

    2016-01-01

    In this philosophical essay, I propose a theory of creativity for instrumental music education inspired by Confucian "creatio in situ" ("situational creativity"). Through an analysis of three major texts from classical Confucianism--the "Analects," the "Zhongyong" ("Doctrine of the Mean"), and the…

  18. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (Standard Mapping Method)

  19. Instrumentation of a prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Hessheimer, M.F.; Rightley, M.J.; Matsumoto, T.

    1995-01-01

    A series of static overpressurization tests of scale models of nuclear containment structures is being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the U.S. Nuclear Regulatory Commission. At present, two tests are being planned: a test of a model of a steel containment vessel (SCV) that is representative of an improved, boiling water reactor (BWR) Mark II design; and a test of a model of a prestressed concrete containment vessel (PCCV). This paper discusses plans and the results of a preliminary investigation of the instrumentation of the PCCV model. The instrumentation suite for this model will consist of approximately 2000 channels of data to record displacements, strains in the reinforcing steel, prestressing tendons, concrete, steel liner and liner anchors, as well as pressure and temperature. The instrumentation is being designed to monitor the response of the model during prestressing operations, during Structural Integrity and Integrated Leak Rate testing, and during test to failure of the model. Particular emphasis has been placed on instrumentation of the prestressing system in order to understand the behavior of the prestressing strands at design and beyond design pressure levels. Current plans are to place load cells at both ends of one third of the tendons in addition to placing strain measurement devices along the length of selected tendons. Strain measurements will be made using conventional bonded foil resistance gages and a wire resistance gage, known as a "Tensmeg"® gage, specifically designed for use with seven-wire strand. The results of preliminary tests of both types of gages, in the laboratory and in a simulated model configuration, are reported and plans for instrumentation of the model are discussed

  20. Creating and purifying an observation instrument using the generalizability theory

    Directory of Open Access Journals (Sweden)

    Elena Rodríguez-Naveiras

    2013-12-01

    The control of data quality is one of the most relevant aspects of observational research. Generalizability Theory (GT) provides a method of analysis that allows us to isolate the various sources of measurement error. At the same time, it helps us to determine the extent to which various factors can change, and to analyze their effect on the generalizability coefficient. In the work shown here, two studies were aimed at creating and purifying an observation instrument, the Observation Protocol in the Teaching Functions (Protocolo de Funciones Docentes, PROFUNDO, v1 and v2), for behavioral assessment carried out by instructors in a social-affective out-of-school program. The reliability and homogeneity studies are carried out once the instrument has been created and purified. The reliability study will be done through the GT method, taking both codes (c) and agents (a) as differentiation facets, and generalizing through observers using a crossed multi-faceted design (A × O × C). In the homogeneity study the generalization facet will be codes, using the same design as the reliability study.
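
    To make the GT machinery concrete, here is a one-facet crossed persons x observers G study in plain numpy: variance components are estimated from mean squares and combined into the relative generalizability coefficient. The paper's design is richer (codes and agents as facets, A × O × C); this stripped-down sketch with simulated ratings only illustrates the logic.

```python
import numpy as np

def g_coefficient(X):
    """One-facet crossed G study. X: (n_persons, n_observers) ratings.
    Returns (person variance, observer variance, residual, relative G coeff.)."""
    n_p, n_o = X.shape
    grand = X.mean()
    row, col = X.mean(axis=1), X.mean(axis=0)
    ms_p = n_o * np.sum((row - grand) ** 2) / (n_p - 1)
    ms_o = n_p * np.sum((col - grand) ** 2) / (n_o - 1)
    resid = X - row[:, None] - col[None, :] + grand
    ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_o - 1))
    var_p = max((ms_p - ms_res) / n_o, 0.0)   # universe-score variance
    var_o = max((ms_o - ms_res) / n_p, 0.0)   # observer facet variance
    e_rho2 = var_p / (var_p + ms_res / n_o)   # G coeff. for mean of n_o raters
    return var_p, var_o, ms_res, e_rho2

rng = np.random.default_rng(0)
true_scores = rng.normal(0, 1, 30)
ratings = true_scores[:, None] + rng.normal(0, 0.5, (30, 3))  # 3 observers
print(g_coefficient(ratings))   # e_rho2 close to 1/(1 + 0.25/3) ~ 0.92
```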

  1. Sound production in recorder-like instruments: II. A simulation model

    NARCIS (Netherlands)

    Verge, M.P.; Hirschberg, A.; Causse, R.

    1997-01-01

    A simple one-dimensional representation of recorderlike instruments, that can be used for sound synthesis by physical modeling of flutelike instruments, is presented. This model combines the effects on the sound production by the instrument of the jet oscillations, vortex shedding at the edge of the

  2. High School Instrumental Music Students' Attitudes and Beliefs regarding Practice: An Application of Attribution Theory

    Science.gov (United States)

    Schatt, Matthew D.

    2011-01-01

    The purpose of this study was to explore high school band students' perspectives of instrumental music practice from within the attribution theory paradigm and to attempt to elucidate the secondary student's attitudes toward practice. High school band students from three Midwestern school districts (N = 218) completed a survey that was used to…

  3. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable for instruments flying on Explorer-like class missions; 2) the new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.
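
    Parametric estimation of the kind NICM performs rests on cost estimating relationships (CERs): power laws fit to historical instruments. The sketch below fits a generic cost = a * mass^b * power^c by least squares in log space. The drivers, numbers, and functional form are illustrative assumptions, not NICM's actual CERs, which this record does not give.

```python
import numpy as np

# Invented historical instruments: (mass kg, power W, cost $M).
mass  = np.array([ 20.,  45.,  80., 120., 200.])
power = np.array([ 15.,  40.,  60., 100., 150.])
cost  = np.array([ 12.,  30.,  48.,  75., 130.])

# Log-linear OLS: log(cost) = log(a) + b*log(mass) + c*log(power).
X = np.column_stack([np.ones_like(mass), np.log(mass), np.log(power)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"CER: cost ~ {a:.2f} * mass^{b:.2f} * power^{c:.2f}")
print("estimate for a 60 kg, 50 W instrument:",
      round(a * 60**b * 50**c, 1), "$M")
```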

  4. Pragmatist model of decision-making in law: from instrumental mentalism to communicative intersubjectivity

    Directory of Open Access Journals (Sweden)

    Mário Cesar da Silva Andrade

    2015-12-01

    This paper aimed to evaluate the method of rational decision-making derived from the philosophy of Kant as a founding paradigm of public decisions and, more specifically, of legal decisions. Based on the communicative action theory of Jürgen Habermas, the question is whether the transcendental model of decision-making meets democratic demands. Methodologically, the qualitative research was based on doctrinal sources about the theme, promoting a legal and critical analysis. Habermas' communicative bias raises the hypothesis that Kant's transcendental method, which so influenced the theory of justice and law, entails the adoption of an objective posture by the decision maker, something incompatible with the need for broad participation and the intersubjectivity prescribed by democracy. It was concluded that the public decision-making process must overcome the transcendental, decisionistic and instrumental models, adopting a pragmatic model, which is more intersubjective and communicative, and therefore more consistent with the participatory bias of democracy.

  5. Item response theory and structural equation modelling for ordinal data: Describing the relationship between KIDSCREEN and Life-H.

    Science.gov (United States)

    Titman, Andrew C; Lancaster, Gillian A; Colver, Allan F

    2016-10-01

Both item response theory and structural equation models are useful in the analysis of ordered categorical responses from health assessment questionnaires. We highlight the advantages and disadvantages of the item response theory and structural equation modelling approaches to modelling ordinal data, from within a community health setting. Using data from the SPARCLE project focussing on children with cerebral palsy, this paper investigates the relationship between two ordinal rating scales, the KIDSCREEN, which measures quality-of-life, and Life-H, which measures participation. Practical issues relating to fitting models, such as non-positive definite observed or fitted correlation matrices, and approaches to assessing model fit are discussed. Item response theory models allow properties such as the conditional independence of particular domains of a measurement instrument to be assessed. When, as with the SPARCLE data, the latent traits are multidimensional, structural equation models generally provide a much more convenient modelling framework.

  6. The Modeling and Complexity of Dynamical Systems by Means of Computation and Information Theories

    Directory of Open Access Journals (Sweden)

    Robert Logozar

    2011-12-01

We present the modeling of dynamical systems and the determination of their complexity indicators by the use of concepts from computation and information theories, within the framework of J. P. Crutchfield's theory of ε-machines. A short formal outline of the ε-machines is given. In this approach, dynamical systems are analyzed directly from the time series that is received from a properly adjusted measuring instrument. The binary strings are parsed through the parse tree, within which morphologically and probabilistically unique subtrees or morphs are recognized as system states. The outline and precise interrelation of the information-theoretic entropies and complexities emanating from the model are given. The paper also serves as a theoretical foundation for the future presentation of the DSA program that implements the ε-machines modeling up to the stochastic finite automata level.
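
To make the parse-tree idea concrete, here is a heavily simplified sketch: it uses a fixed history length instead of Crutchfield's full causal-state splitting, and merges histories whose empirical next-symbol distributions are close. Everything here (the tolerance, the toy process) is an illustrative assumption.

```python
import numpy as np
from collections import defaultdict

def candidate_states(series: str, k: int = 3, tol: float = 0.05):
    """Group length-k histories of a binary string by their empirical
    next-symbol distribution; each group approximates a causal state."""
    counts = defaultdict(lambda: [0, 0])
    for i in range(len(series) - k):
        counts[series[i:i + k]][int(series[i + k])] += 1
    states = []  # each entry: [P(next = '1'), list of histories]
    for hist, (n0, n1) in sorted(counts.items()):
        p1 = n1 / (n0 + n1)
        for state in states:
            if abs(state[0] - p1) < tol:
                state[1].append(hist)
                break
        else:
            states.append([p1, [hist]])
    return states

# Toy "even process"-like data: a 0 is always followed by a 1.
rng = np.random.default_rng(2)
bits, prev = [], 1
for _ in range(5000):
    prev = 1 if prev == 0 else int(rng.random() < 0.5)
    bits.append(prev)

for p1, hists in candidate_states("".join(map(str, bits))):
    print(f"P(next=1) ~ {p1:.2f} for histories {hists}")
```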

  7. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  8. Developing a workplace resilience instrument.

    Science.gov (United States)

    Mallak, Larry A; Yildiz, Mustafa

    2016-05-27

    Resilience benefits from the use of protective factors, as opposed to risk factors, which are associated with vulnerability. Considerable research and instrument development has been conducted in clinical settings for patients. The need existed for an instrument to be developed in a workplace setting to measure resilience of employees. This study developed and tested a resilience instrument for employees in the workplace. The research instrument was distributed to executives and nurses working in the United States in hospital settings. Five-hundred-forty completed and usable responses were obtained. The instrument contained an inventory of workplace resilience, a job stress questionnaire, and relevant demographics. The resilience items were written based on previous work by the lead author and inspired by Weick's [1] sense-making theory. A four-factor model yielded an instrument having psychometric properties showing good model fit. Twenty items were retained for the resulting Workplace Resilience Instrument (WRI). Parallel analysis was conducted with successive iterations of exploratory and confirmatory factor analyses. Respondents were classified based on their employment with either a rural or an urban hospital. Executives had significantly higher WRI scores than nurses, controlling for gender. WRI scores were positively and significantly correlated with years of experience and the Brief Job Stress Questionnaire. An instrument to measure individual resilience in the workplace (WRI) was developed. The WRI's four factors identify dimensions of workplace resilience for use in subsequent investigations: Active Problem-Solving, Team Efficacy, Confident Sense-Making, and Bricolage.

  9. Validation of an Instrument for Assessing Conceptual Change with Respect to the Theory of Evolution by Secondary Biology Students

    Science.gov (United States)

    Goff, Kevin David

    This pilot study evaluated the validity of a new quantitative, closed-response instrument for assessing student conceptual change regarding the theory of evolution. The instrument has two distinguishing design features. First, it is designed not only to gauge student mastery of the scientific model of evolution, but also to elicit a trio of deeply intuitive tendencies that are known to compromise many students' understanding: the projection of intentional agency, teleological directionality, and immutable essences onto biological phenomena. Second, in addition to a section of conventional multiple choice questions, the instrument contains a series of items where students may simultaneously endorse both scientifically normative propositions and intuitively appealing yet unscientific propositions, without having to choose between them. These features allow for the hypothesized possibility that the three intuitions are partly innate, themselves products of cognitive evolution in our hominin ancestors, and thus may continue to inform students' thinking even after instruction and conceptual change. The test was piloted with 340 high school students from diverse schools and communities. Confirmatory factor analysis and other statistical methods provided evidence that the instrument already has strong potential for validly distinguishing students who hold a correct scientific understanding from those who do not, but that revision and retesting are needed to render it valid for gauging students' adherence to intuitive misconceptions. Ultimately the instrument holds promise as a tool for classroom intervention studies by conceptual change researchers, for diagnostic testing and data gathering by instructional leaders, and for provoking classroom dialogue and debate by science teachers.

  10. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

Dudas, E. [Orsay, LPT (France)]

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  11. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  12. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Saleur, H.

    1988-01-01

Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated with a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied

  13. Flute-like musical instruments: A toy model investigated through numerical continuation

    Science.gov (United States)

    Terrien, Soizic; Vergez, Christophe; Fabre, Benoît

    2013-07-01

Self-sustained musical instruments (bowed string, woodwind and brass instruments) can be modelled by nonlinear lumped dynamical systems. Among these instruments, flutes and flue organ pipes have the particularity of being modelled as delay dynamical systems. In this paper, such a system, a toy model of flute-like instruments, is studied using numerical continuation. Equilibrium and periodic solutions are explored with respect to the blowing pressure, with focus on amplitude and frequency evolutions along the different solution branches, as well as "jumps" between periodic solution branches. The influence of a second model parameter (namely the inharmonicity) on the behaviour of the system is addressed. It is shown that harmonicity plays a key role in the presence of hysteresis or quasiperiodic regimes. Throughout the paper, experimental results on a real instrument are presented to illustrate various phenomena and allow some qualitative comparisons with numerical results.

  14. Efficacy of an extended theory of planned behaviour model for predicting caterers' hand hygiene practices.

    Science.gov (United States)

    Clayton, Deborah A; Griffith, Christopher J

    2008-04-01

The main aim of this study was to determine the factors which influence caterers' hand hygiene practices using social cognitive theory. One hundred and fifteen food handlers from 29 catering businesses were observed carrying out 31,050 food preparation actions in their workplace. Caterers subsequently completed the Hand Hygiene Instrument (HHI), which ascertained attitudes towards hand hygiene using constructs from the Theory of Planned Behaviour (TPB) and the Health Belief Model. The TPB provided a useful framework for understanding caterers' implementation of hand hygiene practices, explaining 34% of the variance in hand hygiene malpractices (p […]) … behavioural control and intention (p […]) … food safety culture.

  15. Gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Witten, E.

    1989-01-01

    Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)

  16. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures…

  17. Frequency-Zooming ARMA Modeling for Analysis of Noisy String Instrument Tones

    Directory of Open Access Journals (Sweden)

    Paulo A. A. Esquef

    2003-09-01

This paper addresses model-based analysis of string instrument sounds. In particular, it reviews the application of autoregressive (AR) modeling to sound analysis/synthesis purposes. Moreover, a frequency-zooming autoregressive moving average (FZ-ARMA) modeling scheme is described. The performance of the FZ-ARMA method on modeling the modal behavior of isolated groups of resonance frequencies is evaluated for both synthetic and real string instrument tones immersed in background noise. We demonstrate that the FZ-ARMA modeling is a robust tool to estimate the decay time and frequency of partials of noisy tones. Finally, we discuss the use of the method in the synthesis of string instrument sounds.
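
A much-simplified stand-in for the kind of estimate the FZ-ARMA method delivers: for one synthetic noisy partial, frequency is read off the FFT peak and the decay time from a log-linear fit to framed RMS envelopes. The signal parameters are invented, and this is not the paper's FZ-ARMA algorithm itself.

```python
import numpy as np

fs = 44_100
t = np.arange(int(0.8 * fs)) / fs
f0, tau = 440.0, 0.25          # hypothetical partial: 440 Hz, 0.25 s decay time
rng = np.random.default_rng(3)
x = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t) + 0.01 * rng.standard_normal(t.size)

# Frequency estimate: location of the magnitude-spectrum peak.
spec = np.abs(np.fft.rfft(x))
f_est = np.fft.rfftfreq(x.size, 1 / fs)[np.argmax(spec)]

# Decay estimate: straight-line fit to the log RMS envelope of short frames.
frame = 1024
n_frames = x.size // frame
rms = np.sqrt(np.mean(x[:n_frames * frame].reshape(n_frames, frame) ** 2, axis=1))
centers = (np.arange(n_frames) + 0.5) * frame / fs
slope, _ = np.polyfit(centers, np.log(rms), 1)
print(f"frequency: {f_est:.1f} Hz, decay time: {-1.0 / slope:.3f} s")
```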

  18. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  19. Instrument Modeling and Synthesis

    Science.gov (United States)

    Horner, Andrew B.; Beauchamp, James W.

    During the 1970s and 1980s, before synthesizers based on direct sampling of musical sounds became popular, replicating musical instruments using frequency modulation (FM) or wavetable synthesis was one of the “holy grails” of music synthesis. Synthesizers such as the Yamaha DX7 allowed users great flexibility in mixing and matching sounds, but were notoriously difficult to coerce into producing sounds like those of a given instrument. Instrument design wizards practiced the mysteries of FM instrument design.
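
For a concrete feel of the two-operator FM the DX7 popularized, here is a minimal sketch; the ratio, modulation index and envelope are invented values, not a recipe from the chapter.

```python
import numpy as np

def fm_tone(f_carrier, ratio, index, dur=1.0, fs=44_100):
    """Simple two-operator FM: a carrier phase-modulated by one modulator.

    ratio: modulator/carrier frequency ratio (harmonic if a small integer);
    index: modulation index, which controls sideband richness.
    """
    t = np.arange(int(dur * fs)) / fs
    envelope = np.exp(-3.0 * t)                 # invented decay envelope
    modulator = np.sin(2 * np.pi * ratio * f_carrier * t)
    return envelope * np.sin(2 * np.pi * f_carrier * t + index * envelope * modulator)

# A rough bell-like setting: inharmonic ratio and a moderately high index.
tone = fm_tone(220.0, ratio=1.4, index=5.0)
print(tone.shape, float(tone.max()))
```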

  20. Model SH intelligent instrument for thickness measuring

    International Nuclear Information System (INIS)

    Liu Juntao; Jia Weizhuang; Zhao Yunlong

    1995-01-01

    The authors introduce Model SH Intelligent Instrument for thickness measuring by using principle of beta back-scattering and its application range, features, principle of operation, system design, calibration and specifications

  1. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

Our paper is centered around the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the basics of institutional economics. We propose scientific ideas concerning new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research to formalise our results and maximise the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  2. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory.

    Science.gov (United States)

    Komashie, Alexander; Mousavi, Ali; Clarkson, P John; Young, Terry

    2015-01-01

This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting times. Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed using queuing theory. This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely experience and efficiency in care delivery, and leading to a more holistic approach to designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation.
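
As a hedged sketch of the queueing backbone such a model rests on, the snippet below computes the M/M/1 mean queue wait and maps it through an invented exponential satisfaction curve; the mapping and its constants are illustrative assumptions, not the authors' calibrated model.

```python
import numpy as np

def mm1_wait(arrival_rate, service_rate):
    """Mean time in queue W_q for an M/M/1 system (requires rho < 1)."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("unstable queue: arrival rate >= service rate")
    return rho / (service_rate - arrival_rate)

def satisfaction(wait_minutes, tolerance=30.0):
    """Invented mapping: satisfaction decays exponentially with waiting time."""
    return float(np.exp(-wait_minutes / tolerance))

lam, mu = 4.0, 5.0                      # arrivals and services per hour, invented
wq_minutes = mm1_wait(lam, mu) * 60
print(f"mean queue wait: {wq_minutes:.0f} min, "
      f"patient satisfaction ~ {satisfaction(wq_minutes):.2f}")
```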

  3. Warped models in string theory

    International Nuclear Information System (INIS)

    Acharya, B.S.; Benini, F.; Valandro, R.

    2006-12-01

    Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)

  4. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  5. On low rank classical groups in string theory, gauge theory and matrix models

    International Nuclear Information System (INIS)

    Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun

    2004-01-01

    We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature

  6. Theories of Motivation--Borrowing the Best.

    Science.gov (United States)

    Terpstra, David E.

    1979-01-01

    Five theories of motivation are discussed: Maslow's Need Hierarchy, Herzberg's dual-factor or motivation-hygiene theory, goal setting or task motivation, expectancy/valence-theory (also known as instrumentality theory, valence-instrumentality-expectancy theory, or expectancy theory), and reinforcement. (JH)

  7. A Theory-Based Model for Understanding Faculty Intention to Use Students Ratings to Improve Teaching in a Health Sciences Institution in Puerto Rico

    Science.gov (United States)

    Collazo, Andrés A.

    2018-01-01

    A model derived from the theory of planned behavior was empirically assessed for understanding faculty intention to use student ratings for teaching improvement. A sample of 175 professors participated in the study. The model was statistically significant and had a very large explanatory power. Instrumental attitude, affective attitude, perceived…

  8. NASA Instrument Cost/Schedule Model

    Science.gov (United States)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model. NICM is a cost and schedule estimator that contains: a system level cost estimation tool; a subsystem level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
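
A toy version of the regression-plus-bootstrap-cross-validation step listed above, run on invented instrument data: fit a log-log mass-to-cost estimating relationship, then score each bootstrap refit on its out-of-bag points. The data and functional form are illustrative, not NICM's.

```python
import numpy as np

rng = np.random.default_rng(4)
mass = rng.uniform(5, 200, size=40)                               # kg, invented
cost = 2.0 * mass ** 0.7 * np.exp(0.2 * rng.standard_normal(40))  # $M, invented

x, y = np.log(mass), np.log(cost)
b, a = np.polyfit(x, y, 1)          # log-linear CER: log(cost) = a + b*log(mass)
print(f"CER: cost ~ {np.exp(a):.2f} * mass^{b:.2f}")

# Bootstrap cross-validation: refit on each resample, score on out-of-bag points.
errors = []
for _ in range(500):
    idx = rng.integers(0, x.size, x.size)
    oob = np.setdiff1d(np.arange(x.size), idx)
    if oob.size == 0:
        continue
    bb, ab = np.polyfit(x[idx], y[idx], 1)
    errors.append(np.mean((y[oob] - (ab + bb * x[oob])) ** 2))
print(f"bootstrap out-of-bag RMSE (log space): {np.sqrt(np.mean(errors)):.2f}")
```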

  9. Principle And Practice Of Instrumental Analysis

    International Nuclear Information System (INIS)

    Kim, Chong Il; Kim, Chang Gyo; Oh, Yong Taek

    2010-03-01

This book gives descriptions of instrumental analysis, covering quantum physics and chemical foundations (Bohr's atom model, the electronic structure of the atom and the periodic table), analytical theory on analytical chemistry and chemometrics, measurement uncertainty, probability and statistics, the state and structure of crystalline materials, and spectroscopy, such as infrared and near-infrared spectroscopy, electro spectroscopy, mass spectroscopy and atomic spectroscopy.

  10. Telaah Kritis Expectancy Theory Victor Harold Vroom

    OpenAIRE

    Anatan, Lina

    2010-01-01

Expectancy theory has emerged as the dominant process theory of motivation. Originally developed by Vroom, it is a theory explaining the process individuals use to make decisions among various behavioral alternatives. The motivational force for a behavior, action, or task is a function of three distinct perceptions: expectancy, instrumentality, and valence. The theory contains two models: one for the prediction of the valence of an outcome, and the other for the prediction of force toward behavior. It p…
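
The three-perception structure lends itself to a direct transcription; the sketch below uses the common textbook form F = E × Σ(I_j × V_j), with invented numbers.

```python
def motivational_force(expectancy, outcomes):
    """Vroom's expectancy model in its common textbook form:
    F = E * sum(I_j * V_j) over outcomes j.

    expectancy: perceived P(effort -> performance), 0..1
    outcomes: list of (instrumentality 0..1, valence -1..1) pairs
    """
    return expectancy * sum(i * v for i, v in outcomes)

# Invented comparison of two behavioral alternatives.
work_overtime = motivational_force(0.8, [(0.9, 0.7), (0.4, -0.3)])  # pay vs fatigue
leave_on_time = motivational_force(1.0, [(0.6, 0.5)])               # leisure
print(f"overtime: {work_overtime:.2f}, on time: {leave_on_time:.2f}")
```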

  11. Derivative instruments a guide to theory and practice

    CERN Document Server

    Eales, Brian

    2003-01-01

    The authors concentrate on the practicalities of each class of derivative, so that readers can apply the techniques in practice. Product descriptions are supported by detailed spreadsheet models, illustrating the techniques employed, some which are available on the accompanying companion website. This book is ideal reading for derivatives traders, salespersons, financial engineers, risk managers, and other professionals involved to any extent in the application and analysis of OTC derivatives.* Combines theory with valuation to provide overall coverage of the topic area* Pr

  12. Modeling students' instrumental (mis-) use of substances to enhance cognitive performance: Neuroenhancement in the light of job demands-resources theory.

    Science.gov (United States)

    Wolff, Wanja; Brand, Ralf; Baumgarten, Franz; Lösel, Johanna; Ziegler, Matthias

    2014-01-01

Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called Neuroenhancement (NE), has not yet been integrated into a behavioral theory that is able to predict performance. Job Demands Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students' self-reported NE can be integrated into JD-R Theory's comprehensive approach to psychological health and performance. 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance) were measured. Path models were used in order to test our data's fit to hypothesized main effects and interactions. JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory's health impairment process: lifestyle drug NE (p […]) … performance. From a public health perspective, intervention strategies should address these costs of non-supervised NE. With regard to future research we propose to model NE as a means to reach an end (i.e. performance enhancement) rather than a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon.

  13. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute the first introduction to the subject and can be covered in one-semester course to senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  14. Kolb's Experiential Learning Model: Critique from a Modeling Perspective

    Science.gov (United States)

    Bergsteiner, Harald; Avery, Gayle C.; Neumann, Ruth

    2010-01-01

    Kolb's experiential learning theory has been widely influential in adult learning. The theory and associated instruments continue to be criticized, but rarely is the graphical model itself examined. This is significant because models can aid scientific understanding and progress, as well as theory development and research. Applying accepted…

  15. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  16. Evaluating Higher Education Institutions through Agency and Resources-Capabilities Theories. A Model for Measuring the Perceived Quality of Service

    Directory of Open Access Journals (Sweden)

    José Guadalupe Vargas-hernández

    2016-08-01

The objective of this paper is to explain, through agency theory and the theory of resources and capabilities, the process of assessment in higher education institutions. The actors involved in decision-making, and the use given to the resources derived from it, repeatedly lead to opportunistic practices that diminish the value given to the evaluation, in addition to decreasing teamwork. A model is presented to measure the perception of service quality by students of the Technological Institute of Celaya, as part of the system of quality control. Based on the theoretical support of several authors who have developed this topic (SERVQUAL and SERVPERF), an instrument adapted to the student area of the institution, called SERQUALITC, is generated. The paper presents the areas or departments to be assessed and the sample size, the number of items used per dimension and the Likert scale; the validation study of the instrument is mentioned. Finally, a model is presented that poses a global vision of the quality measurement process, including corrective actions that enable continuous improvement of services.
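
A small sketch of the SERVQUAL-style gap computation such an instrument implies (perception minus expectation, averaged per dimension); the dimensions and ratings are invented, not SERQUALITC's actual items.

```python
import numpy as np

dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
rng = np.random.default_rng(5)
# Invented 1-7 Likert ratings: rows = students, columns = dimensions.
expectations = rng.integers(5, 8, size=(100, 5)).astype(float)
perceptions = rng.integers(3, 8, size=(100, 5)).astype(float)

gaps = (perceptions - expectations).mean(axis=0)  # negative gap = service shortfall
for name, gap in zip(dimensions, gaps):
    print(f"{name:15s} gap = {gap:+.2f}")
```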

  18. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    Science.gov (United States)

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet specific instruments are seldom available and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, pre-testing and quantitative statistical procedures. 146 inpatients with CHD were used to provide the data, measuring QOL three times before and after treatments. The psychometric properties of the scale were evaluated with respect to validity, reliability and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests and also G studies and D studies of Generalizability Theory analysis. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80 respectively. The overall scale and all domains except for the social domain had statistically significant changes after treatments, with moderate effect size SRM (standardized response mean) ranging from 0.32 to 0.67. G-coefficients and the index of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity, reliability, and moderate responsiveness and some highlights, and can be used as the quality of life instrument for patients with CHD. However, in order to obtain better reliability, the number of items for the social domain should be increased or the items' quality, not quantity, should be
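
For reference, the two classical-test-theory coefficients reported here can be computed as below (Cronbach's α and a test-retest correlation on total scores); the data are simulated stand-ins, not the QLICD-CHD sample.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_persons, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(6)
true_score = rng.normal(3, 1, size=(200, 1))
test = true_score + rng.normal(0, 0.6, size=(200, 8))   # 8 items, time 1
retest = true_score + rng.normal(0, 0.6, size=(200, 8)) # same persons, time 2

print(f"alpha = {cronbach_alpha(test):.2f}")
r = np.corrcoef(test.sum(axis=1), retest.sum(axis=1))[0, 1]
print(f"test-retest r = {r:.2f}")
```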

  19. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  20. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  1. Quantum Steering Beyond Instrumental Causal Networks

    Science.gov (United States)

    Nery, R. V.; Taddei, M. M.; Chaves, R.; Aolita, L.

    2018-04-01

    We theoretically predict, and experimentally verify with entangled photons, that outcome communication is not enough for hidden-state models to reproduce quantum steering. Hidden-state models with outcome communication correspond, in turn, to the well-known instrumental processes of causal inference but in the one-sided device-independent scenario of one black-box measurement device and one well-characterized quantum apparatus. We introduce one-sided device-independent instrumental inequalities to test against these models, with the appealing feature of detecting entanglement even when communication of the black box's measurement outcome is allowed. We find that, remarkably, these inequalities can also be violated solely with steering, i.e., without outcome communication. In fact, an efficiently computable formal quantifier—the robustness of noninstrumentality—naturally arises, and we prove that steering alone is enough to maximize it. Our findings imply that quantum theory admits a stronger form of steering than known until now, with fundamental as well as practical potential implications.

  2. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  3. Neutron Star Models in Alternative Theories of Gravity

    Science.gov (United States)

    Manolidis, Dimitrios

We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.

  4. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi…

  5. Instrument evaluation no. 11. ESI nuclear model 271 C contamination monitor

    International Nuclear Information System (INIS)

    Burgess, P.H.; Iles, W.J.

    1978-06-01

The various radiations encountered in radiological protection cover a wide range of energies, and radiation measurements have to be carried out under an equally broad spectrum of environmental conditions. This report is one of a series intended to give information on the performance characteristics of radiological protection instruments, to assist in the selection of appropriate instruments for a given purpose, to interpret the results obtained with such instruments, and, in particular, to know the likely sources and magnitude of errors that might be associated with measurements in the field. The radiation, electrical and environmental characteristics of radiation protection instruments are considered together with those aspects of the construction which make an instrument convenient for routine use. To provide consistent criteria for instrument performance, the range of tests performed on any particular class of instrument, the test methods and the criteria of acceptable performance are based broadly on the appropriate Recommendations of the International Electrotechnical Commission. The radiations used in the tests are, in general, selected from the range of reference radiations for instrument calibration being drawn up by the International Standards Organisation. Normally, each report deals with the capabilities and limitations of one model of instrument, and no direct comparison with other instruments intended for similar purposes is made, since the significance of particular performance characteristics largely depends on the radiations and environmental conditions in which the instrument is to be used. The results quoted here have all been obtained from tests on instruments in routine production, with the appropriate measurements being made by the NRPB. This report deals with the ESI Nuclear Model 271 C, a general purpose contamination monitor comprising a GM tube connected by a coiled extensible cable to a ratemeter

  6. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry (the process, types and forms of analysis), electrochemistry (basic theory, potentiometry and conductometry), electromagnetic radiation and optical components (introduction and application), ultraviolet and visible spectrophotometry, atomic absorption spectrophotometry (introduction, flame emission spectrometry and plasma emission spectrometry), and other instrumental analyses such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and radiochemistry.

  8. New earth system model for optical performance evaluation of space instruments.

    Science.gov (United States)

    Ryu, Dongok; Kim, Sug-Whan; Breault, Robert P

    2017-03-06

In this study, a new global earth system model is introduced for evaluating the optical performance of space instruments. Simultaneous imaging and spectroscopic results are provided using this global earth system model with fully resolved spatial, spectral, and temporal coverage of sub-models of the Earth. The sun sub-model is a Lambertian scattering sphere with a 6-h scale and 295 lines of solar spectral irradiance. The atmospheric sub-model has a 15-layer three-dimensional (3D) ellipsoid structure. The land sub-model uses spectral bidirectional reflectance distribution functions (BRDF) defined by a semi-empirical parametric kernel model. The ocean is modeled with the ocean spectral albedo after subtracting the total integrated scattering of the sun-glint scatter model. A hypothetical two-mirror Cassegrain telescope with a 300-mm-diameter aperture and a 21.504 mm × 21.504 mm focal plane imaging instrument is designed. The simulated image results are compared with observational data from HRI-VIS measurements during the EPOXI mission for approximately 24 h from UTC Mar. 18, 2008. The defocus mapping and edge spread function (ESF) measurement results show that the distance between the primary and secondary mirror increases by 55.498 μm from the diffraction-limited condition. The shift of the focal plane is determined to be 5.813 mm shorter than that of the defocused focal plane, and this result is confirmed through point spread function (PSF) measurements. This study shows that the earth system model combined with an instrument model is a powerful tool that can greatly help the development phase of instrument missions.

  9. On the algebraic theory of kink sectors: Application to quantum field theory models and collision theory

    International Nuclear Information System (INIS)

    Schlingemann, D.

    1996-10-01

Several two dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the Sine-Gordon and the φ^4_2-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: Which are sufficient conditions a pair of vacuum states has to fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)_2-models. We identify a large class of vacuum states, including the vacua of the P(φ)_2-models, the Yukawa_2-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)

  10. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory, incorporating five indicators (intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy), is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones, which suggests that construction managers should place further focus on intrinsic motivators to motivate workers.

  12. EVOLUTION OF THEORIES AND EMPIRICAL MODELS OF A RELATIONSHIP BETWEEN ECONOMIC GROWTH, SCIENCE AND INNOVATIONS (PART I

    Directory of Open Access Journals (Sweden)

    Kaneva M. A.

    2017-12-01

This article is the first chapter of an analytical review of existing theoretical models of the relationship between economic growth/GRP and indicators of scientific development and innovation activities, as well as empirical approaches to testing this relationship. The aim of the paper is a systematization of existing approaches to the modeling of economic growth geared by science and innovations. The novelty of the current review lies in the authors' criteria of interconnectedness of theoretical and empirical studies in the systematization of a wide range of publications, presented in a final table-scheme. The first part of the article discusses the evolution of theoretical approaches, while the second chapter presents the time gap between theories and their empirical verification caused by the level of development of quantitative instruments such as econometric models. The results of this study can be used by researchers and graduate students for familiarization with current scientific approaches that trace the progress from theory to empirical verification of the relationship «economic growth-innovations», and for the improvement of different types of models in spatial econometrics. To apply these models to management practices, the presented review could be supplemented with new criteria for the classification of knowledge production functions and other theories about the effect of science on economic growth.

  13. Development and validation of the nasopharyngeal cancer scale among the system of quality of life instruments for cancer patients (QLICP-NA V2.0): combined classical test theory and generalizability theory.

    Science.gov (United States)

    Wu, Jiayuan; Hu, Liren; Zhang, Gaohua; Liang, Qilian; Meng, Qiong; Wan, Chonghua

    2016-08-01

This research was designed to develop a nasopharyngeal cancer (NPC) scale based on the system of quality of life (QOL) instruments for cancer patients (QLICP-NA). This scale was developed using a modular approach and was evaluated by classical test and generalizability theories. Programmed decision procedures and theories on instrument development were applied to create QLICP-NA V2.0. A total of 121 NPC inpatients were assessed using QLICP-NA V2.0 to measure their QOL data from hospital admission until discharge. Scale validity, reliability, and responsiveness were evaluated by correlation, factor, parallel, multi-trait scaling, and t test analyses, as well as by generalizability (G) and decision (D) studies of generalizability theory. Results of multi-trait scaling, correlation, factor, and parallel analyses indicated that QLICP-NA V2.0 exhibited good construct validity. The significant difference in QOL between treated and untreated NPC patients indicated good clinical validity of the questionnaire. The internal consistency (α) and test-retest reliability coefficients (intra-class correlations) of each domain, as well as the overall scale, were all >0.70. Ceiling effects were not found in any domain and in most facets, except for common side effects (24.8 %) in the domain of common symptoms and side effects, and tumor early symptoms (27.3 %) and therapeutic side effects (23.2 %) in the specific domain, whereas floor effects did not exist in any domain/facet. The overall changes in the physical and social domains were significantly different between pre- and post-treatments, with a moderate effect size (standardized response mean) ranging from 0.21 to 0.27 (p […]). QLICP-NA V2.0 exhibited reasonable degrees of validity, reliability, and responsiveness. However, this scale must be further improved before it can be used as a practical instrument to evaluate the QOL of NPC patients in China.
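
The ceiling and floor percentages quoted above are simple proportions of respondents at the scale extremes; a sketch with invented 0-100 domain scores:

```python
import numpy as np

def ceiling_floor(scores, lo=0.0, hi=100.0):
    """Share of respondents at the scale minimum (floor) and maximum (ceiling)."""
    scores = np.asarray(scores, dtype=float)
    return (scores == lo).mean(), (scores == hi).mean()

rng = np.random.default_rng(7)
domain = np.clip(np.round(rng.normal(85, 15, size=300)), 0, 100)  # invented scores
floor, ceiling = ceiling_floor(domain)
print(f"floor: {floor:.1%}, ceiling: {ceiling:.1%}")
```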

  14. Working environment interventions – Bridging the gap between policy instruments and practice

    DEFF Research Database (Denmark)

    Hasle, Peter; Limborg, Hans Jørgen; Nielsen, Klaus T.

    2014-01-01

…is paid to why and how public and private organisations subsequently are to improve their working environment. This paper suggests a model which can bridge this gap. It is based on a combination of theories about basic policy instruments (regulation, incentives and information) with realist analysis focusing on mechanisms and context, and finally institutional theory proposing coercive, normative and mimetic mechanisms as explanations for organisational behaviour. The model is applied to an intervention aimed at reduction of the risk of musculoskeletal disorders among bricklayers in Denmark. Our

  15. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp

    2011-01-01

We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  16. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

After the release of new international functional safety standards like IEC 61508, people care more about the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique to assess the reliability measures of safety instrumented systems, but it is error-prone and time-consuming to create Markov models manually. This paper presents a new technique to automatically create Markov models for reliability assessment of safety instrumented systems. Many safety related factors, such as failure modes, self-diagnostics, restorations, common cause and voting, are included in the Markov models. A framework is generated first based on voting, failure modes and self-diagnostics. Then, repairs and common-cause failures are incorporated into the framework to build a complete Markov model. Eventual simplification of Markov models can be done by state merging. Examples given in this paper show how explosively the size of a Markov model increases as the system becomes a little more complicated, as well as the advantages of automatic creation of Markov models
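
A minimal continuous-time Markov sketch in the spirit of such tools: one channel with detected and undetected dangerous failures, repair and proof-test restoration, and steady-state probabilities from πQ = 0. All rates are invented, and factors the paper includes (common cause, voting) are deliberately left out.

```python
import numpy as np

# States: 0 = OK, 1 = failed-detected, 2 = failed-undetected.
ld, lu = 2e-6, 5e-7    # dangerous detected/undetected failure rates (1/h), invented
mu_r = 1.0 / 8.0       # repair rate from the detected state (1/h), invented
mu_t = 1.0 / 4380.0    # proof-test restoration of undetected failures (1/h), invented

Q = np.array([[-(ld + lu), ld,    lu   ],
              [ mu_r,      -mu_r, 0.0  ],
              [ mu_t,      0.0,   -mu_t]])

# Steady state: solve pi @ Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"P(OK) = {pi[0]:.8f}, probability of a failed state ~ {pi[1] + pi[2]:.2e}")
```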

  17. Ares I Scale Model Acoustic Tests Instrumentation for Acoustic and Pressure Measurements

    Science.gov (United States)

    Vargas, Magda B.; Counter, Douglas D.

    2011-01-01

    The Ares I Scale Model Acoustic Test (ASMAT) was a development test performed at the Marshall Space Flight Center (MSFC) East Test Area (ETA) Test Stand 116. The test article included a 5% scale Ares I vehicle model and tower mounted on the Mobile Launcher. Acoustic and pressure data were measured by approximately 200 instruments located throughout the test article. There were four primary ASMAT instrument suites: ignition overpressure (IOP), lift-off acoustics (LOA), ground acoustics (GA), and spatial correlation (SC). Each instrumentation suite incorporated different sensor models which were selected based upon measurement requirements. These requirements included the type of measurement, exposure to the environment, instrumentation check-outs and data acquisition. The sensors were attached to the test article using different mounts and brackets dependent upon the location of the sensor. This presentation addresses the observed effect of the sensors and mounts on the acoustic and pressure measurements.

  18. Non-linear σ-models and string theories

    International Nuclear Information System (INIS)

    Sen, A.

    1986-10-01

    The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs
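
    For orientation, the constraints mentioned above are the vanishing Weyl-anomaly coefficients of the background fields; at one loop they take the schematic form below (standard background-field results quoted from general knowledge, not from this report; coefficients are scheme dependent):

        \beta^{G}_{\mu\nu} = \alpha'\Big(R_{\mu\nu} + 2\nabla_{\mu}\nabla_{\nu}\Phi
            - \tfrac{1}{4} H_{\mu\lambda\kappa} H_{\nu}{}^{\lambda\kappa}\Big) = 0,
        \qquad
        \beta^{B}_{\mu\nu} = \alpha'\Big({-\tfrac{1}{2}}\nabla^{\lambda}H_{\lambda\mu\nu}
            + \nabla^{\lambda}\Phi\, H_{\lambda\mu\nu}\Big) = 0,
        \qquad
        \beta^{\Phi} = \tfrac{D-26}{6} + \alpha'\Big({-\tfrac{1}{2}}\nabla^{2}\Phi
            + (\nabla\Phi)^{2} - \tfrac{1}{24} H^{2}\Big) = 0,

    which are precisely the equations of motion for the massless background fields referred to in the abstract.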

  19. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd

    2001-01-01

    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  20. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a modal operator [...] that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  1. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas.

  2. Developing evaluation instrument based on CIPP models on the implementation of portfolio assessment

    Science.gov (United States)

    Kurnia, Feni; Rosana, Dadan; Supahar

    2017-08-01

    This study aimed to develop an evaluation instrument constructed on the CIPP model for the implementation of portfolio assessment in science learning. The study used a research and development (R&D) method, adapting the 4-D model for the development of a non-test instrument, with the evaluation instrument constructed on the CIPP model. CIPP abbreviates Context, Input, Process, and Product. Data were collected through interviews, questionnaires, and observations, using 1) interview guidelines for analyzing the problems and needs, 2) a questionnaire on the level of accomplishment of the portfolio assessment instrument, and 3) observation sheets for teachers and students to elicit responses to the portfolio assessment instrument. The data were quantitative ratings obtained from several validators: two lecturers as evaluation experts, two practitioners (science teachers), and three colleagues. This paper reports the content validity obtained from the validators, analyzed using Aiken's V formula. The results show that the evaluation instrument based on the CIPP model is appropriate for evaluating the implementation of portfolio assessment instruments. Based on the judgments of the experts, practitioners, and colleagues, the Aiken's V coefficients were between 0.86 and 1.00, which means the instrument is valid and can be used in the limited trial and the operational field trial.
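
    For reference, Aiken's V for a single item rated by n judges on a c-point scale is V = Σ s_i / (n(c − 1)), where s_i is each judge's rating minus the lowest possible rating. A minimal sketch (the ratings below are made up, not the study's data):

        def aikens_v(ratings, lo=1, hi=5):
            """Aiken's V content-validity coefficient for one item.

            ratings: the scores given by the judges on a lo..hi scale.
            Returns a value in [0, 1]; higher means stronger agreement
            that the item is relevant or essential.
            """
            n = len(ratings)
            c = hi - lo + 1                   # number of scale categories
            s = sum(r - lo for r in ratings)  # distances from the lowest category
            return s / (n * (c - 1))

        # Seven judges (2 experts, 2 practitioners, 3 colleagues) rating one item:
        print(aikens_v([5, 4, 5, 5, 4, 5, 5]))  # ~0.93, inside the reported 0.86-1.00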

  3. Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model

    International Nuclear Information System (INIS)

    Szabo, Richard J; Tierz, Miguel

    2010-01-01

    We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S^2. We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S^3, and hence to q-deformed Yang-Mills theory on S^2. In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.
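
    For orientation, the Chern-Simons matrix model referred to here is usually written in the form (schematic; conventions vary between references):

        Z_{CS}(S^3) \;\propto\; \int \prod_{i=1}^{N} d\lambda_i\;
            e^{-\sum_i \lambda_i^2 / 2 g_s}
            \prod_{i<j} \Big( 2 \sinh \frac{\lambda_i - \lambda_j}{2} \Big)^{2},
        \qquad g_s = \frac{2\pi i}{k+N},

    whose sinh-squared interaction replaces the Vandermonde determinant of the Gaussian ensemble and connects it, through a change of variables, to the Stieltjes-Wigert ensemble mentioned above.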

  4. Hydrological models are mediating models

    Science.gov (United States)

    Babel, L. V.; Karssenberg, D.

    2013-08-01

    Despite the increasing role of models in hydrological research and decision-making processes, only few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted to the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction is additionally involving a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It results from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction would deserve closer attention, for being rarely explicitly presented in peer-reviewed literature. We believe that devoting

  5. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  6. Towards a Theory of Organisational Culture.

    Science.gov (United States)

    Owens, Robert G.; Steinhoff, Carl R.

    1989-01-01

    The development of the paper-and-pencil instrument called the Organizational Culture Assessment Inventory (OCAI) is based on the theory of organizational culture. Recent literature and organizational analysis are combined with Schein's model of organizational culture to provide the background for metaphorical analysis of organizational culture…

  7. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  8. Axial vibrations of brass wind instrument bells and their acoustical influence: Theory and simulations.

    Science.gov (United States)

    Kausel, Wilfried; Chatziioannou, Vasileios; Moore, Thomas R; Gorman, Britta R; Rokni, Michelle

    2015-06-01

    Previous work has demonstrated that structural vibrations of brass wind instruments can audibly affect the radiated sound. Furthermore, these broadband effects are not explainable by assuming perfect coincidence of the frequency of elliptical structural modes with air column resonances. In this work a mechanism is proposed that has the potential to explain the broadband influences of structural vibrations on acoustical characteristics such as input impedance, transfer function, and radiated sound. The proposed mechanism involves the coupling of axial bell vibrations to the internal air column. The acoustical effects of such axial bell vibrations have been studied by extending an existing transmission line model to include the effects of a parasitic flow into vibrating walls, as well as distributed sound pressure sources due to periodic volume fluctuations in a duct with oscillating boundaries. The magnitude of these influences in typical trumpet bells, as well as in a complete instrument with an unbraced loop, has been studied theoretically. The model results in predictions of input impedance and acoustical transfer function differences that are approximately 1 dB for straight instruments and significantly higher when coiled tubes are involved or when very thin brass is used.
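
    As background for the transmission-line description, a rigid-walled lossless cylinder of length L and characteristic impedance Z_c transforms a termination impedance Z_L according to Z_in = Z_c (Z_L + iZ_c tan kL) / (Z_c + iZ_L tan kL); the parasitic wall flow and distributed pressure sources studied in the paper would enter as additional elements per unit length. A sketch of the rigid baseline only, with hypothetical dimensions:

        import numpy as np

        rho, c = 1.2, 343.0        # air density (kg/m^3) and speed of sound (m/s)
        a, L = 0.0075, 1.4         # hypothetical bore radius and tube length (m)
        S = np.pi * a**2
        Zc = rho * c / S           # characteristic impedance of the cylinder

        f = np.linspace(50.0, 1500.0, 2000)
        k = 2 * np.pi * f / c
        ZL = 0.0                   # idealized open end (radiation load neglected)
        Zin = Zc * (ZL + 1j * Zc * np.tan(k * L)) / (Zc + 1j * ZL * np.tan(k * L))

        # Impedance maxima approximate the playable resonances of the air column.
        mag = np.abs(Zin)
        peaks = f[1:-1][(mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:])]
        print(peaks[:5])           # first few resonance frequencies, in Hz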

  9. Narrative theories as computational models: reader-oriented theory and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, P.

    1983-12-01

    In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding, as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory shares many common interests and problems, and both fields might benefit from an exchange of ideas. 11 references.

  10. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults.

    Science.gov (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily

    2018-02-23

    The validity of the CAGE using item response theory (IRT) has not yet been examined in the older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, to assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and to examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA) and to fit dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed the overall scalability H index was 0.459, indicating a medium performing instrument. All items were found to be homogeneous, measuring the same construct and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73, while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% of the respondents (CC and CA ranged from 92.5% to 94.3%) were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. It also provides valuable information on each item in the assessment of the overall severity of alcohol problems and on the precision of the cut-off scores in the older adult population.
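
    For readers unfamiliar with the 2-parameter logistic model used above: an item with discrimination a and difficulty b is endorsed with probability P(θ) = 1 / (1 + exp(−a(θ − b))) at latent trait level θ, and contributes Fisher information a²P(1 − P), which peaks near θ = b. A small sketch with hypothetical parameters inside the reported ranges:

        import numpy as np

        def p_2pl(theta, a, b):
            """2PL item characteristic curve: probability of endorsing the item."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def item_information(theta, a, b):
            """Fisher information contributed by a 2PL item at trait level theta."""
            p = p_2pl(theta, a, b)
            return a**2 * p * (1.0 - p)

        theta = np.linspace(-3, 3, 7)
        a, b = 2.5, 1.5   # hypothetical, within the reported 1.07-6.73 and 0.33-2.80
        print(np.round(p_2pl(theta, a, b), 3))
        print(np.round(item_information(theta, a, b), 3))
        # Information peaking near theta = b is what drives the precision of the
        # instrument around its cut-off scores.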

  11. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  12. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
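
    On a chain-structured model the junction tree algorithm reduces to forward-backward message passing, which already shows the decentralized flavor: each node combines local evidence with a message from one neighbor. A minimal sketch with hypothetical potentials (not taken from the report):

        import numpy as np

        # Chain of four binary "sensor" nodes; T[i, j] is the transition potential
        # and each row of `evidence` a node's local likelihood over its two states.
        T = np.array([[0.9, 0.1],
                      [0.2, 0.8]])
        evidence = np.array([[0.8, 0.2],
                             [0.5, 0.5],
                             [0.1, 0.9],
                             [0.6, 0.4]])
        n = len(evidence)

        fwd = np.zeros((n, 2))
        fwd[0] = evidence[0] / evidence[0].sum()
        for t in range(1, n):                   # forward messages, node to node
            fwd[t] = evidence[t] * (fwd[t - 1] @ T)
            fwd[t] /= fwd[t].sum()

        bwd = np.ones((n, 2))
        for t in range(n - 2, -1, -1):          # backward messages
            bwd[t] = T @ (evidence[t + 1] * bwd[t + 1])
            bwd[t] /= bwd[t].sum()

        marginals = fwd * bwd
        marginals /= marginals.sum(axis=1, keepdims=True)
        print(np.round(marginals, 3))           # per-node posterior beliefs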

  13. The Development of a Tactical-Level Full Range Leadership Measurement Instrument

    Science.gov (United States)

    2010-03-01

    Full range leadership theory has become established as the predominant and most widely researched theory on leadership. The most commonly used survey instrument to assess full range leadership theory is the Multifactor Leadership Questionnaire, originally developed by Bass in 1985. Although much [...] existing literature to develop a new full range leadership theory measurement instrument that effectively targets low- to mid-level supervisors, or [...]

  14. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)

  15. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.
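
    In the linear, discretized special case the estimator reduces to ridge-type (Tikhonov) regularization of an ill-posed operator equation Af = r, with A standing in for the conditional-expectation operator induced by the instruments; the shape constraints that make the paper's problem non-convex are omitted. A toy sketch:

        import numpy as np

        rng = np.random.default_rng(0)

        # Discretized ill-posed problem A f = r on a grid (purely illustrative).
        n = 80
        x = np.linspace(0.0, 1.0, n)
        A = np.exp(-30.0 * (x[:, None] - x[None, :])**2)  # smoothing kernel operator
        A /= A.sum(axis=1, keepdims=True)
        f_true = np.sin(2 * np.pi * x)                    # stand-in "demand function"
        r = A @ f_true + 0.01 * rng.standard_normal(n)    # noisy moment data

        alpha = 1e-3                                      # regularization weight
        f_hat = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ r)

        print("max pointwise error:", np.abs(f_hat - f_true).max())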

  16. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
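
    A minimal illustration of the Bayesian learning mechanism invoked above: two causal hypotheses about whether a block activates a blicket-detector-style machine, updated by Bayes' rule as activations are observed (all probabilities hypothetical):

        # H1: "the block is a blicket" (machine fires with prob 0.9 when placed),
        # H0: "it is not"              (machine fires with prob 0.1, by accident).
        prior = {"H1": 0.5, "H0": 0.5}
        lik = {"H1": 0.9, "H0": 0.1}          # P(fires | hypothesis), hypothetical

        def update(belief, fired):
            post = {h: p * (lik[h] if fired else 1.0 - lik[h])
                    for h, p in belief.items()}
            z = sum(post.values())
            return {h: p / z for h, p in post.items()}

        belief = prior
        for fired in [True, True, False, True]:   # a short observed history
            belief = update(belief, fired)
            print({h: round(p, 3) for h, p in belief.items()})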

  17. From Landau's hydrodynamical model to field theory models of multiparticle production: a tribute to Peter

    International Nuclear Information System (INIS)

    Cooper, F.

    1996-01-01

    We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations

  18. An instrument based on protection motivation theory to predict Chinese adolescents' intention to engage in protective behaviors against schistosomiasis.

    Science.gov (United States)

    Xiao, Han; Peng, Minjin; Yan, Hong; Gao, Mengting; Li, Jingjing; Yu, Bin; Wu, Hanbo; Li, Shiyue

    2016-01-01

    Further advancement in schistosomiasis prevention requires new tools to assess protective motivation and to promote innovative intervention programs. This study aimed to develop and evaluate an instrument based on Protection Motivation Theory (PMT) to predict protective behavior intention against schistosomiasis among adolescents in China. We developed the Schistosomiasis PMT Scale around the two appraisal pathways of protective motivation: the threat appraisal pathway and the coping appraisal pathway. Data from a large sample of middle school students (n = 2238, 51% male, mean age 13.13 ± 1.10) recruited in Hubei, China were used to evaluate the validity and reliability of the scale. The final scale contains 18 items with seven sub-constructs. Cronbach's alpha was 0.76 for the entire instrument, and 0.56, 0.82, 0.75, 0.80, 0.90, 0.72 and 0.70 for the sub-constructs of severity, vulnerability, intrinsic reward, extrinsic reward, response efficacy, self-efficacy and response cost, respectively. The construct validity analysis revealed that the one-level, seven-sub-construct model fitted the data well (GFI = 0.98, CFI = 0.98, RMSEA = 0.03, Chi-sq/df = 3.90, p [...]) [...] motivation in schistosomiasis prevention and control. Further studies are needed to develop more effective intervention programs for schistosomiasis prevention.

  19. Modeling students’ instrumental (mis-) use of substances to enhance cognitive performance: Neuroenhancement in the light of job demands-resources theory

    Science.gov (United States)

    2014-01-01

    Background: Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called neuroenhancement (NE), has not yet been integrated into a behavioral theory that is able to predict performance. Job Demands-Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students' self-reported NE can be integrated into JD-R Theory's comprehensive approach to psychological health and performance. Methods: 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE were measured together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance). Path models were used to test our data's fit to hypothesized main effects and interactions. Results: JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory's health impairment process: lifestyle drug NE (p [...]) [...] model NE as a means to reach an end (i.e. performance enhancement) rather than a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon. PMID:24904687

  20. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  1. M-theory model-building and proton stability

    International Nuclear Information System (INIS)

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  2. Instrument for assessing coronary symptoms in women

    Directory of Open Access Journals (Sweden)

    Cándida Rosa Castañeda Valencia

    2015-09-01

    Full Text Available Objective: To design and validate an instrument for assessing the symptoms of women with coronary disease, framed in the Theory of Unpleasant Symptoms. Methodology: A methodological, psychometric study guided by the symptoms, the first concept of the Theory of Unpleasant Symptoms by Lenz et al. A theoretical critique of the chosen construct was performed, proving its usefulness in research and disciplinary practice. On the empirical side, 260 pieces of evidence were weighted through methodological and empirical critique, applying the Integrative Review System articulated to the Conceptual-Empirical Model of Fawcett and Garity. Only 30 research pieces were retained and used for the construction of the items. To the Lenz symptoms were added the psychosocial symptoms reported in women with coronary disease, generating a first design composed of 87 items. Results: Content validation was performed by experts using the Escobar and Cuervo Model 2008 (statistical analysis in SPSS 20), with a Kendall coefficient of k = 0.682 (p < 0.05), showing good agreement between judges. The Lawshe Model, as normalized by Tristán 2008, reported a Content Validity Ratio = 0.57 and a Content Validity Index = 0.797, showing that the items are essential units of analysis. Finally, facial validation by means of a pilot test, conducted on 21 women who met the inclusion criteria, allowed the semiotic discrimination of the items, yielding an instrument consisting of 67 items. Conclusions: This is a research product that requires further validation to increase its psychometric capacity.

  3. Extended Nambu models: Their relation to gauge theories

    Science.gov (United States)

    Escobar, C. A.; Urrutia, L. F.

    2017-05-01

    Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge-fixing term. At this level, the mechanism for generating the constraint is irrelevant, and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.

  4. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    Science.gov (United States)

    Gagnier, Donald; Hayner, Rick; Nosek, Thomas; Roza, Michael; Hendershot, James E.; Razzaghi, Andrea I.

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric scientific instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments and the Aura spacecraft bus electronics. Aura is one of NASA's Earth Observing System missions. The test was designed to evaluate the complex interfaces in the command and data handling subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during the flight integration phase of the observatory can cause significant cost and schedule impacts. The tests successfully revealed problems and led to their resolution before the full-up integration phase, saving significant cost and schedule. This approach could be beneficial for future environmental satellite programs involving the integration of multiple, complex scientific instruments onto a spacecraft bus.

  5. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  6. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  7. An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories

    International Nuclear Information System (INIS)

    Schiappa, Ricardo; Wyllard, Niclas

    2010-01-01

    We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.

  8. Matrix models as non-commutative field theories on R^3

    International Nuclear Information System (INIS)

    Livine, Etera R

    2009-01-01

    In the context of spin foam models for quantum gravity, group field theories are a useful tool allowing on the one hand a non-perturbative formulation of the partition function and on the other hand admitting an interpretation as generalized matrix models. Focusing on 2d group field theories, we review their explicit relation to matrix models and show their link to a class of non-commutative field theories invariant under a quantum-deformed 3d Poincaré symmetry. This provides a simple relation between matrix models and non-commutative geometry. Moreover, we review the derivation of effective 2d group field theories with non-trivial propagators from Boulatov's group field theory for 3d quantum gravity. Besides the fact that this gives a simple and direct derivation of non-commutative field theories for the matter dynamics coupled to (3d) quantum gravity, these effective field theories can be expressed as multi-matrix models with a non-trivial coupling between matrices of different sizes. It should be interesting to analyze this new class of theories, both from the point of view of matrix models as integrable systems and for the study of non-commutative field theories.

  9. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  10. A mathematical model for describing the mechanical behaviour of root canal instruments.

    Science.gov (United States)

    Zhang, E W; Cheung, G S P; Zheng, Y F

    2011-01-01

    The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the geometry of the loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as functions of these geometric parameters, the modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that mathematical modelling is a feasible method to analyse the mechanical properties and predict the stress and deformation of root canal instruments during root canal preparation. Mathematical and numerical models can be a suitable way to examine mechanical behaviour as a criterion for instrument design and to predict the stress and strain experienced by endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.
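
    To give the flavor of the closed-form ingredients of such a model, consider the simplest special case of a file idealized as a solid circular shaft with tip diameter d0 and taper t, so that d(x) = d0 + t·x, I(x) = π d(x)^4 / 64, and the surface bending stress under a moment M is σ = M (d/2) / I. The paper's instruments have non-circular cross-sections, so the sketch below (hypothetical dimensions) is only the baseline case:

        import numpy as np

        d0, taper = 0.25e-3, 0.06   # 0.25 mm tip diameter, 6% taper (hypothetical)
        M = 2.0e-3                  # hypothetical bending moment, N*m

        x = np.linspace(0.0, 16e-3, 5)   # positions along a 16 mm working part
        d = d0 + taper * x               # local diameter of the tapered shaft
        I = np.pi * d**4 / 64            # second moment of area, circular section
        sigma = M * (d / 2) / I          # maximum bending stress at the surface

        for xi, si in zip(x, sigma):
            print(f"x = {xi * 1e3:5.1f} mm   sigma = {si / 1e6:9.1f} MPa")
        # Stress falls rapidly toward the shank as d(x) grows -- the thin tip
        # region dominates the flexural behaviour of the instrument.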

  11. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that if a theory model specifying the correct set of m relevant exogenous variables, x_t, is embedded within the larger set of m+k candidate variables (x_t, w_t), then selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.
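
    A small simulation of the strategy described (illustrative only, not the authors' code): force the theory variable x_t into the regression, select over k irrelevant candidates w_t by statistical significance, and compare the estimate of the theory parameter before and after selection.

        import numpy as np

        rng = np.random.default_rng(1)
        n, k = 200, 8
        x = rng.standard_normal(n)                  # the theory-specified regressor
        W = rng.standard_normal((n, k))             # candidate variables (irrelevant)
        y = 1.0 + 0.5 * x + rng.standard_normal(n)  # true model uses only x

        def ols(X, y):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            s2 = resid @ resid / (len(y) - X.shape[1])
            se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
            return beta, beta / se                  # coefficients and t-ratios

        X_full = np.column_stack([np.ones(n), x, W])
        beta, t = ols(X_full, y)                    # step 1: fit with everything

        # Step 2: always retain the intercept and x; drop candidates with |t| < 2.
        keep = [0, 1] + [2 + j for j in range(k) if abs(t[2 + j]) >= 2.0]
        beta_sel, _ = ols(X_full[:, keep], y)

        print("theory coefficient, full model:     ", round(beta[1], 3))
        print("theory coefficient, after selection:", round(beta_sel[1], 3))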

  12. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    Science.gov (United States)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century is built around the 4Cs criteria. Learning that gives students the opportunity to develop their creative skills optimally can be provided by implementing collaborative learning, in which learners are challenged to compete, to work independently for individual or group excellence, and to master the learning material. A virtual laboratory is used as the medium for Instrumental Analytical Chemistry lectures (Vis, UV-Vis, AAS, etc.) through computer simulation applications, and serves as a substitute when laboratory equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures and to determine the effectiveness of this design, which adapts the Dick & Carey and Hannafin & Peck models. The development steps of this model are: needs analysis, design of collaborative-creative learning, virtual laboratory media using Macromedia Flash, formative evaluation, and a test of the learning model's effectiveness. The stages of the collaborative-creative learning model itself are: apperception, exploration, collaboration, creation, evaluation and feedback. The collaborative-creative learning model using virtual laboratory media can be used to improve the quality of learning in the classroom and to overcome the shortage of laboratory instruments for real instrumental analysis. Formative test results show that the developed model meets the requirements. The effectiveness test on students' pretest and posttest scores is significant at the 95% confidence level, with the t statistic higher than the critical t value. It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.

  13. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit
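
    For reference, the finite-dimensional formula being generalized reads, for a Hamiltonian H generating a circle action with isolated nondegenerate fixed points on a 2n-dimensional symplectic manifold (schematically; signs and 2π factors depend on conventions, and e(p) denotes the product of the weights of the linearized action at p):

        \int_{M} e^{\,i t H}\, \frac{\omega^{n}}{n!}
            \;=\; \sum_{p \,:\, dH(p)=0} \frac{e^{\,i t H(p)}}{(i t)^{n}\, e(p)},

    i.e. the stationary-phase approximation is exact, which is the property the paper traces through to the path integrals of integrable models.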

  14. Polyacetylene and relativistic field-theory models

    International Nuclear Information System (INIS)

    Bishop, A.R.; Campbell, D.K.; Fesser, K.

    1981-01-01

    Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the phi^4 model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed.

  15. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
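
    A sketch of the measurement idea under stated assumptions (hypothetical three-symbol action alphabet, Laplace smoothing): fit a first-order Markov chain to the player's action stream online and record the surprisal of each new action under the model trained so far; falling surprisal is the information-theoretic signature of routinization.

        import math
        from collections import defaultdict

        actions = list("ABABABCABABAB")         # hypothetical player action stream

        counts = defaultdict(lambda: defaultdict(int))
        surprisal = []
        prev = None
        for a in actions:
            if prev is not None:
                row = counts[prev]
                total = sum(row.values())
                p = (row[a] + 1) / (total + 3)   # Laplace smoothing, |alphabet| = 3
                surprisal.append(-math.log2(p))  # bits of surprise before updating
                row[a] += 1                      # then train the model online
            prev = a

        print([round(s, 2) for s in surprisal])  # tends to fall as play routinizes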

  16. Development and evaluation of a social cognitive theory-based instrument to assess correlates of physical activity among people with spinal cord injury.

    Science.gov (United States)

    Wilroy, Jereme; Turner, Lori; Birch, David; Leaver-Dunn, Deidre; Hibberd, Elizabeth; Leeper, James

    2018-01-01

    People with spinal cord injury (SCI) are more susceptible to sedentary lifestyles because of reduced physical functioning and numerous barriers. Benefits of physical activity for people with SCI include physical fitness, functional capacity, social integration and psychological well-being. The purpose of this study was to develop and test a social cognitive theory-based instrument designed to predict physical activity among people with SCI. An instrument was developed through the utilization and modification of previous items from the literature, an expert panel review, and cognitive interviewing, and was tested among a sample of the SCI population using a cross-sectional design. Statistical analysis included descriptives, correlations, multiple regression, and exploratory factor analysis. The physical activity outcome variable was significantly and positively correlated with self-regulatory efficacy (r = 0.575), task self-efficacy (r = 0.491), self-regulation (r = 0.432), social support (r = 0.284), and outcome expectations (r = 0.247). Internal consistency for the constructs ranged from 0.82 to 0.96. Construct reliability values for self-regulation (0.95), self-regulatory efficacy (0.96), task self-efficacy (0.94), social support (0.84), and outcome expectations (0.92) each exceeded the 0.70 a priori criterion. The factor analysis was conducted to seek modifications of the current instrument to improve validity and reliability. The data provided support for the convergent validity of the five-factor SCT model. This study provides direction for the further development of a valid and reliable instrument for predicting physical activity among people with SCI. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Study of parental models: building an instrument for their exploration

    Directory of Open Access Journals (Sweden)

    José Francisco Martínez Licona

    2014-08-01

    Full Text Available Objective: This research presents the construction of an attributional questionnaire concerning the different parental models and factors involved in family interactions. Method: A mixed methodology was used as a foundation to develop the items and the respective pilot tests, which allowed checking the validity and internal consistency of the instrument using expert judgment. Results: An instrument of 36 statements was organized into 12 categories to explore the parental models according to the following factors: parental models, child-rearing patterns, attachment bonds, and guidelines for success promoted within family contexts. Analyzing these factors contributes to understanding children's development within the family environment and to opportunities for socio-educational intervention. Conclusion: It is assumed that the family context is as decisive as the school context; therefore, exploring the nature of parental models is required to understand the features and influences that contribute to the development of young people in any social context.

  18. Warped Linear Prediction of Physical Model Excitations with Applications in Audio Compression and Instrument Synthesis

    Science.gov (United States)

    Glass, Alexis; Fukudome, Kimitoshi

    2004-12-01

    A sound recording of a plucked string instrument is encoded and resynthesized using two stages of prediction. In the first stage of prediction, a simple physical model of a plucked string is estimated and the instrument excitation is obtained. The second stage of prediction compensates for the simplicity of the model in the first stage by encoding either the instrument excitation or the model error using warped linear prediction. These two methods of compensation are compared with each other and with the case of single-stage warped linear prediction, adjustments are introduced, and their applications to instrument synthesis and MPEG-4 audio compression within the Structured Audio format are discussed.
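
    A compressed sketch of the two-stage idea under simplifying assumptions: stage one is a plucked-string (comb-filter) predictor at a known pitch period, stage two an ordinary LPC predictor on its residual. True warped linear prediction would additionally run the residual through an allpass-warped delay line, which is omitted here; the synthesis parameters are assumed known rather than estimated.

        import numpy as np
        from scipy.linalg import solve_toeplitz

        rng = np.random.default_rng(2)
        P, n = 100, 4000
        x = np.zeros(n)
        x[:P] = rng.standard_normal(P)             # stand-in "pluck" excitation
        for i in range(P + 1, n):                  # Karplus-Strong-like string loop
            x[i] += 0.995 * 0.5 * (x[i - P] + x[i - P - 1])

        # Stage 1: string-model prediction; the residual is the excitation.
        e = x.copy()
        e[P + 1:] -= 0.995 * 0.5 * (x[1:n - P] + x[:n - P - 1])

        # Stage 2: LPC on the excitation (autocorrelation method, Toeplitz solve).
        order = 10
        r = np.correlate(e, e, mode="full")[n - 1:n + order]
        a = solve_toeplitz(r[:order], r[1:order + 1])  # LP coefficients

        pred = np.zeros_like(e)
        for k, ak in enumerate(a, start=1):
            pred[k:] += ak * e[:-k]
        res = e - pred
        print("second-stage residual energy ratio:", res @ res / (e @ e))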

  19. Use of a life-size three-dimensional-printed spine model for pedicle screw instrumentation training.

    Science.gov (United States)

    Park, Hyun Jin; Wang, Chenyu; Choi, Kyung Ho; Kim, Hyong Nyun

    2018-04-16

    Training beginners in the pedicle screw instrumentation technique in the operating room is limited by issues of patient safety and surgical efficiency. Three-dimensional (3D) printing enables training or simulation surgery on a real-size replica of a deformed spine, which is difficult with the usual cadaver or surrogate plastic models. The purpose of this study was to evaluate the educational effect of using a real-size 3D-printed spine model for training beginners in the free-hand pedicle screw instrumentation technique. We asked whether the use of a 3D spine model can improve (1) screw instrumentation accuracy and (2) length of procedure. Twenty life-size 3D-printed lumbar spine models were made from 10 volunteers (two models for each volunteer). Two novice surgeons who had no experience of the free-hand pedicle screw instrumentation technique were instructed by an experienced surgeon, and each surgeon inserted 10 pedicle screws for each lumbar spine model. Computed tomography scans of the spine models were obtained to evaluate screw instrumentation accuracy. The length of time to complete the procedure was recorded. The results of the latter 10 spine models were compared with those of the former 10 models to evaluate the learning effect. A total of 37/200 screws (18.5%) perforated the pedicle cortex, with a mean of 1.7 mm (range, 1.2-3.3 mm). However, the latter half of the models had significantly fewer violations than the former half (10/100 vs. 27/100, p [...]). A life-size 3D-printed spine model can be an excellent tool for training beginners in free-hand pedicle screw instrumentation.

  20. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process, which is governed by a vacation policy that can be characterized by three aspects: 1) the vacation start-up rule; 2) the vacation termination rule; and 3) the vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes queueing models more realistic and flexible in studying real-world waiting line systems. Integrated into the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluable...
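
    A canonical example of the results this theory develops is the decomposition property of the M/G/1 queue with multiple vacations (stated here from standard queueing references, not quoted from the book): the stationary wait splits into the ordinary Pollaczek-Khinchine term plus the equilibrium residual of a vacation,

        E[W] \;=\; \frac{\lambda\, E[S^{2}]}{2\,(1-\rho)} \;+\; \frac{E[V^{2}]}{2\, E[V]},
        \qquad \rho = \lambda\, E[S] < 1,

    where S is the service time and V the vacation length, so the vacation policy enters only through the second term.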

  1. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its structure [...] In addition, it is demonstrated how other controversial hypotheses, such as Rational Expectations, can be formulated directly as restrictions on the CVAR parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well. Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.

  2. Toda theories, W-algebras, and minimal models

    International Nuclear Information System (INIS)

    Mansfield, P.; Spence, B.

    1991-01-01

    We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)

  3. Guide to mathematical concepts of quantum theory

    International Nuclear Information System (INIS)

    Heinosaari, T.; Ziman, M.

    2008-01-01

    Quantum Theory is one of the pillars of modern science developed over the last hundred years. In this review we introduce, step by step, the quantum theory understood as a mathematical model describing quantum experiments. We start by splitting the experiment into two parts: a preparation process and a measurement process leading to the registration of a particular outcome. These two ingredients of the experiment are represented by states and effects, respectively. Further, the whole picture of quantum measurement will be developed, and the concepts of observables, instruments and measurement models, representing three different descriptions of experiments, will be introduced. In the second stage, we enrich the model of the experiment by introducing the concept of a quantum channel describing the system changes between preparations and measurements. At the very end we review the elementary properties of quantum entanglement. The text contains many examples and exercises covering many topics from quantum information theory and quantum measurement theory. The goal is to give a mathematically clear and self-contained explanation of the main concepts of the modern language of quantum theory (Authors)

  4. Guide to mathematical concepts of quantum theory

    International Nuclear Information System (INIS)

    Heinosaari, T.; Ziman, M.

    2008-01-01

    Quantum Theory is one of the pillars of modern science developed over the last hundred years. In this review paper we introduce, step by step, the quantum theory understood as a mathematical model describing quantum experiments. We start by splitting the experiment into two parts: a preparation process and a measurement process leading to the registration of a particular outcome. These two ingredients of the experiment are represented by states and effects, respectively. Further, the whole picture of quantum measurement will be developed, and the concepts of observables, instruments and measurement models, representing three different descriptions of experiments, will be introduced. In the second stage, we enrich the model of the experiment by introducing the concept of a quantum channel describing the system changes between preparations and measurements. At the very end we review the elementary properties of quantum entanglement. The text contains many examples and exercises covering many topics from quantum information theory and quantum measurement theory. The goal is to give a mathematically clear and self-contained explanation of the main concepts of the modern language of quantum theory. (author)
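
    The state-effect-channel language summarized above is easy to make concrete in finite dimension: a state is a density matrix ρ, an effect is an operator 0 ≤ E ≤ I whose outcome probability is Tr(ρE), and a channel is a completely positive trace-preserving map. A small qubit sketch of these standard definitions (not code from the review):

        import numpy as np

        # A qubit state (density matrix): |+><+| with |+> = (|0> + |1>) / sqrt(2).
        plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
        rho = np.outer(plus, plus.conj())

        # An effect: the POVM element projecting onto |0>, satisfying 0 <= E <= I.
        E0 = np.diag([1.0, 0.0])
        print("P(outcome 0) =", np.trace(rho @ E0).real)   # Born rule: Tr(rho E)

        # A channel: depolarizing map rho -> (1 - p) rho + p I/2 (CPTP for p in [0, 1]).
        def depolarize(rho, p=0.25):
            return (1.0 - p) * rho + p * np.eye(2) / 2.0

        rho2 = depolarize(rho)
        print("purity before/after:",
              np.trace(rho @ rho).real, np.trace(rho2 @ rho2).real)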

  5. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  6. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    Science.gov (United States)

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  7. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. The extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. The bosonization of the partition functions of different models is described. A program of rational theory classification is described, linking rational conformal theories and spin integrable models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights

  8. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly and positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  9. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Full Text Available Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly and positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
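
    The structural model described in these two records lends itself to a compact specification in SEM software. A minimal sketch using the Python package semopy, with hypothetical variable names standing in for the study's constructs (the actual dataset and indicator names are not given in the abstract):

```python
# Sketch of an SEM specification in the spirit of the telehealth TAM model.
# Variable and file names are hypothetical; the study's data are not public here.
import pandas as pd
import semopy

model_desc = """
# structural part: social capital and self-efficacy drive the TAM constructs
perceived_ease_of_use ~ social_trust + institutional_trust + social_participation + self_efficacy
perceived_usefulness  ~ social_trust + institutional_trust + social_participation + perceived_ease_of_use
usage_intention       ~ perceived_ease_of_use + perceived_usefulness
"""

data = pd.read_csv("telehealth_survey.csv")  # hypothetical respondent scores
model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())  # path coefficients, standard errors, p-values
```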

  10. Job satisfaction and employee’s unionization decision: the mediating effect of perceived union instrumentality

    Energy Technology Data Exchange (ETDEWEB)

    Shan, H.; Hu, E.; Zhi, L.; Zhang, L.; Zhang, M.

    2016-07-01

    Purpose: Given the current lack of literature on the Chinese labor force, this study aims to investigate the relationships among job satisfaction, perceived union instrumentality, and unionization from a reference-frame-based perspective and to explore these relationships in the context of the Chinese labor market. Design/methodology/approach: The study introduces perceived union instrumentality as a mediator of the relationship between job satisfaction and unionization. The applicability of western theories was tested in the Chinese context by a questionnaire survey of 390 employees who were working in private sectors of Jiangsu Province in China. Four hypotheses were proposed and tested by data analysis to verify the model. Findings: The study found that most aspects of job satisfaction were negatively correlated with unionization and perceived union instrumentality, while perceived union instrumentality had a positive relationship with unionization. Perceived union instrumentality was also found to have a mediating effect on the relationship between job satisfaction and unionization. Originality/value: The paper adapted and tested a number of western industrial relations theories in the backdrop of China, contributing to the gap in Chinese-context research by examining the relationships between job satisfaction, unionization and union instrumentality of Chinese employees. It makes a solid contribution to labor union studies both inside and outside China. (Author)

  11. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  12. Psyche Mission: Scientific Models and Instrument Selection

    Science.gov (United States)

    Polanskey, C. A.; Elkins-Tanton, L. T.; Bell, J. F., III; Lawrence, D. J.; Marchi, S.; Park, R. S.; Russell, C. T.; Weiss, B. P.

    2017-12-01

    NASA has chosen to explore (16) Psyche with their 14th Discovery-class mission. Psyche is a 226-km diameter metallic asteroid hypothesized to be the exposed core of a planetesimal that was stripped of its rocky mantle by multiple hit and run collisions in the early solar system. The spacecraft launch is planned for 2022 with arrival at the asteroid in 2026 for 21 months of operations. The Psyche investigation has five primary scientific objectives: A. Determine whether Psyche is a core, or if it is unmelted material. B. Determine the relative ages of regions of Psyche's surface. C. Determine whether small metal bodies incorporate the same light elements as are expected in the Earth's high-pressure core. D. Determine whether Psyche was formed under conditions more oxidizing or more reducing than Earth's core. E. Characterize Psyche's topography. The mission's task was to select the appropriate instruments to meet these objectives. However, exploring a metal world, rather than one made of ice, rock, or gas, requires development of new scientific models for Psyche to support the selection of the appropriate instruments for the payload. If Psyche is indeed a planetary core, we expect that it should have a detectable magnetic field. However, the strength of the magnetic field can vary by orders of magnitude depending on the formational history of Psyche. The implications of both the extreme low-end and the high-end predictions impact the magnetometer and mission design. For the imaging experiment, what can the team expect for the morphology of a heavily impacted metal body? Efforts are underway to further investigate the differences in crater morphology between high velocity impacts into metal and rock to be prepared to interpret the images of Psyche when they are returned. Finally, elemental composition measurements at Psyche using nuclear spectroscopy encompass a new and unexplored phase space of gamma-ray and neutron measurements. We will present some end

  13. Exploratory and confirmatory factor analysis of the Adolescent Motivation to Cook Questionnaire: A Self-Determination Theory instrument.

    Science.gov (United States)

    Miketinas, Derek; Cater, Melissa; Bailey, Ariana; Craft, Brittany; Tuuri, Georgianna

    2016-10-01

    Increasing adolescents' motivation and competence to cook may improve diet quality and reduce the risk for obesity and chronic diseases. The objective of this study was to develop an instrument to measure adolescents' intrinsic motivation to prepare healthy foods and the four psychological needs that facilitate motivation identified by the Self Determination Theory (SDT). Five hundred ninety-three high school students (62.7% female) were recruited to complete the survey. Participants indicated to what extent they agreed or disagreed with 25 statements pertaining to intrinsic motivation and perceived competence to cook, and their perceived autonomy support, autonomy, and relatedness to teachers and classmates. Data were analyzed using exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and internal consistency reliability. EFA returned a five-factor structure explaining 65.3% of the variance; and CFA revealed that the best model fit was a five-factor structure (χ2 = 524.97 (265); Comparative Fit Index = 0.93; RMSEA = 0.056; and SRMR = 0.04). The sub-scales showed good internal consistency (Intrinsic Motivation: α = 0.94; Perceived Competence: α = 0.92; Autonomy Support: α = 0.94; Relatedness: α = 0.90; and Autonomy: α = 0.85). These results support the application of the Adolescent Motivation to Cook Questionnaire to measure adolescents' motivation and perceived competence to cook, autonomy support by their instructor, autonomy in the classroom, and relatedness to peers. Further studies are needed to investigate whether this instrument can measure change in cooking intervention programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
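
    The exploratory step reported here (a five-factor EFA followed by CFA) is easy to prototype. A minimal EFA sketch with the Python factor_analyzer package on simulated survey data (illustrative only; the real 25-item AMCQ responses are not public in this record):

```python
# Exploratory factor analysis sketch: extract 5 factors from 25 survey items.
# Data are simulated with a known 5-factor structure, 5 items per factor.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(3)
latent = rng.normal(size=(593, 5))              # 5 latent traits, n = 593
loadings = np.kron(np.eye(5), np.ones((1, 5)))  # each trait loads on 5 items
items = latent @ loadings + rng.normal(scale=0.6, size=(593, 25))
df = pd.DataFrame(items, columns=[f"item{i+1}" for i in range(25)])

fa = FactorAnalyzer(n_factors=5, rotation="oblimin")
fa.fit(df)
print(np.round(fa.loadings_, 2))                         # pattern matrix
print("cumulative variance:", fa.get_factor_variance()[2][-1])
```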

  14. Modelling the liquidity ratio as macroprudential instrument

    OpenAIRE

    Jan Willem van den End; Mark Kruidhof

    2012-01-01

    The Basel III Liquidity Coverage Ratio (LCR) is a microprudential instrument to strengthen the liquidity position of banks. However, if in extreme scenarios the LCR becomes a binding constraint, the interaction of bank behaviour with the regulatory rule can have negative externalities. We simulate the systemic implications of the LCR by a liquidity stress-testing model, which takes into account the impact of bank reactions on second round feedback effects. We show that a flexible approach of ...

  15. Irreducible integrable theories from tensor products of conformal models

    International Nuclear Information System (INIS)

    Mathur, S.D.; Warner, N.P.

    1991-01-01

    By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

  16. Scientific instruments, scientific progress and the cyclotron

    International Nuclear Information System (INIS)

    Baird, David; Faust, Thomas

    1990-01-01

    Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly - the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The importance is in the machine rather than the experiments which could be run on it and the focus is on how the cyclotron came into being, not how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

  17. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  18. An extension of the theory of planned behavior to predict pedestrians' violating crossing behavior using structural equation modeling.

    Science.gov (United States)

    Zhou, Hongmei; Romero, Stephanie Ballon; Qin, Xiao

    2016-10-01

    This paper aimed to examine pedestrians' self-reported violating crossing behavior intentions by applying the theory of planned behavior (TPB). We studied behavior intentions with regard to instrumental attitude, subjective norm, and perceived behavioral control, the three basic components of TPB, and extended the theory by adding new factors, including descriptive norm, perceived risk and conformity tendency, to evaluate their respective impacts on pedestrians' behavior intentions. A questionnaire presenting a scenario in which pedestrians crossed the road against the pedestrian lights at an intersection was designed, and the survey was conducted in Dalian, China. Based on the 260 complete and valid responses, the reliability and validity of the data for each question were evaluated. The data were then analyzed using structural equation modeling (SEM). The results showed that people had a negative attitude toward the behavior of violating road-crossing rules; they perceived social influences from their family and friends; and they believed that this kind of risky behavior would potentially harm them in a traffic accident. The results also showed that instrumental attitude and subjective norm were significant in the basic TPB model. After adding descriptive norm, subjective norm was no longer significant. Other models showed that conformity tendency was a strong predictor, indicating that the presence of other pedestrians would influence behavioral intention. The findings could help to design more effective interventions and safety campaigns, such as changing people's attitude toward this violation behavior, correcting the social norms, and increasing their safety awareness, in order to reduce pedestrians' road crossing violations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Evaluating the effectiveness of impact assessment instruments

    DEFF Research Database (Denmark)

    Cashmore, Matthew; Richardson, Tim; Hilding-Ryedvik, Tuija

    2010-01-01

    The central role of impact assessment instruments globally in policy integration initiatives has been cemented in recent years. Associated with this trend, but also reflecting political emphasis on greater accountability in certain policy sectors and a renewed focus on economic competitiveness ... to sharpen effectiveness evaluation theory for impact assessment instruments this article critically examines the neglected issue of their political constitution. Analytical examples are used to concretely explore the nature and significance of the politicisation of impact assessment. It is argued that raising awareness about the political character of impact assessment instruments, in itself, is a vital step in advancing effectiveness evaluation theory. Broader theoretical lessons on the framing of evaluation research are also drawn from the political analysis. We conclude that, at least within...

  20. Models and theories of prescribing decisions: A review and suggested a new model.

    Science.gov (United States)

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory approach rather than a theoretical one. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the review identifies and uses several valuable perspectives, such as 'persuasion theory (the elaboration likelihood model)', the 'stimuli-response marketing model', 'agency theory', the 'theory of planned behaviour', and 'social power theory', in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  1. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  2. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
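
    The fluid-analogy formulation described here can be illustrated with a toy simulation: each SCT construct is treated as a reservoir whose level changes with inflows (e.g., external cues or behavioral outcomes) and drains with a time constant. The sketch below is a generic first-order illustration under assumed parameter values, not the authors' actual model equations:

```python
# Toy fluid-analogy simulation of one SCT construct (e.g., self-efficacy).
# A first-order "reservoir": inflow from an external cue, outflow with time
# constant tau. All parameter values are illustrative assumptions.
import numpy as np

tau = 5.0      # time constant (days): how fast the construct decays
gain = 0.8     # how strongly the cue feeds the construct
dt = 0.1       # integration step (days)
steps = 600

cue = np.zeros(steps)
cue[100:300] = 1.0   # intervention "on" between t = 10 and t = 30 days

level = np.zeros(steps)
for t in range(1, steps):
    # Euler step of: tau * d(level)/dt = gain * cue - level
    level[t] = level[t - 1] + dt * (gain * cue[t] - level[t - 1]) / tau

print(f"peak level: {level.max():.3f}, final level: {level[-1]:.3f}")
```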

  3. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  4. Mobile Music, Sensors, Physical Modeling, and Digital Fabrication: Articulating the Augmented Mobile Instrument

    Directory of Open Access Journals (Sweden)

    Romain Michon

    2017-12-01

    Full Text Available Two concepts are presented, extended, and unified in this paper: mobile device augmentation towards musical instrument design and the concept of hybrid instruments. The first consists of using mobile devices at the heart of novel musical instruments. Smartphones and tablets are augmented with passive and active elements that can take part in the production of sound (e.g., resonators, exciters, etc.), add new affordances to the device, or change its global aesthetics and shape. Hybrid instruments combine physical/acoustical and “physically informed” virtual/digital elements. Recent progress in physical modeling of musical instruments and digital fabrication is exploited to treat instrument parts in a multidimensional way, allowing any physical element to be substituted with a virtual one and vice versa (as long as it is physically possible). A wide range of tools to design mobile hybrid instruments is introduced and evaluated. Aesthetic and design considerations when making such instruments are also presented through a series of examples.

  5. Validation of self-directed learning instrument and establishment of normative data for nursing students in taiwan: using polytomous item response theory.

    Science.gov (United States)

    Cheng, Su-Fen; Lee-Hsieh, Jane; Turton, Michael A; Lin, Kuan-Chia

    2014-06-01

    Little research has investigated the establishment of norms for nursing students' self-directed learning (SDL) ability, recognized as an important capability for professional nurses. An item response theory (IRT) approach was used to establish norms for SDL abilities valid for the different nursing programs in Taiwan. The purposes of this study were (a) to use IRT with a graded response model to reexamine the SDL instrument, or the SDLI, originally developed by this research team using confirmatory factor analysis and (b) to establish SDL ability norms for the four different nursing education programs in Taiwan. Stratified random sampling with probability proportional to size was used. A minimum of 15% of students from the four different nursing education degree programs across Taiwan was selected. A total of 7,879 nursing students from 13 schools were recruited. The research instrument was the 20-item SDLI developed by Cheng, Kuo, Lin, and Lee-Hsieh (2010). IRT with the graded response model was used with a two-parameter logistic model (discrimination and difficulty) for the data analysis, calculated using MULTILOG. Norms were established using percentile rank. Analysis of item information and test information functions revealed that 18 items exhibited very high discrimination and two items had high discrimination. The test information function was higher in this range of scores, indicating greater precision in the estimate of nursing student SDL. Reliability fell between .80 and .94 for each domain and the SDLI as a whole. The total information function shows that the SDLI is appropriate for all nursing students, except for the top 2.5%. SDL ability norms were established for each nursing education program and for the nation as a whole. IRT is shown to be a potent and useful methodology for scale evaluation. The norms for SDL established in this research will provide practical standards for nursing educators and students in Taiwan.
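
    The graded response model named in this abstract has a compact closed form: for an item with discrimination a and ordered thresholds b_1 < b_2 < ..., the probability of responding in category k or above is a two-parameter logistic curve, and the category probabilities are differences of adjacent curves. A minimal sketch with made-up parameter values (not the SDLI estimates):

```python
# Graded response model (Samejima): category probabilities for one Likert item.
# Parameter values below are illustrative, not those estimated in the study.
import numpy as np

def grm_category_probs(theta, a, b):
    """P(response = k | theta) for k = 0..len(b), via differences of
    cumulative 2PL curves P(X >= k) = 1 / (1 + exp(-a (theta - b_k)))."""
    theta = np.atleast_1d(theta)
    cum = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - np.asarray(b)[None, :])))
    upper = np.hstack([np.ones((len(theta), 1)), cum])   # P(X >= 0) = 1
    lower = np.hstack([cum, np.zeros((len(theta), 1))])  # P(X >= K+1) = 0
    return upper - lower                                 # sums to 1 per row

a = 1.8                    # discrimination
b = [-1.5, -0.3, 0.9]      # thresholds for a 4-category item
print(grm_category_probs(0.0, a, b))  # category probabilities at theta = 0
```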

  6. The Father Friendly Initiative within Families: Using a logic model to develop program theory for a father support program.

    Science.gov (United States)

    Gervais, Christine; de Montigny, Francine; Lacharité, Carl; Dubeau, Diane

    2015-10-01

    The transition to fatherhood, with its numerous challenges, has been well documented. Likewise, fathers' relationships with health and social services have also begun to be explored. Yet despite the problems fathers experience in interactions with healthcare services, few programs have been developed for them. To explain this, some authors point to the difficulty practitioners encounter in developing and structuring the theory of programs they are trying to create to promote and support father involvement (Savaya, R., & Waysman, M. (2005). Administration in Social Work, 29(2), 85), even when such theory is key to a program's effectiveness (Chen, H.-T. (2005). Practical program evaluation. Thousand Oaks, CA: Sage Publications). The objective of the present paper is to present a tool, the logic model, to bridge this gap and to equip practitioners for structuring program theory. This paper addresses two questions: (1) What would be a useful instrument for structuring the development of program theory in interventions for fathers? (2) How would the concepts of a father involvement program best be organized? The case of the Father Friendly Initiative within Families (FFIF) program is used to present and illustrate six simple steps for developing a logic model that are based on program theory and demonstrate its relevance. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  7. Tests of Theories of Crime in Female Prisoners.

    Science.gov (United States)

    Lindberg, Marc A; Fugett, April; Adkins, Ashtin; Cook, Kelsey

    2017-02-01

    Several general theories of crime were tested with path models on 293 female prisoners in a U.S. State prison. The theories tested included Social Bond and Control, Thrill/Risk Seeking, and a new attachment-based Developmental Dynamic Systems model. A large battery of different instruments ranging from measures of risk taking, to a crime addiction scale, to Childhood Adverse Events, to attachments and clinical issues were used. The older general theories of crime did not hold up well under the rigor of path modeling. The new dynamic systems model was supported that incorporated adverse childhood events leading to (a) peer crime, (b) crime addiction, and (c) a measure derived from the Attachment and Clinical Issues Questionnaire (ACIQ) that takes individual differences in attachments and clinical issues into account. The results were discussed in terms of new approaches to Research Defined Criteria of Diagnosis (RDoC) and new approaches to intervention.

  8. A 'theory of everything'? [Extending the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate ''theory of everything''. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  9. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is discussed. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.

  10. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
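
    The instrumental-variable logic the abstract builds on is easy to demonstrate in the linear case. A minimal hand-rolled two-stage least squares sketch on simulated data with an unmeasured confounder (illustrative only; the paper's estimators for generalized linear models are more involved):

```python
# Two-stage least squares on simulated data with an unmeasured confounder U.
# Z is a valid instrument: it affects X, but affects Y only through X.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
U = rng.normal(size=n)                # unmeasured confounder
Z = rng.normal(size=n)                # instrument (e.g., genetic variant score)
X = 0.8 * Z + U + rng.normal(size=n)  # exposure
Y = 0.5 * X + U + rng.normal(size=n)  # outcome; true causal effect = 0.5

# Naive OLS of Y on X is biased upward by U.
ols_slope = np.polyfit(X, Y, 1)[0]

# Stage 1: project X on Z.  Stage 2: regress Y on the projection.
Zd = np.column_stack([np.ones(n), Z])
x_hat = Zd @ np.linalg.lstsq(Zd, X, rcond=None)[0]
iv_slope = np.polyfit(x_hat, Y, 1)[0]

print(f"naive OLS: {ols_slope:.3f} (biased), 2SLS: {iv_slope:.3f} (close to 0.5)")
```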

  11. Flyover Modeling of Planetary Pits - Undergraduate Student Instrument Project

    Science.gov (United States)

    Bhasin, N.; Whittaker, W.

    2015-12-01

    On the surface of the moon and Mars there are hundreds of skylights, which are collapsed holes that are believed to lead to underground caves. This research uses Vision, Inertial, and LIDAR sensors to build a high resolution model of a skylight as a landing vehicle flies overhead. We design and fabricate a pit modeling instrument to accomplish this task, implement software, and demonstrate sensing and modeling capability on a suborbital reusable launch vehicle flying over a simulated pit. Future missions on other planets and moons will explore pits and caves, led by the technology developed by this research. Sensor software utilizes modern graph-based optimization techniques to build 3D models using camera, LIDAR, and inertial data. The modeling performance was validated with a test flyover of a planetary skylight analog structure on the Masten Xombie sRLV. The trajectory profile closely follows that of autonomous planetary powered descent, including translational and rotational dynamics as well as shock and vibration. A hexagonal structure made of shipping containers provides a terrain feature that serves as an appropriate analog for the rim and upper walls of a cylindrical planetary skylight. The skylight analog floor, walls, and rim are modeled in elevation with a 96% coverage rate at 0.25 m² resolution. The inner skylight walls have 5.9 cm² color image resolution and the rims are 6.7 cm², with measurement precision superior to 1 m. The multidisciplinary student team included students of all experience levels, with backgrounds in robotics, physics, computer science, systems, mechanical and electrical engineering. The team was committed to authentic scientific experimentation, and defined specific instrument requirements and measurable experiment objectives to verify successful completion. This work was made possible by the NASA Undergraduate Student Instrument Project Educational Flight Opportunity 2013 program. Additional support was provided by the sponsorship of an

  12. Effective field theory and the quark model

    International Nuclear Information System (INIS)

    Durand, Loyal; Ha, Phuoc; Jaczko, Gregory

    2001-01-01

    We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections

  13. Minisuperspace models in histories theory

    International Nuclear Information System (INIS)

    Anastopoulos, Charis; Savvidou, Ntina

    2005-01-01

    We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and estimation of probabilities in the histories context

  14. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing, and pioneers of quantum computing and quantum simulations of quantum spin systems are introduced. The second part is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking 'nuclear' physics and the continuum limit of D-theory models. (nowak)

  15. The instrumentation of fast reactor

    International Nuclear Information System (INIS)

    Endo, Akira

    2003-03-01

    The author has been engaged in the development of fast reactors over the last 30 years, with an involvement in the early technology development of the experimental breeder reactor Joyo and, latterly, continuing this work on the prototype breeder reactor Monju. In order to pass on this experience to younger engineers, this paper outlines that experience in the sincere hope that the information given will be utilised in future educational training material. The paper discusses the wide diversity of the associated instrumentation technology which the fast breeder reactor requires. The first chapter outlines the fast reactor system; reactor instrumentation, measurement principles, temperature dependencies, and the verification of response characteristics are discussed from various viewpoints in chapters two and three. The important issues of failed fuel location detection and sodium leak detection from steam generators are discussed in chapters 4 and 5 respectively. Appended to this report is an explanation of methods for measuring the response characteristics of instrumentation systems using error analysis, random signal theory and the AR (autoregressive) model, which, it appears, is becoming indispensable for persons involved with this technology in the future. (author)
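
    The appendix topic mentioned last (estimating instrument response characteristics with an autoregressive model) can be illustrated in a few lines: fit an AR model to a sensor's operating-noise record and read the dominant dynamics off the fitted coefficients. A generic sketch on simulated data, not reactor measurements:

```python
# Fit an AR model to a simulated sensor signal, a common way to estimate
# response characteristics from operating noise.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(4)
n = 2000
y = np.zeros(n)
for t in range(2, n):  # simulated second-order sensor dynamics
    y[t] = 1.5 * y[t - 1] - 0.7 * y[t - 2] + rng.normal()

res = AutoReg(y, lags=10).fit()
# res.params holds the constant followed by lag coefficients;
# lags 1 and 2 should recover roughly 1.5 and -0.7.
print(np.round(res.params[:3], 3))
```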

  16. Matrix model as a mirror of Chern-Simons theory

    International Nuclear Information System (INIS)

    Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun

    2004-01-01

    Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)

  17. A QCD Model Using Generalized Yang-Mills Theory

    International Nuclear Information System (INIS)

    Wang Dianfu; Song Heshan; Kou Lina

    2007-01-01

    Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.

  18. Development and validation of an instrument to assess perceived social influence on health behaviors

    Science.gov (United States)

    HOLT, CHERYL L.; CLARK, EDDIE M.; ROTH, DAVID L.; CROWTHER, MARTHA; KOHLER, CONNIE; FOUAD, MONA; FOUSHEE, RUSTY; LEE, PATRICIA A.; SOUTHWARD, PENNY L.

    2012-01-01

    Assessment of social influence on health behavior is often approached through a situational context. The current study adapted an existing, theory-based instrument from another content domain to assess Perceived Social Influence on Health Behavior (PSI-HB) among African Americans, using an individual difference approach. The adapted instrument was found to have high internal reliability (α = .81–.84) and acceptable test-retest reliability (r = .68–.85). A measurement model revealed a three-factor structure and supported the theoretical underpinnings. Scores were predictive of health behaviors, particularly among women. Future research using the new instrument may have applied value in assessing social influence in the context of health interventions. PMID:20522506

  19. Item level diagnostics and model - data fit in item response theory ...

    African Journals Online (AJOL)

    Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT an advantage over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed for further use of the items and the models that best fit ...

  20. Asteroid electrostatic instrumentation and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Aplin, K L; Bowles, N E; Urbak, E [Department of Physics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Keane, D; Sawyer, E C, E-mail: k.aplin1@physics.ox.ac.uk [RAL Space, R25, Harwell Oxford, Didcot OX11 0QX (United Kingdom)

    2011-06-23

    Asteroid surface material is expected to become photoelectrically charged, and is likely to be transported through electrostatic levitation. Understanding any movement of the surface material is relevant to proposed space missions to return samples to Earth for detailed isotopic analysis. Motivated by preparations for the Marco Polo sample return mission, we present electrostatic modelling for a real asteroid, Itokawa, for which detailed shape information is available, and verify that charging effects are likely to be significant at the terminator and at the edges of shadow regions for the Marco Polo baseline asteroid, 1999JU3. We also describe the Asteroid Charge Experiment electric field instrumentation intended for Marco Polo. Finally, we find that the differing asteroid and spacecraft potentials on landing could perturb sample collection for the short landing time of 20 min that is currently planned.

  1. Advanced instrumentation for Solar System gravitational physics

    Science.gov (United States)

    Peron, Roberto; Bellettini, G.; Berardi, S.; Boni, A.; Cantone, C.; Coradini, A.; Currie, D. G.; Dell'Agnello, S.; Delle Monache, G. O.; Fiorenza, E.; Garattini, M.; Iafolla, V.; Intaglietta, N.; Lefevre, C.; Lops, C.; March, R.; Martini, M.; Nozzoli, S.; Patrizi, G.; Porcelli, L.; Reale, A.; Santoli, F.; Tauraso, R.; Vittori, R.

    2010-05-01

    The Solar System is a complex laboratory for testing gravitational physics. Indeed, its scale and hierarchical structure make possible a wide range of tests for gravitational theories, studying the motion of both natural and artificial objects. The usual methodology makes use of tracking information related to the bodies, fitted by a suitable dynamical model. Different equations of motion are provided by different theories, which can therefore be tested and compared. Future exploration scenarios show the possibility of placing deep-space probes near the Sun or in the outer Solar System, thereby extending the available experimental data sets. In particular, the Earth-Moon system is the most accurately known gravitational three-body laboratory, which is undergoing a new, strong wave of research and exploration (both robotic and manned). In addition, the benefits of a synergetic study of planetary science and gravitational physics are of the greatest importance (as shown by the success of the Apollo program), especially in the Earth-Moon, Mars-Phobos, Jovian and Saturnian subsystems. These scenarios open critical issues regarding the quality of the available dynamical models, i.e. their capability of fitting data without an excessive number of empirical hypotheses. A typical case is represented by the non-gravitational phenomena, which in general are difficult to model. More generally, gravitation tests with Lunar Laser Ranging, inner or outer Solar System probes and the appearance of the so-called 'anomalies' (like the one indicated by the Pioneers), whatever their real origin (either instrumental effects or new physics), show the necessity of a coordinated improvement of tracking and modelization techniques. A common research path will be discussed, employing the development and use of advanced instrumentation to cope with current limitations of Solar System gravitational tests. In particular, the use of high-sensitivity accelerometers, combined with microwave and laser

  2. Notes on instrumentation and control

    CERN Document Server

    Roy, G J

    2013-01-01

    Notes on Instrumentation and Control presents topics on pressure (i.e., U-tube manometers and elastic type gauges), temperature (i.e. glass thermometer, bi-metallic strip thermometer, filled system thermometer, vapor pressure thermometer), level, and flow measuring devices. The book describes other miscellaneous instruments, signal transmitting devices, supply and control systems, and monitoring systems. The theory of automatic control and semi-conductor devices are also considered. Marine engineers will find the book useful.

  3. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and exogeneity in the economic model ... are related to expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed....
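
    For readers wanting to reproduce the econometric side of such an exercise, the cointegrating rank of a VAR is commonly estimated with the Johansen procedure. A minimal sketch using statsmodels on simulated data (the paper's economic examples and restrictions are not reproduced here):

```python
# Johansen cointegration test on a simulated bivariate system that shares
# one common stochastic trend (so the expected cointegrating rank is 1).
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(1)
n = 500
trend = np.cumsum(rng.normal(size=n))          # common random walk
y1 = trend + rng.normal(scale=0.5, size=n)
y2 = 0.7 * trend + rng.normal(scale=0.5, size=n)
data = np.column_stack([y1, y2])

res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", res.lr1)            # one per hypothesized rank
print("critical values (90/95/99%):", res.cvt)
```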

  4. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    Science.gov (United States)

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
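
    Several of the classical-test-theory checks listed in this abstract (response-category frequencies, floor and ceiling effects, item-total correlation, internal consistency) are simple to compute. A minimal sketch over a generic item-response matrix (illustrative; no PRO dataset accompanies this record):

```python
# Classical test theory descriptives for a k-item scale.
# Rows = respondents, columns = items, values = ordinal responses.
import numpy as np

def ctt_summary(X):
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    total = X.sum(axis=1)
    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
    alpha = k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / total.var(ddof=1))
    # corrected item-total correlation: item vs. total of the remaining items
    item_total = [np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(k)]
    # floor/ceiling: share of respondents at the observed extremes of the scale
    return {
        "alpha": round(alpha, 3),
        "item_total_r": np.round(item_total, 3),
        "floor_pct": 100 * np.mean(total == X.min() * k),
        "ceiling_pct": 100 * np.mean(total == X.max() * k),
    }

rng = np.random.default_rng(2)
fake = rng.integers(1, 6, size=(200, 5))  # fake 5-item Likert data
print(ctt_summary(fake))
```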

  5. A decision model for financial assurance instruments in the upstream petroleum sector

    International Nuclear Information System (INIS)

    Ferreira, Doneivan; Suslick, Saul; Farley, Joshua; Costanza, Robert; Krivov, Sergey

    2004-01-01

    The main objective of this paper is to deepen the discussion regarding the application of financial assurance instruments, bonds, in the upstream oil sector. This paper will also attempt to explain the current choice of instruments within the sector. The concepts of environmental damages and internalization of environmental and regulatory costs will be briefly explored. Bonding mechanisms are presently being adopted by several governments with the objective of guaranteeing the availability of funds for end-of-leasing operations. Regulators are mainly concerned with the prospect of inheriting liabilities from lessees. Several forms of bonding instruments currently available were identified and a new instrument classification was proposed. Ten commonly used instruments were selected and analyzed under the perspective of both regulators and industry (surety, paid-in and periodic-payment collateral accounts, letters of credit, self-guarantees, investment grade securities, real estate collaterals, insurance policies, pools, and special funds). A multiattribute value function model was then proposed to examine current instrument preferences. Preliminary simulations confirm the current scenario where regulators are likely to require surety bonds, letters of credit, and periodic payment collateral account tools
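
    The multiattribute value function mentioned at the end of the abstract is, in its simplest additive form, a weighted sum of normalized attribute scores for each instrument. A toy sketch with invented weights and scores (the paper's actual attributes and elicited weights are not reproduced in this record):

```python
# Additive multiattribute value model for ranking financial assurance
# instruments. Weights and 0-1 attribute scores are invented for illustration.
attributes = ["cost_to_operator", "security_for_regulator",
              "liquidity", "ease_of_administration"]
weights = {"cost_to_operator": 0.30, "security_for_regulator": 0.40,
           "liquidity": 0.15, "ease_of_administration": 0.15}

instruments = {
    "surety bond":        {"cost_to_operator": 0.7, "security_for_regulator": 0.8,
                           "liquidity": 0.6, "ease_of_administration": 0.7},
    "letter of credit":   {"cost_to_operator": 0.6, "security_for_regulator": 0.9,
                           "liquidity": 0.8, "ease_of_administration": 0.6},
    "paid-in collateral": {"cost_to_operator": 0.2, "security_for_regulator": 1.0,
                           "liquidity": 0.9, "ease_of_administration": 0.5},
}

def value(scores):
    # V(x) = sum_i w_i * v_i(x_i), with the weights summing to 1
    return sum(weights[a] * scores[a] for a in attributes)

for name, scores in sorted(instruments.items(), key=lambda kv: -value(kv[1])):
    print(f"{name:20s} V = {value(scores):.2f}")
```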

  6. Reconstructing bidimensional scalar field theory models

    International Nuclear Information System (INIS)

    Flores, Gabriel H.; Svaiter, N.F.

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtained two new models starting from the Morse and Scarf II hyperbolic potentials, the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)

  7. Quantum-like model of unconscious–conscious dynamics

    Science.gov (United States)

    Khrennikov, Andrei

    2015-01-01

    We present a quantum-like model of sensation–perception dynamics (originating in Helmholtz's theory of unconscious inference) based on the theory of quantum apparatuses and instruments. We illustrate our approach with the model of bistable perception of a particular ambiguous figure, the Schröder stair. This is a concrete model for unconscious and conscious processing of information and their interaction. The starting point of our quantum-like journey was the observation that perception dynamics is essentially contextual, which implies the impossibility of (straightforward) embedding of experimental statistical data in the classical (Kolmogorov, 1933) framework of probability theory. This motivates the application of nonclassical probabilistic schemes. And the quantum formalism provides a variety of well-approved and mathematically elegant probabilistic schemes to handle the results of measurements. The theory of quantum apparatuses and instruments is the most general quantum scheme describing measurements, and it is natural to explore it to model the sensation–perception dynamics. In particular, this theory provides the scheme of indirect quantum measurements, which we apply to model unconscious inference leading to the transition from sensations to perceptions. PMID:26283979
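
    For reference, the 'instrument' formalism invoked here has a compact standard definition: a measurement with outcomes k is described by completely positive maps whose sum is trace-preserving. Sketched below in standard notation (textbook material, not specific to this paper):

```latex
% A quantum instrument: completely positive maps \{\mathcal{I}_k\}, one per
% outcome k, with \sum_k \mathcal{I}_k trace-preserving.
p(k \mid \rho) = \operatorname{tr}\!\big[\mathcal{I}_k(\rho)\big], \qquad
\rho \mapsto \frac{\mathcal{I}_k(\rho)}{p(k \mid \rho)}
\quad \text{(state update on outcome } k\text{)}.
% Example: the L\"uders instrument of a projective measurement \{P_k\}
% is \mathcal{I}_k(\rho) = P_k \rho P_k .
```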

  8. Two-matrix models and c = 1 string theory

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong Chuansheng

    1994-05-01

    We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)

  9. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  10. A simple model explaining super-resolution in absolute optical instruments

    Science.gov (United States)

    Leonhardt, Ulf; Sahebdivan, Sahar; Kogan, Alex; Tyc, Tomáš

    2015-05-01

    We develop a simple, one-dimensional model for super-resolution in absolute optical instruments that is able to describe the interplay between sources and detectors. Our model explains the subwavelength sensitivity of a point detector to a point source reported in previous computer simulations and experiments (Miñano 2011 New J. Phys. 13 125009; Miñano 2014 New J. Phys. 16 033015).

  11. Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory

    International Nuclear Information System (INIS)

    Chung, S.; Tye, S.H.

    1993-01-01

    The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory

  12. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  13. Functional capabilities of the breadboard model of SIDRA satellite-borne instrument

    International Nuclear Information System (INIS)

    Dudnik, O.V.; Kurbatov, E.V.; Titov, K.G.; Prieto, M.; Sanchez, S.; Sylwester, J.; Gburek, S.; Podgorski, P.

    2013-01-01

    This paper presents the structure, principles of operation and functional capabilities of the breadboard model of SIDRA compact satellite-borne instrument. SIDRA is intended for monitoring fluxes of high-energy charged particles under outer-space conditions. We present the reasons to develop a particle spectrometer and we list the main objectives to be achieved with the help of this instrument. The paper describes the major specifications of the analog and digital signal processing units of the breadboard model. A specially designed and developed data processing module based on the Actel ProAsic3E A3PE3000 FPGA is presented and compared with the all-in one digital processing signal board based on the Xilinx Spartan 3 XC3S1500 FPGA.

  14. Crisis in Context Theory: An Ecological Model

    Science.gov (United States)

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  15. Chern-Simons Theory, Matrix Models, and Topological Strings

    International Nuclear Information System (INIS)

    Walcher, J

    2006-01-01

    This book is a find. Mariño meets the challenge of filling, in less than 200 pages, the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals, including perturbative expansion, large-N approximation, saddle-point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may

  16. Increasing Laser Stability with Improved Electronic Instruments

    Science.gov (United States)

    Troxel, Daylin; Bennett, Aaron; Erickson, Christopher J.; Jones, Tyler; Durfee, Dallin S.

    2010-03-01

    We present several electronic instruments developed to implement an ultra-stable laser lock. These instruments include a high speed, low noise homodyne photo-detector; an ultrahigh stability, low noise current driver with high modulation bandwidth and digital control; a high-speed, low noise PID controller; a low-noise piezo driver; and a laser diode temperature controller. We will present the theory of operation for these instruments, design and construction techniques, and essential characteristics for each device.

  17. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems-biology computational modules.

  18. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors and exogeneity in the econo… Parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. The general-partial equilibrium distinction is also discussed.
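
    As a rough illustration of the CVAR mapping described above, the sketch below simulates two I(1) series that share a common stochastic trend and fits a vector error-correction model with statsmodels; the estimated beta column plays the role of the cointegrating (economic) relation. The data, lag order, and cointegration rank are illustrative assumptions, not taken from the paper.

    ```python
    # Sketch: fitting a cointegrated VAR (VECM) and inspecting the
    # cointegrating vector, in the spirit of mapping economic relations
    # onto CVAR restrictions. All numbers are simulated and illustrative.
    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import VECM

    rng = np.random.default_rng(0)
    T = 400
    # Two I(1) series sharing one stochastic trend, so the single
    # cointegrating relation (roughly y1 - y2) is stationary.
    trend = np.cumsum(rng.normal(size=T))
    y1 = trend + rng.normal(scale=0.5, size=T)
    y2 = trend + rng.normal(scale=0.5, size=T)
    data = np.column_stack([y1, y2])

    res = VECM(data, k_ar_diff=1, coint_rank=1).fit()
    print("cointegrating vector (beta):", res.beta.ravel())
    print("adjustment coefficients (alpha):", res.alpha.ravel())
    ```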

  19. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    Science.gov (United States)

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

    This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred and twelve inpatients with IBS provided quality-of-life data at three time points, before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and also G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at the two measurements was higher than 0.70, except for the social domain (0.55 and 0.67, respectively). The overall score and scores for all domains/facets had statistically significant changes after treatments, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indices of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness and can be used as a quality-of-life instrument for patients with IBS.
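
    For readers unfamiliar with the classical-test-theory statistics cited above, the sketch below computes Cronbach's α (internal consistency) and the standardized response mean (responsiveness) on simulated item data; the sample size of 112 mirrors the study, but all scores are synthetic.

    ```python
    # Sketch: two classical-test-theory statistics of the kind reported
    # for the QLICD-IBS, computed on simulated data.
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_subjects, n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    def srm(pre, post):
        """Standardized response mean: mean change / SD of change."""
        change = np.asarray(post, float) - np.asarray(pre, float)
        return change.mean() / change.std(ddof=1)

    rng = np.random.default_rng(1)
    trait = rng.normal(size=112)                  # latent QOL level
    items = trait[:, None] + rng.normal(scale=0.8, size=(112, 8))
    post = items.sum(1) + 0.9 + rng.normal(scale=1.0, size=112)
    print("alpha:", round(cronbach_alpha(items), 2))
    print("SRM  :", round(srm(items.sum(1), post), 2))
    ```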

  20. The dynamic model of choosing an external funding instrument

    Directory of Open Access Journals (Sweden)

    Irena HONKOVA

    2015-06-01

    Making a decision about using a specific funding source is one of the most important tasks of financial management. The utilization of external sources offers numerous advantages, yet staying aware of diverse funding options is not easy for financial managers. Today it is crucial to identify an optimum possibility quickly and to make sure that all relevant criteria have been considered and no variant has been omitted. Over the long term it is also necessary to consider the dimension of time, as changes made today affect not only the current variables but also have a significant impact on the future. This article aims to identify the most suitable model of choosing external funding sources that would describe the dynamics involved. The first part of the paper considers the theoretical background of external funding instruments and of decision criteria. The making of financial decisions is a process consisting of weighing the most suitable variants, selecting the best variant, and controlling the implementation of accepted proposals. The second part analyses the results of the research: the decisive weights of the criteria. A model based on the principal criterion, the Weighted Average Cost of Capital (the dynamic WACC model), is then created, and finally the Dynamic Model of Choosing an External Funding Instrument is constructed. The resulting decision-making model facilitates the modeling of changes in time, because in the contemporary turbulent world it is crucial to know what future consequences lie in decisions made today. Each variant features possible negative and positive changes of varying extent, and the possibility to simulate these changes can illustrate an optimal variant to a decision-maker.

  1. Instrumental analysis, second edition

    International Nuclear Information System (INIS)

    Christian, G.D.; O'Reilly, J.E.

    1988-01-01

    The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis.

  2. Risk Implications of Energy Policy Instruments

    DEFF Research Database (Denmark)

    Kitzing, Lena

    The central research questions are: how can risk implications of RES policy instruments be integrated into policy design, so that the policies provide adequate investment incentives? And can the consideration of such risk implications in policy design make overall energy policy more successful? These questions are answered in seven research papers (four journal papers, two conference papers and a working paper), based on a combination of micro-economic and policy analysis. Financial theory is used for the quantitative analysis of investment problems under uncertainty, including mean-variance portfolio theory, real option analysis, Monte Carlo simulations and time series analysis. The results show, both qualitatively and quantitatively, that policy makers cannot neglect risk implications when designing RES support instruments without compromising either on effectiveness or cost-efficiency of energy policy.

  3. A Leadership Identity Development Model: Applications from a Grounded Theory

    Science.gov (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  4. Electromigration techniques theory and practice

    CERN Document Server

    Dziubakiewicz, Ewelina; Szumski, Michal

    2013-01-01

    The book provides broad knowledge of electromigration techniques, including: the theory of CE; a description of instrumentation; theory and practice in micellar electrokinetic chromatography, isotachophoresis, capillary isoelectric focusing, and capillary and planar electrochromatography (including a description of instrumentation and of packed and monolithic column preparation); 2D gel electrophoresis (including sample preparation); and lab-on-a-chip systems. The book also provides the most recent examples of applications, including food, environmental and pharmaceutical analysis, as well as proteomics.

  5. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  6. Topos models for physics and topos theory

    International Nuclear Information System (INIS)

    Wolters, Sander

    2014-01-01

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos

  7. Finite Unification: Theory, Models and Predictions

    CERN Document Server

    Heinemeyer, S; Zoupanos, G

    2011-01-01

    All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...

  8. Scattering and short-distance properties in field theory models

    International Nuclear Information System (INIS)

    Iagolnitzer, D.

    1987-01-01

    The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with a mass gap, such as typically P(φ) in dimension 2, φ^4 in dimension 3, and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non-renormalizable) theories that might be defined in a similar way via phase-space analysis.

  9. Finite Difference Time Domain Modeling at USA Instruments, Inc.

    Science.gov (United States)

    Curtis, Richard

    2003-10-01

    Due to the competitive nature of the commercial MRI industry, it is essential for the financial health of a participating company to innovate new coil designs and bring product to market rapidly in response to ever-changing market conditions. However, the technology of MRI coil design is still early in its stage of development and its principles are still evolving. As a result, it is not always possible to know the relevant electromagnetic effects of a given design, since the interaction of coil elements is complex and often counter-intuitive. Even if the effects are known qualitatively, the quantitative results are difficult to obtain. At USA Instruments, Inc., the acquisition of the XFDTD electromagnetic simulation tool from REMCOM, Inc., has been helpful in determining the electromagnetic performance characteristics of existing coil designs in the prototype stage, before the coils are released for production. In the ideal case, a coil design would be modeled earlier, at the conceptual stage, so that only good designs make it to the prototyping stage and the electromagnetic characteristics are better understood very early in the design process, before testing has begun. This paper is a brief overview of using FDTD modeling for MRI coil design at USA Instruments, Inc., and shows some of the highlights of recent FDTD modeling efforts on birdcage coils, a staple of the MRI coil design portfolio.
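
    As a minimal illustration of the method itself (not of any USA Instruments coil model), the following sketch implements the standard 1-D FDTD update loop in normalized units with a soft Gaussian source; the grid size, number of time steps, and source parameters are arbitrary.

    ```python
    # Sketch: 1-D FDTD (Yee scheme) in normalized units; a Gaussian
    # pulse propagates through free space on a staggered grid.
    import numpy as np

    nz, nt = 400, 1000
    ez = np.zeros(nz)          # electric field samples
    hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
    courant = 0.5              # normalized time step (stability needs <= 1)

    for n in range(nt):
        hy += courant * (ez[1:] - ez[:-1])            # update H from curl E
        ez[1:-1] += courant * (hy[1:] - hy[:-1])      # update E from curl H
        ez[nz // 4] += np.exp(-((n - 40) / 12.0) ** 2)  # soft Gaussian source
    ```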

  10. Methods and Models of Market Risk Stress-Testing of the Portfolio of Financial Instruments

    Directory of Open Access Journals (Sweden)

    Alexander M. Karminsky

    2015-01-01

    Amid the instability of financial markets and the macroeconomic situation, the necessity of improving bank risk-management instruments arises. The new economic reality defines the need to search for more advanced approaches to estimating banks' vulnerability to exceptional but plausible events. Stress-testing belongs to such instruments. The paper reviews and compares models of market risk stress-testing for portfolios of different financial instruments. The topic is highly relevant because stress-testing is becoming an integral part of anti-crisis risk management amid macroeconomic instability and the appearance of new risks, together with close interest in the problem of risk aggregation. The paper outlines the notion of stress-testing, covers the goals and functions of stress-tests and the main criteria for classifying market risk stress-tests, and stresses special aspects of scenario analysis. The novelty of the research lies in elaborating a programme of aggregated, complex, multifactor stress-testing of portfolio risk based on scenario analysis. The paper highlights modern Russian and foreign models of stress-testing, both on a solo basis and complex. Emphasis is laid on the results of stress-testing and the revaluation of positions for all three complex models: the Central Bank's methodology for stress-testing portfolio risk, a model relying on correlation analysis, and a copula model. The models of stress-testing on a solo basis differ for each financial instrument: a parametric StressVaR model is applicable to stress-testing shares and options; a model based on "Greek" indicators is used for options; and for Eurobonds a regional factor model is used. Finally, some theoretical recommendations on managing the market risk of the portfolio are given.
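
    The sketch below illustrates the general shape of a Monte Carlo market-risk stress test: the 99% one-day VaR of a two-asset portfolio is recomputed under stressed volatility and correlation. The portfolio weights, volatilities, and stress multipliers are invented for illustration and do not reproduce any model from the paper.

    ```python
    # Sketch: Monte Carlo stress-test of portfolio VaR under a
    # volatility/correlation stress scenario. Numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    weights = np.array([0.6, 0.4])
    mu = np.array([0.0002, 0.0001])            # daily mean returns
    vol = np.array([0.01, 0.015])              # baseline daily volatilities

    def var_99(stress_vol=1.0, rho=0.3, n=100_000):
        corr = np.array([[1.0, rho], [rho, 1.0]])
        cov = corr * np.outer(vol, vol) * stress_vol**2
        r = rng.multivariate_normal(mu, cov, size=n) @ weights
        return -np.quantile(r, 0.01)           # 99% one-day VaR

    print("baseline VaR:", round(var_99(), 5))
    print("stressed VaR:", round(var_99(stress_vol=3.0, rho=0.9), 5))
    ```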

  11. An instrumental electrode model for solving EIT forward problems.

    Science.gov (United States)

    Zhang, Weida; Li, David

    2014-10-01

    An instrumental electrode model (IEM) capable of describing the performance of electrical impedance tomography (EIT) systems in the MHz frequency range has been proposed. Compared with the commonly used Complete Electrode Model (CEM), which assumes ideal front-end interfaces, the proposed model considers the effects of non-ideal components in the front-end circuits. This introduces an extra boundary condition in the forward model and offers more accurate modelling of EIT systems. We have demonstrated its performance using simple geometries and compared the results with the CEM and full Maxwell methods. The IEM can provide a significantly more accurate approximation than the CEM in the MHz frequency range, where the full Maxwell methods are favoured over the quasi-static approximation. The improved electrode model will facilitate the future characterization and front-end design of real-world EIT systems.

  12. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Vol. 40, No. 6 (2011), pp. 623-678. ISSN 0308-1079. R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010. Institutional research plan: CEZ:AV0Z10750506. Keywords: multidimensional probability distribution; conditional independence; graphical Markov model; composition of distributions. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.667, year: 2011. http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  13. Design, Modelling and Teleoperation of a 2 mm Diameter Compliant Instrument for the da Vinci Platform.

    Science.gov (United States)

    Francis, P; Eastwood, K W; Bodani, V; Looi, T; Drake, J M

    2018-05-07

    This work explores the feasibility of creating and accurately controlling an instrument for robotic surgery with a 2 mm diameter and a three degree-of-freedom (DoF) wrist which is compatible with the da Vinci platform. The instrument's wrist is composed of a two DoF bending notched-nitinol tube pattern, for which a kinematic model has been developed. A base mechanism for controlling the wrist is designed for integration with the da Vinci Research Kit. A basic teleoperation task is successfully performed using two of the miniature instruments. The performance and accuracy of the instrument suggest that creating and accurately controlling a 2 mm diameter instrument is feasible and the design and modelling proposed in this work provide a basis for future miniature instrument development.

  14. Robust global identifiability theory using potentials--Application to compartmental models.

    Science.gov (United States)

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

    This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the models identified from the differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3%, which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method that allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens to within 9% of the true curve.

  15. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  16. Modelling non-ignorable missing data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cornelis A.W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  17. Modelling non-ignorable missing-data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cees A. W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  18. Nematic elastomers: from a microscopic model to macroscopic elasticity theory.

    Science.gov (United States)

    Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette

    2008-05-01

    A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.

  19. Instrumentation and testing of a prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Hessheimer, M.F.; Pace, D.W.; Klamerus, E.W.

    1997-01-01

    Static overpressurization tests of two scale models of nuclear containment structures - a steel containment vessel (SCV) representative of an improved, boiling water reactor (BWR) Mark II design and a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR) - are being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the U.S. Nuclear Regulatory Commission. This paper discusses plans for instrumentation and testing of the PCCV model. 6 refs., 2 figs., 2 tabs

  20. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game-theoretical tools, highlighting situations where a location decision is faced by several decision makers, which leads to a game-theoretical framework of non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored, and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working on the theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  1. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's hierarchy of needs theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions. It includes eleven validated human needs that evolved over time to eight. It is logical and simple, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology.

  2. Results of the first tests of the SIDRA satellite-borne instrument breadboard model

    International Nuclear Information System (INIS)

    Dudnik, O.V.; Kurbatov, E.V.; Avilov, A.M.; Titov, K.G.; Prieto, M; Sanchez, S.; Spassky, A.V.; Sylwester, J.; Gburek, S.; Podgorski, P.

    2013-01-01

    In this work, the results of the calibration of the solid-state detectors and electronic channels of the SIDRA satellite-borne energetic charged-particle spectrometer-telescope breadboard model are presented. The block schemes and experimental equipment used to conduct the thermal-vacuum and electromagnetic-compatibility tests of the assemblies and modules of the compact satellite equipment are described. The results of the measured thermal conditions of operation of the critical analog and digital signal-processing modules of the SIDRA instrument prototype are discussed. Finally, the levels of conducted interference generated by the instrument model in the primary vehicle-borne power circuits are presented.

  3. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  4. An anthology of theories and models of design philosophy, approaches and empirical explorations

    CERN Document Server

    Blessing, Lucienne

    2014-01-01

    While investigations into both theories and models have remained a major strand of engineering design research, current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; current models of design, from a function-behavior-structure model to an integrated model; important empirical research findings from studies into design; and philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...

  5. The sound of oscillating air jets: Physics, modeling and simulation in flute-like instruments

    Science.gov (United States)

    de La Cuadra, Patricio

    Flute-like instruments share a common mechanism that consists of blowing across one open end of a resonator to produce an air jet that is directed towards a sharp edge. Analysis of its operation involves various research fields including fluid dynamics, aero-acoustics, and physics. An effort has been made in this study to extend this description from instruments with fixed geometry like recorders and organ pipes to flutes played by the lips. An analysis of the jet's response to a periodic excitation is the focus of this study, as are the parameters under the player's control in forming the jet. The jet is excited with a controlled excitation consisting of two loudspeakers in opposite phase. A Schlieren system is used to visualize the jet, and image detection algorithms are developed to extract quantitative information from the images. In order to study the behavior of jets observed in different flute-like instruments, several geometries of the excitation and jet shapes are studied. The obtained data is used to propose analytical models that correctly fit the observed measurements and can be used for simulations. The control exerted by the performer on the instrument is of crucial importance in the quality of the sound produced for a number of flute-like instruments. The case of the transverse flute is experimentally studied. An ensemble of control parameters are measured and visualized in order to describe some aspects of the subtle control attained by an experienced flautist. Contrasting data from a novice flautist are compared. As a result, typical values for several non-dimensional parameters that characterize the normal operation of the instrument have been measured, and data to feed simulations has been collected. The information obtained through experimentation is combined with research developed over the last decades to put together a time-domain simulation. The model proposed is one-dimensional and driven by a single physical input. All the variables in the

  6. An integrative neural model of social perception, action observation, and theory of mind

    Science.gov (United States)

    Yang, Daniel Y.-J.; Rosenblau, Gabriela; Keifer, Cara; Pelphrey, Kevin A.

    2016-01-01

    In the field of social neuroscience, major branches of research have been instrumental in describing independent components of typical and aberrant social information processing, but the field as a whole lacks a comprehensive model that integrates different branches. We review existing research related to the neural basis of three key neural systems underlying social information processing: social perception, action observation, and theory of mind. We propose an integrative model that unites these three processes and highlights the posterior superior temporal sulcus (pSTS), which plays a central role in all three systems. Furthermore, we integrate these neural systems with the dual system account of implicit and explicit social information processing. Large-scale meta-analyses based on Neurosynth confirmed that the pSTS is at the intersection of the three neural systems. Resting-state functional connectivity analysis with 1000 subjects confirmed that the pSTS is connected to all other regions in these systems. The findings presented in this review are specifically relevant for psychiatric research especially disorders characterized by social deficits such as autism spectrum disorder. PMID:25660957

  7. Spin foam model for pure gauge theory coupled to quantum gravity

    International Nuclear Information System (INIS)

    Oriti, Daniele; Pfeiffer, Hendryk

    2002-01-01

    We propose a spin foam model for pure gauge fields coupled to Riemannian quantum gravity in four dimensions. The model is formulated for the triangulation of a four-manifold which is given merely combinatorially. The Riemannian Barrett-Crane model provides the gravity sector of our model and dynamically assigns geometric data to the given combinatorial triangulation. The gauge theory sector is a lattice gauge theory living on the same triangulation and obtains from the gravity sector the geometric information which is required to calculate the Yang-Mills action. The model is designed so that one obtains a continuum approximation of the gauge theory sector at an effective level, similarly to the continuum limit of lattice gauge theory, when the typical length scale of gravity is much smaller than the Yang-Mills scale

  8. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  9. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  10. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and the linear programming problem are closely related subjects, since any computing method devised for ...
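
    The connection this paper points to can be made concrete: the value and optimal mixed strategy of a zero-sum matrix game are obtainable from a linear program. The sketch below does this with scipy.optimize.linprog for an illustrative 2x2 payoff matrix.

    ```python
    # Sketch: solving a zero-sum matrix game by linear programming.
    # The row player maximizes the game value v over mixed strategies x.
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[1.0, -1.0],      # payoffs to the row player
                  [-2.0, 3.0]])
    m, n = A.shape

    # Variables z = (x_1..x_m, v); minimize -v  <=>  maximize v.
    c = np.r_[np.zeros(m), -1.0]
    A_ub = np.c_[-A.T, np.ones(n)]            # v <= (A^T x)_j for each column j
    b_ub = np.zeros(n)
    A_eq = np.r_[np.ones(m), 0.0][None, :]    # probabilities sum to one
    bounds = [(0, None)] * m + [(None, None)] # x >= 0, v free

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    x, v = res.x[:m], res.x[m]
    print("optimal mixed strategy:", x, "game value:", v)  # expects v = 1/7
    ```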

  11. sigma model approach to the heterotic string theory

    International Nuclear Information System (INIS)

    Sen, A.

    1985-09-01

    The relation between the equations of motion for the massless fields in the heterotic string theory and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. The presence of an anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs.

  12. Strategies to Enhance Online Learning Teams. Team Assessment and Diagnostics Instrument and Agent-based Modeling

    Science.gov (United States)

    2010-08-12

    Strategies to Enhance Online Learning Teams: Team Assessment and Diagnostics Instrument and Agent-based Modeling. Tristan E. Johnson, Ph.D. Technical report, August 2010.

  13. INSTRUMENTALISM IN SCIENCE: COMMENTS AND CRITICISMS

    African Journals Online (AJOL)

    Admin

    that guide the scientist in making his decisions, or a perceived system of procedural rules ... to science, information and theories than an ... instrumentalists try to provide the foundation of ... instrumentalism, which are practical rather than.

  14. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of the formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  15. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

    … and underlying features, like the intensity function of the component delays and the delay-power intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework …
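
    A minimal sketch of the Saleh & Valenzuela model viewed as a marked point process is given below: Poisson cluster arrivals, Poisson ray arrivals within clusters, and exponentially decaying power. All rates and decay constants are illustrative, and the model's Rayleigh-distributed gains are simplified to deterministic amplitudes with random sign.

    ```python
    # Sketch: Saleh-Valenzuela channel realization as a doubly stochastic
    # point process. Parameters are illustrative, not from the paper.
    import numpy as np

    rng = np.random.default_rng(7)
    Lam, lam = 0.1, 1.0        # cluster / ray arrival rates (1/ns)
    Gam, gam = 30.0, 5.0       # cluster / ray power decay constants (ns)
    t_max = 200.0

    delays, gains = [], []
    T = 0.0
    while True:                              # cluster arrivals: Poisson(Lam)
        T += rng.exponential(1 / Lam)
        if T > t_max:
            break
        tau = 0.0
        while T + tau < t_max:               # ray arrivals within the cluster
            power = np.exp(-T / Gam) * np.exp(-tau / gam)
            delays.append(T + tau)
            # simplified gain: sqrt(power) with random sign (the full model
            # draws Rayleigh-distributed amplitudes with this mean power)
            gains.append(np.sqrt(power) * rng.choice([-1.0, 1.0]))
            tau += rng.exponential(1 / lam)

    print(f"{len(delays)} multipath components generated")
    ```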

  16. A learning theory account of depression.

    Science.gov (United States)

    Ramnerö, Jonas; Folke, Fredrik; Kanter, Jonathan W

    2015-06-11

    Learning theory provides a foundation for understanding and deriving treatment principles for impacting a spectrum of functional processes relevant to the construct of depression. While behavioral interventions have been commonplace in the cognitive-behavioral tradition, most often conceptualized within a cognitive theoretical framework, recent years have seen renewed interest in more purely behavioral models. These modern learning-theory accounts of depression focus on the interchange between behavior and the environment, mainly in terms of lack of reinforcement, extinction of instrumental behavior, and excesses of aversive control, and include a conceptualization of relevant cognitive and emotional variables. These positions, drawn from extensive basic and applied research, cohere with biological theories on reduced reward learning and reward responsiveness and with views of depression as a heterogeneous, complex set of disorders. Treatment techniques based on learning theory, often labeled Behavioral Activation (BA), focus on activating the individual in directions that increase contact with potential reinforcers, as defined idiographically with the client. BA is considered an empirically well-established treatment that generalizes well across diverse contexts and populations. The learning-theory account is discussed in terms of being a parsimonious model and ground for treatments highly suitable for large-scale dissemination.

  17. Planar N = 4 gauge theory and the Hubbard model

    International Nuclear Information System (INIS)

    Rej, Adam; Serban, Didina; Staudacher, Matthias

    2006-01-01

    Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model

  18. Psychometric Properties of an Instrument Derived from the Intentional Relationship Model: The Self-Efficacy for Recognizing Clients’ Interpersonal Characteristics (N-SERIC)

    Directory of Open Access Journals (Sweden)

    Victoria C. Ritter

    2018-04-01

    Background: The Intentional Relationship Model conceptualizes the therapeutic use of self in occupational therapy. To increase motivation for and success in establishing therapeutic relationships, therapists need self-efficacy for using the self in therapeutic practice. However, attempts to combine this model with self-efficacy theory are rare, and instruments by which to measure self-efficacy for therapeutic use of self are at a developing stage. This study aimed to examine the factor structure and internal consistency of the Norwegian Self-Efficacy for Recognizing Interpersonal Characteristics (N-SERIC) scale. Methods: Occupational therapy students (n = 100) from two education programs completed the instrument and provided sociodemographic information. The factor structure was examined with Principal Components Analysis (PCA), and internal consistency was assessed with Cronbach's α and inter-item correlations. Results: The PCA revealed that all N-SERIC items belonged to the same latent factor, with factor loadings ranging between 0.75 and 0.89. The internal consistency of the scale items was high (Cronbach's α = 0.96). Conclusions: The N-SERIC scale is unidimensional and the items have very high internal consistency. Thus, the scale sum score can be useful for occupational therapy research and audits focusing on interpersonal aspects of practice.

  19. STAKEHOLDER THEORY AND SCHINDLER'S RESCUE WORK

    Directory of Open Access Journals (Sweden)

    Edward Nicodemus Lontah

    2015-04-01

    Studies by Donaldson and Preston have shown that stakeholder theory has a more solid epistemological foundation than shareholder theory for analyzing the business-ethics performance and moral duty of a company. This article discusses the business activities of Oskar Schindler, an industrialist and war profiteer during World War II. Schindler's business, which originally ran under the direction of the Nazi regime, eventually opposed the mission of economic and legal liability imposed by the regime. The transformation of Schindler's vision and business mission demonstrated in this article illustrates the characteristics and connections of the layers in descriptive, instrumental and normative stakeholder theory, in the sense of the concept of "normative, instrumental and descriptive stakeholder theory" according to Donaldson and Preston.

  20. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

    The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs, we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should rather start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it also contains some unphysical terms linear in the gauge field. Advantageously, we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as the gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations, which show the expected behavior. The next step is to show the renormalizability of the model, and some hints in this direction are also given. (author)

  1. A Practitioner's Instrument for Measuring Secondary Mathematics Teachers' Beliefs Surrounding Learner-Centered Classroom Practice.

    Science.gov (United States)

    Lischka, Alyson E; Garner, Mary

    In this paper we present the development and validation of the Mathematics Teaching Pedagogical and Discourse Beliefs Instrument (MTPDBI), a 20-item partial-credit survey designed and analyzed using Rasch measurement theory. Items on the MTPDBI address beliefs about the nature of mathematics, teaching and learning mathematics, and classroom discourse practices. A Rasch partial credit model (Masters, 1982) was estimated from the pilot study data. Results show that item separation reliability is .96 and person separation reliability is .71. Other analyses indicate that the instrument is a viable measure of secondary teachers' beliefs about reform-oriented mathematics teaching and learning. This instrument is proposed as a useful measure of teacher beliefs for those working with pre-service and in-service teacher development.
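
    For reference, the partial credit model (Masters, 1982) underlying the analysis assigns category probabilities proportional to the exponential of cumulative sums of theta minus the step difficulties. The sketch below computes them for one hypothetical item; the threshold values are illustrative, not estimates from the MTPDBI.

    ```python
    # Sketch: category response probabilities under the partial credit
    # model for one item with illustrative step difficulties.
    import numpy as np

    def pcm_probs(theta, deltas):
        """P(X = k | theta) for k = 0..len(deltas)."""
        cum = np.concatenate([[0.0], np.cumsum(theta - np.asarray(deltas))])
        expcum = np.exp(cum - cum.max())     # numerically stabilized softmax
        return expcum / expcum.sum()

    deltas = [-1.0, 0.0, 1.5]                # step difficulties (hypothetical)
    for theta in (-2.0, 0.0, 2.0):
        print(theta, np.round(pcm_probs(theta, deltas), 3))
    ```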

  2. Integrable lambda models and Chern-Simons theories

    International Nuclear Information System (INIS)

    Schmidtt, David M.

    2017-01-01

    In this note we reveal a connection between the phase space of lambda models on S^1 × ℝ and the phase space of double Chern-Simons theories on D × ℝ, and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS_5 × S^5 lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence on the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  3. Models and error analyses of measuring instruments in accountability systems in safeguards control

    International Nuclear Information System (INIS)

    Dattatreya, E.S.

    1977-05-01

    Essentially three types of measuring instruments are used in plutonium accountability systems: (1) bubblers, for measuring the total volume of liquid in the holding tanks; (2) coulometers, titration apparatus and calorimeters, for measuring the concentration of plutonium; and (3) spectrometers, for measuring isotopic composition. These three classes of instruments are modeled and analyzed. Finally, the uncertainty in the estimation of the total plutonium in the holding tank is determined.
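
    A minimal sketch of the kind of uncertainty estimation involved: for an inventory computed as volume times concentration, first-order error propagation adds relative variances. The instrument uncertainties below are invented for illustration and are not values from the report.

    ```python
    # Sketch: first-order variance propagation for a tank inventory
    # M = V * c (volume from the bubbler, concentration from coulometry).
    # Isotopic factors would enter the product the same way.
    import numpy as np

    V, sV = 1000.0, 5.0      # liters; bubbler standard deviation (assumed)
    c, sc = 2.50, 0.02       # g Pu / liter; coulometer std. dev. (assumed)

    M = V * c
    # For a product of independent measurements, relative variances add.
    sM = M * np.sqrt((sV / V) ** 2 + (sc / c) ** 2)
    print(f"Pu inventory: {M:.0f} g +/- {sM:.1f} g")
    ```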

  4. Laser Speckle Contrast Imaging: theory, instrumentation and applications.

    Science.gov (United States)

    Senarathna, Janaka; Rege, Abhishek; Li, Nan; Thakor, Nitish V

    2013-01-01

    Laser Speckle Contrast Imaging (LSCI) is a wide-field-of-view, non-scanning optical technique for observing blood flow. Speckles are produced when coherent light scattered back from biological tissue is diffracted through the limiting aperture of the focusing optics. Mobile scatterers cause the speckle pattern to blur; a model can be constructed by inversely relating the degree of blur, termed speckle contrast, to the scatterer speed. In tissue, red blood cells are the main source of moving scatterers, so blood flow acts as a virtual contrast agent, outlining blood vessels. The spatial resolution (~10 μm) and temporal resolution (10 ms to 10 s) of LSCI can be tailored to the application. Restricted by the penetration depth of light, LSCI can only visualize superficial blood flow, and, due to its non-scanning nature, it is unable to provide depth-resolved images. The simple setup and non-dependence on exogenous contrast agents have made LSCI a popular tool for studying vascular structure and blood flow dynamics. We discuss the theory and practice of LSCI and critically analyze its merit in major areas of application, such as retinal imaging, imaging of skin perfusion, and imaging of neurophysiology.
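
    The core LSCI computation is simple enough to sketch: the speckle contrast K = σ/⟨I⟩ is evaluated over a small sliding window, and lower contrast indicates faster flow. The window size and the synthetic input below are illustrative.

    ```python
    # Sketch: spatial speckle-contrast map K = sigma / mean over a
    # sliding 7x7 window, the core LSCI computation.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast(img, w=7):
        img = img.astype(float)
        mean = uniform_filter(img, w)
        mean_sq = uniform_filter(img**2, w)
        var = np.maximum(mean_sq - mean**2, 0.0)   # clip numerical negatives
        return np.sqrt(var) / np.maximum(mean, 1e-12)

    # Fully developed static speckle has exponential intensity statistics,
    # so the contrast of this synthetic frame should be close to 1.
    raw = np.random.default_rng(3).gamma(shape=1.0, scale=100.0, size=(256, 256))
    K = speckle_contrast(raw)
    print("contrast range:", K.min().round(3), "-", K.max().round(3))
    ```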

  5. Stochastic quantization of field theories on the lattice and supersymmetrical models

    International Nuclear Information System (INIS)

    Aldazabal, Gerardo.

    1984-01-01

    Several aspects of the stochastic quantization method are considered; specifically, field theories on the lattice and supersymmetric models are studied. First, a non-linear sigma model is studied, and it is shown that it is possible to obtain evolution equations written directly for invariant quantities. These ideas are generalized to obtain Langevin equations for the Wilson loops of the non-abelian lattice gauge theories U(N) and SU(N). In order to write these equations, different ways of introducing the constraints which the fields must satisfy are discussed. A strong-coupling expansion arises naturally in these equations. The correspondence with quantum field theory is established, and it is noticed that at all orders in perturbation theory the Langevin equations reduce to the Schwinger-Dyson equations. From another point of view, stochastic quantization is applied to large-N matrix models on the lattice. As a result, a simple and systematic way of building reduced models is found. Regarding stochastic quantization in supersymmetric theories, a simple supersymmetric model is studied. It is shown that it is possible to write an evolution equation for the superfield which leads to quantum field theory results in equilibrium. As the Langevin equation preserves supersymmetry, the property of dimensional reduction known for the quantum model is shown to be valid at all times. (M.E.L.)

  6. Spin foam models of Yang-Mills theory coupled to gravity

    International Nuclear Information System (INIS)

    Mikovic, A

    2003-01-01

    We construct a spin foam model of Yang-Mills theory coupled to gravity by using a discretized path integral of the BF theory with polynomial interactions and the Barrett-Crane ansatz. In the Euclidean gravity case, we obtain a vertex amplitude which is determined by a vertex operator acting on a simple spin network function. The Euclidean gravity results can be straightforwardly extended to the Lorentzian case, so that we propose a Lorentzian spin foam model of Yang-Mills theory coupled to gravity

  7. Working memory: theories, models, and controversies.

    Science.gov (United States)

    Baddeley, Alan

    2012-01-01

    I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.

  8. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex-systems properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach to modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address the particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex-systems properties required to model such transitions.

  9. Goal-directed behaviour and instrumental devaluation: a neural system-level computational model

    Directory of Open Access Journals (Sweden)

    Francesco Mannella

    2016-10-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviours guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on, and shows the computational soundness of, the hypothesis that the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or 'action-outcomes', e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex, acquiring the manipulanda-outcome associations and attributing the current value to the outcomes; (b) the three basal ganglia-cortical loops, selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways, supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and integrates the results of different devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behaviour.

  10. Auto-associative Kernel Regression Model with Weighted Distance Metric for Instrument Drift Monitoring

    International Nuclear Information System (INIS)

    Shin, Ho Cheol; Park, Moon Ghu; You, Skin

    2006-01-01

    Recently, many on-line approaches to instrument channel surveillance (drift monitoring and fault detection) have been reported worldwide. An on-line monitoring (OLM) method evaluates instrument channel performance by assessing its consistency with other plant indications through parametric or non-parametric models. The heart of an OLM system is the model giving an estimate of the true process parameter value against individual measurements. This model gives a process parameter estimate calculated as a function of other plant measurements, which can be used to identify small sensor drifts that would require the sensor to be manually calibrated or replaced. This paper describes an improvement of auto-associative kernel regression (AAKR) obtained by introducing a correlation-coefficient weighting on kernel distances. The prediction performance of the developed method is compared with that of conventional auto-associative kernel regression.
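
    A minimal sketch of the AAKR estimate, with an optional per-sensor weight vector standing in for the correlation-coefficient weighting proposed here; the Gaussian kernel and bandwidth are conventional choices, not taken from the paper.

        import numpy as np

        def aakr_predict(query, memory, bandwidth=1.0, weights=None):
            """memory: (n_obs, n_sensors) fault-free exemplars;
            query: (n_sensors,) current measurement vector;
            weights: per-sensor distance weights (e.g. correlation-derived)."""
            w = np.ones(query.shape[0]) if weights is None else np.asarray(weights)
            d2 = np.sum(w * (memory - query) ** 2, axis=1)   # weighted squared distances
            k = np.exp(-d2 / (2.0 * bandwidth**2))           # Gaussian kernel
            return k @ memory / (np.sum(k) + 1e-12)          # weighted average of exemplars

        # Drift indicator: residual = query - aakr_predict(query, memory)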

  11. Matrix models and stochastic growth in Donaldson-Thomas theory

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)

    2012-10-15

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  12. Matrix models and stochastic growth in Donaldson-Thomas theory

    International Nuclear Information System (INIS)

    Szabo, Richard J.; Tierz, Miguel

    2012-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  13. Soliton excitations in polyacetylene and relativistic field theory models

    International Nuclear Information System (INIS)

    Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM

    1982-01-01

    A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin 1/2 and charge ±e. In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological φ⁴ and continuum electron-phonon models, and the structure of the fully quantum versus mean-field theories. (orig.)

  14. A survey on the modeling and applications of cellular automata theory

    Science.gov (United States)

    Gong, Yimin

    2017-09-01

    Cellular automata theory is a discrete modeling framework that is now widely used in scientific research and simulation. A model comprises cells that change state according to a specific rule over time. This paper provides a survey of the modeling and applications of cellular automata theory, focusing on the program realization of the theory and its application in fields such as road traffic, land use, and cutting machines. Each application is explained further, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.
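
    As a concrete example of the kind of model surveyed, the sketch below runs elementary cellular automaton Rule 184, a minimal road-traffic model in which a car (1) advances iff the cell ahead is empty; it is illustrative and not taken from the survey.

        import numpy as np

        def step_rule184(cells):
            """One synchronous update of Rule 184 on a ring of 0/1 cells."""
            left, right = np.roll(cells, 1), np.roll(cells, -1)
            # A car arrives from the left into an empty cell, or a blocked car stays.
            return (left & (1 - cells)) | (cells & right)

        road = np.random.default_rng(1).integers(0, 2, 40)
        for _ in range(5):
            print("".join(".#"[c] for c in road))
            road = step_rule184(road)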

  15. Introduction to zeolite theory and modelling

    NARCIS (Netherlands)

    Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.

    2001-01-01

    A review. Some of the recent advances in zeolite theory and modeling are presented. In particular, the current status of computational chemistry in Brønsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the

  16. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

    The goal of this lecture is to present a unifying view of the exactly soluble models. There exist several reasons arguing in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectrum and scattering, and some 1+1 dimensional models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relation (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPRs is given, with promising generalizations to FCRs.

  17. Development and validation of the Brazilian version of the Attitudes to Aging Questionnaire (AAQ: An example of merging classical psychometric theory and the Rasch measurement model

    Directory of Open Access Journals (Sweden)

    Trentini Clarissa M

    2008-01-01

    Background: Aging has driven a demographic shift in the world, which is considered both a major societal achievement and a challenge. Aging is primarily a subjective experience, shaped by factors such as gender and culture. There is a lack of instruments to assess attitudes to aging adequately. In addition, no instrument has been developed or validated in a developing-region context, so the particularities of aging in these areas are not captured by the available measures. This paper aims to develop and validate a reliable attitudes-to-aging instrument by combining the classical psychometric approach and Rasch analysis. Methods: The pilot study and field trial are described in detail. Statistical analysis included classical psychometric theory (EFA and CFA) and the Rasch measurement model. The latter was applied to examine unidimensionality, the response scale, and item fit. Results: The sample was composed of 424 Brazilian older adults, which was compared to an international sample (n = 5238). The final instrument shows excellent psychometric performance (discriminant validity, confirmatory factor analysis and Rasch fit statistics). Rasch analysis indicated that modifications in the response scale and item deletions improved the initial solution derived from the classical approach. Conclusion: The combination of classical and modern psychometric theories in a complementary way is fruitful for the development and validation of instruments. The construction of a reliable Brazilian Attitudes to Aging Questionnaire is important for assessing cultural specificities of aging in a transcultural perspective and can be applied in international cross-cultural investigations with less risk of cultural bias.
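
    For readers unfamiliar with the Rasch measurement model used in the analysis, its core is a one-parameter logistic item response function; the sketch below is a generic illustration, not the study's code, and the numbers are arbitrary.

        import numpy as np

        def rasch_prob(theta, b):
            """Probability of endorsing an item of difficulty b for a
            respondent with latent trait level theta (both in logits)."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        print(rasch_prob(theta=0.0, b=1.0))   # ~0.27 for an item 1 logit 'harder'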

  18. Integrable lambda models and Chern-Simons theories

    Energy Technology Data Exchange (ETDEWEB)

    Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)

    2017-05-03

    In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ, and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence on the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.

  19. A dynamical theory for the Rishon model

    International Nuclear Information System (INIS)

    Harari, H.; Seiberg, N.

    1980-09-01

    We propose a composite model for quarks and leptons based on an exact SU(3)_C × SU(3)_H gauge theory and two fundamental J = 1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C × SU(2)_L × SU(2)_R × U(1)_{B-L} gauge theory emerges at the composite level. The theory is "natural", anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several "technicolor" mechanisms are automatically present. (Author)

  20. Self Modeling: Expanding the Theories of Learning

    Science.gov (United States)

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  1. ECONOMIC GROWTH THEORIES, CONCEPTUAL ELEMENTS, CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Florina, POPA

    2014-11-01

    Approaching economic growth involves understanding the concept and its factors, analysing the theories of growth, and tracing their development in the context of economic and social life. Economic growth signifies a process aimed at increasing activity in the national economy, expressed by macroeconomic indicators, namely the dynamics of overall Gross Domestic Product, in total or per inhabitant. In the short term, this process takes the form of phases of economic prosperity; in the long term, it expresses an upward trend, a consequence of the succession of increases and decreases. The study presents some elements which outline the concept of economic growth - definitions, meanings and the main characteristics of growth theories - as well as some of its determinant factors. It also gives a brief overview of the main theories of economic growth as they have evolved over time, in line with the dynamics of economic reality and the development of the instruments of economic analysis, from the classical theories to the new theories and models of economic growth of the modern age.

  2. Membrane models and generalized Z2 gauge theories

    International Nuclear Information System (INIS)

    Lowe, M.J.; Wallace, D.J.

    1980-01-01

    We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in a 1/n expansion. The potential relationships of these models to generalized Z_2 gauge theories are indicated. (orig.)

  3. DEVELOPMENT OF A GAME THEORY MODEL FOR A BUFFER STOCK SCHEME TO GUARANTEE THE AVAILABILITY AND PRICE STABILITY OF THE GRANULATED SUGAR COMMODITY

    Directory of Open Access Journals (Sweden)

    Mahesa Jenar

    2015-06-01

    Staple foods consumed daily by Indonesian society include rice, granulated sugar, and cooking oil. These staples are products of the agricultural sector, i.e., agricultural commodities. They are consumed by the Indonesian population throughout the year in very large quantities, and because they are needed year-round, their availability is extremely important for the people of Indonesia. In order to maintain food security, one way to address this problem is to provide a buffer stock as an instrument for controlling the balance between market supply and demand. Producers, traders, and consumers each employ strategies corresponding to their own interests, so a mathematical approach is needed that can analyze decision-making processes involving two or more interests. In this study, game theory is used to analyze the decision-making of the parties involved. The results show that a game theory model can describe the transaction relationship between producers and consumers in a buffer stock scheme; in addition, game theory can describe several conditions in the buffer stock scheme through the strategies developed.     Abstract: Indonesian society consumes staple commodities such as rice, sugar, and cooking oil. Staple commodities come from agricultural commodities and are consumed by the people of Indonesia throughout the year in very large quantities. Because staple commodities are required throughout the year, their availability is very important for the people of Indonesia. Food security can be ensured by providing a buffer stock as an instrument for controlling the balance of supply and demand. Producers, traders, and consumers each use a strategy corresponding to these interests. It required
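
    To make the game-theoretic setup concrete, the sketch below finds the pure-strategy Nash equilibria of a 2×2 producer-consumer game; the strategies and payoff numbers are hypothetical and not taken from the study.

        import numpy as np

        def pure_nash_2x2(payoff_a, payoff_b):
            """Pure-strategy Nash equilibria of a 2x2 bimatrix game."""
            eqs = []
            for i in range(2):            # row player (producer)
                for j in range(2):        # column player (consumer)
                    best_a = payoff_a[i, j] >= payoff_a[1 - i, j]
                    best_b = payoff_b[i, j] >= payoff_b[i, 1 - j]
                    if best_a and best_b:
                        eqs.append((i, j))
            return eqs

        # Strategy 0 = trade at the buffer-stock price, 1 = deviate from it.
        producer = np.array([[3, 0], [5, 1]])
        consumer = np.array([[3, 5], [0, 1]])
        print(pure_nash_2x2(producer, consumer))   # [(1, 1)]: mutual deviation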

  4. Integrability of a family of quantum field theories related to sigma models

    Energy Technology Data Exchange (ETDEWEB)

    Ridout, David [Australian National Univ., Canberra, ACT (Australia). Dept. of Theoretical Physics; DESY, Hamburg (Germany). Theory Group; Teschner, Joerg [DESY, Hamburg (Germany). Theory Group

    2011-03-15

    A method is introduced for constructing lattice discretizations of large classes of integrable quantum field theories. The method proceeds in two steps: the quantum algebraic structure underlying the integrability of the model is determined from the algebra of the interaction terms in the light-cone representation. The representation theory of the relevant quantum algebra is then used to construct the basic ingredients of the quantum inverse scattering method, the lattice Lax matrices and R-matrices. This method is illustrated with four examples: the Sinh-Gordon model, the affine sl(3) Toda model, a model called the fermionic sl(2|1) Toda theory, and the N=2 supersymmetric Sine-Gordon model. These models are all related to sigma models in various ways. The N=2 supersymmetric Sine-Gordon model, in particular, describes the Pohlmeyer reduction of string theory on AdS₂ × S², and is dual to a supersymmetric non-linear sigma model with a sausage-shaped target space. (orig.)

  5. Models and theories of prescribing decisions: A review and suggested a new model

    Science.gov (United States)

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory approach rather than a theoretical one. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli-response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and ‘social power theory’, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  6. Models and theories of prescribing decisions: A review and suggested a new model

    Directory of Open Access Journals (Sweden)

    Ali Murshid M

    2017-06-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory approach rather than a theoretical one. Therefore, this review is an attempt to suggest a value conceptual model that explains the theoretical linkages existing between marketing efforts, patient and pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli-response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’, and ‘social power theory’, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  7. Expectancy Theory Prediction of the Preference to Remain Employed or to Retire

    Science.gov (United States)

    Eran, Mordechai; Jacobson, Dan

    1976-01-01

    Vroom's expectancy theory model, used to predict older workers' choices between employment and retirement, hypothesized that a person's preference would be a function of the differences between the instrumentality of employment and of retirement for the attainment of outcomes, multiplied by the valence of each outcome and summed over outcomes. Results supported the…
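
    The hypothesized preference index is a simple weighted sum; the sketch below shows the arithmetic with hypothetical outcomes, instrumentalities and valences (the outcome set and scales are illustrative, not the authors').

        def expectancy_preference(instr_work, instr_retire, valence):
            """Sum over outcomes of (I_employment - I_retirement) * valence;
            a positive score favors remaining employed."""
            return sum((iw - ir) * v
                       for iw, ir, v in zip(instr_work, instr_retire, valence))

        # Three made-up outcomes: income, leisure, social contact.
        score = expectancy_preference([0.9, 0.2, 0.7], [0.3, 0.9, 0.4], [7, 5, 6])
        print(score)   # (0.6*7) + (-0.7*5) + (0.3*6) = 2.5 > 0 -> prefer employment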

  8. Instrumentation development

    International Nuclear Information System (INIS)

    Ubbes, W.F.; Yow, J.L. Jr.

    1988-01-01

    Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

  9. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  10. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  11. Cassini Radar EQM Model: Instrument Description and Performance Status

    Science.gov (United States)

    Borgarelli, L.; Faustini, E. Zampolini; Im, E.; Johnson, W. T. K.

    1996-01-01

    The spacecraft of the Cassini Mission is planned to be launched towards Saturn in October 1997. The mission is designed to study the physical structure and chemical composition of Titan. The results of the tests performed on the Cassini radar engineering qualification model (EQM) are summarized. The approach followed in the verification and evaluation of the performance of the radio-frequency subsystem EQM is presented. The results show that the instrument satisfies the relevant mission requirements.

  12. Validation of a Theory of Planned Behavior-Based Questionnaire to Examine Factors Associated With Milk Expression.

    Science.gov (United States)

    Bai, Yeon K; Dinour, Lauren M

    2017-11-01

    A proper assessment of the multidimensional needs of breastfeeding mothers in various settings is crucial to facilitate and support breastfeeding and its exclusivity. The theory of planned behavior (TPB) has been used frequently to measure factors associated with breastfeeding. Full utility of the TPB requires accurate measurement of the theory's constructs. Research aim: This study aimed to develop and confirm the psychometric properties of an instrument, Milk Expression on Campus, based on the TPB, and to establish the reliability and validity of the instrument. In spring 2015, 218 breastfeeding (current or in the recent past) employees and students at one university campus in northern New Jersey completed the online questionnaire containing demographic and theory-based items. Internal consistency (α) and split-half reliability (r) tests and factor analyses established and confirmed the reliability and construct validity of this instrument. Milk Expression on Campus showed strong and significant reliabilities both as a full scale (α = .78, r = .74) and for the theory construct subscales. Validity was confirmed as psychometric properties corresponded to the factors extracted from the scale. Four factors extracted from the direct construct subscales accounted for 79.49% of the total variability. Four distinct factors from the indirect construct subscales accounted for 73.68% of the total variability. Milk Expression on Campus can serve as a model TPB-based instrument to examine factors associated with women's milk expression behavior. The utility of this instrument extends to designing effective promotion programs to foster breastfeeding and milk expression behaviors in diverse settings.
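
    The internal-consistency statistic reported here, Cronbach's α, is computable directly from the item-response matrix; the sketch below is a generic textbook computation, not the study's analysis code.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) matrix of scored responses."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            sum_item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1.0) * (1.0 - sum_item_var / total_var)

        # e.g. alpha = cronbach_alpha(response_matrix)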

  13. The assessment and treatment of prosodic disorders and neurological theories of prosody.

    Science.gov (United States)

    Diehl, Joshua J; Paul, Rhea

    2009-08-01

    In this article, we comment on specific aspects of Peppé (2009). In particular, we address the assessment and treatment of prosody in clinical settings and discuss current theory on neurological models of prosody. We argue that in order for prosodic assessment instruments and treatment programs to be clinically effective, we need assessment instruments that: (1) have a representative normative comparison sample and strong psychometric properties; (2) are based on empirical information regarding the typical sequence of prosodic acquisition and are sensitive to developmental change; (3) meaningfully subcategorize various aspects of prosody; (4) use tasks that have ecological validity; and (5) have clinical properties, such as length and ease of administration, that allow them to become part of standard language assessment batteries. In addition, we argue that current theories of prosody processing in the brain are moving toward network models that involve multiple brain areas and are crucially dependent on cortical communication. The implications of these observations for future research and clinical practice are outlined.

  14. DC servo motor positioning with anti-windup implementation using C2000 ARM-Texas Instrument

    Science.gov (United States)

    Linggarjati, Jimmy

    2017-12-01

    The DC motor is one of the most important topics in control systems. In this research, a positioning control system for a DC motor is investigated. First, the DC motor is parameterized to obtain its transfer function model, which is simulated in Matlab and then implemented on a C2000 ARM microcontroller from TI (Texas Instruments). Through this investigation, students of control system theory can appreciate the importance of classical control theory for the real-world implementation of DC motor position control, especially the importance of the anti-windup technique in practical implementations.
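
    The anti-windup idea itself is small enough to sketch: freeze the integrator whenever the actuator is saturated and the error would push it further into saturation. Below is a generic PI update with conditional integration, written in Python for readability rather than the C targeted at the C2000; all gains and limits are placeholders.

        def pi_antiwindup(setpoint, measurement, state, kp=2.0, ki=0.5,
                          dt=0.001, u_min=-12.0, u_max=12.0):
            """One PI update with conditional integration (anti-windup)."""
            error = setpoint - measurement
            u = kp * error + ki * state["integral"]
            u_sat = min(max(u, u_min), u_max)
            # Integrate only when unsaturated, or when the error unwinds saturation.
            if u == u_sat or (u > u_max and error < 0) or (u < u_min and error > 0):
                state["integral"] += error * dt
            return u_sat

        state = {"integral": 0.0}
        duty = pi_antiwindup(setpoint=100.0, measurement=42.0, state=state)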

  15. Use of instruments to evaluate leadership in nursing and health services.

    Science.gov (United States)

    Carrara, Gisleangela Lima Rodrigues; Bernardes, Andrea; Balsanelli, Alexandre Pazetto; Camelo, Silvia Helena Henriques; Gabriel, Carmen Silvia; Zanetti, Ariane Cristina Barboza

    2018-03-12

    To identify the available scientific evidence about the use of instruments for the evaluation of leadership in health and nursing services, and to verify the use of leadership styles/models/theories in the construction of these tools. An integrative review of the literature indexed in the LILACS, PUBMED, CINAHL and EMBASE databases from 2006 to 2016 was conducted. Thirty-eight articles were analyzed, yielding 19 leadership evaluation tools; the most used were the Multifactor Leadership Questionnaire, the Global Transformational Leadership Scale, the Leadership Practices Inventory, the Servant Leadership Questionnaire, the Servant Leadership Survey and the Authentic Leadership Questionnaire. The literature search made it possible to identify the main theories/styles/models of contemporary leadership and to analyze their use in the design of leadership evaluation tools, with the transformational, situational, servant and authentic leadership categories standing out as the most prominent. To a lesser extent, the quantum, charismatic and clinical leadership types were evidenced.

  16. Development of an analytical theory to describe the PNAR and CIPN nondestructive assay techniques

    International Nuclear Information System (INIS)

    Bolind, Alan Michael

    2014-01-01

    Highlights: • Neutron albedo is modeled by a discrete and iterative reflection process. • The theory enables the PNAR and CIPN NDA techniques to be compared quantitatively. • Improvements to the data analysis and to the CIPN instrument design are suggested. • A correction to translate real no-reflection PNAR data into ideal data is provided. - Abstract: This paper develops an analytical theory to describe how neutron albedo (reflection) increases the multiplication of neutrons by a used fuel assembly. With this theory, the two nondestructive assay (NDA) techniques of Passive Neutron Albedo Reactivity (PNAR) and Californium-252 Interrogation with Prompt Neutron Detection (CIPN) can be compared directly. Specifically, the theory derives expressions for the PNAR and CIPN metrics in terms of the physical properties of the used fuel assembly, such as the neutron multiplications and fate probabilities. The theory thus clarifies the interpretation of these two NDA techniques and suggests ways to improve both the design of the NDA instruments and the algorithms for analyzing the measurement results
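
    The discrete, iterative reflection process can be caricatured as a geometric series: each bounce returns a fraction of the leaking neutrons to the fuel, where they multiply again. The toy model below illustrates only that structure; it is not the paper's derivation, and all parameter values are hypothetical.

        def multiplication_with_albedo(m0, albedo, capture_prob, n_terms=50):
            """Total multiplication after repeated reflections; the series
            sums to m0 / (1 - r) with r = m0 * albedo * (1 - capture_prob) < 1."""
            r = m0 * albedo * (1.0 - capture_prob)   # per-bounce amplification
            total, term = 0.0, m0
            for _ in range(n_terms):
                total += term
                term *= r
            return total

        print(multiplication_with_albedo(m0=1.2, albedo=0.5, capture_prob=0.1))  # ~2.6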

  17. Superfield theory and supermatrix model

    International Nuclear Information System (INIS)

    Park, Jeong-Hyuck

    2003-01-01

    We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, consistent with T-duality. (author)

  18. σ-models and string theories

    International Nuclear Information System (INIS)

    Randjbar-Daemi, S.

    1987-01-01

    The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that, to the leading order in σ-model perturbation theory, the string loop effects do not modify the gravitational and dilaton field equations. However, for purely bosonic strings, new terms involving the modular parameter of the world sheet are induced by quantum effects, which can be absorbed into a redefinition of the background fields. Some aspects of several regularization schemes, such as dimensional, Pauli-Villars and the proper-time cut-off, are also discussed in an appendix.

  19. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex ante (“hidden characteristics”) as well as ex post information asymmetry (“hidden action”), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  20. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    2015-01-01

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex ante (‘hidden characteristics’) as well as ex post information asymmetry (‘hidden action’), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  1. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  2. Understanding and Measuring Evaluation Capacity: A Model and Instrument Validation Study

    Science.gov (United States)

    Taylor-Ritzler, Tina; Suarez-Balcazar, Yolanda; Garcia-Iriarte, Edurne; Henry, David B.; Balcazar, Fabricio E.

    2013-01-01

    This study describes the development and validation of the Evaluation Capacity Assessment Instrument (ECAI), a measure designed to assess evaluation capacity among staff of nonprofit organizations that is based on a synthesis model of evaluation capacity. One hundred and sixty-nine staff of nonprofit organizations completed the ECAI. The 68-item…

  3. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory, which permits the modeling of item difficulty and examinee ability, and from signal detection theory, which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
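
    The two signal detection constructs named here, memory discrimination (d') and response bias (criterion), have standard equal-variance Gaussian estimators; the sketch below is a generic textbook computation, not the SD-IRT model itself.

        from scipy.stats import norm

        def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
            """Equal-variance Gaussian SDT indices from a 2x2 response table;
            a 0.5 correction avoids infinite z-scores at rates of 0 or 1."""
            hr = (hits + 0.5) / (hits + misses + 1.0)
            far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            d_prime = norm.ppf(hr) - norm.ppf(far)             # discrimination
            criterion = -0.5 * (norm.ppf(hr) + norm.ppf(far))  # response bias
            return d_prime, criterion

        print(dprime_and_criterion(18, 2, 4, 16))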

  4. Assessing health status and quality-of-life instruments: attributes and review criteria.

    Science.gov (United States)

    Aaronson, Neil; Alonso, Jordi; Burnam, Audrey; Lohr, Kathleen N; Patrick, Donald L; Perrin, Edward; Stein, Ruth E

    2002-05-01

    The field of health status and quality of life (QoL) measurement, as a formal discipline with a cohesive theoretical framework, accepted methods, and diverse applications, has been evolving for the better part of 30 years. To identify health status and QoL instruments and review them against rigorous criteria, as a precursor to creating an instrument library for later dissemination, the Medical Outcomes Trust in 1994 created an independently functioning Scientific Advisory Committee (SAC). In the mid-1990s, the SAC defined a set of attributes and criteria for carrying out instrument assessments; 5 years later, it updated and revised these materials to take account of the expanding theories and technologies upon which such instruments were being developed. This paper offers the SAC's current conceptualization of eight key attributes of health status and QoL instruments (i.e., conceptual and measurement model; reliability; validity; responsiveness; interpretability; respondent and administrative burden; alternate forms; and cultural and language adaptations) and the criteria by which instruments would be reviewed on each of those attributes. These are suggested guidelines for the field to consider and debate; as measurement techniques become both more familiar and more sophisticated, we expect that experts will wish to update and refine these criteria accordingly.

  5. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3-quark objects are listed for SU(n) and SU(2n). (orig.)

  6. Chaos Theory as a Model for Managing Issues and Crises.

    Science.gov (United States)

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  7. Discrete state moduli of string theory from c=1 matrix model

    CERN Document Server

    Dhar, A; Wadia, S R; Dhar, Avinash; Mandal, Gautam; Wadia, Spenta R

    1995-01-01

    We propose a new formulation of the space-time interpretation of the c=1 matrix model. Our formulation uses the well-known leg-pole factor that relates the matrix model amplitudes to those of the 2-dimensional string theory, but includes fluctuations around the Fermi vacuum on both sides of the inverted harmonic oscillator potential of the double-scaled model, even when the fluctuations are small and confined entirely within the asymptotes in the phase plane. We argue that including fluctuations on both sides of the potential is essential for a consistent interpretation of the leg-pole transformed theory as a theory of space-time gravity. We reproduce the known results for the string theory tree-level scattering amplitudes for flat space and linear dilaton background as a special case. We show that the generic case corresponds to more general space-time backgrounds. In particular, we identify the parameter corresponding to background metric perturbation in string theory (black hole mass) in terms of the ...

  8. Nonperturbative type IIB model building in the F-theory framework

    Energy Technology Data Exchange (ETDEWEB)

    Jurke, Benjamin Helmut Friedrich

    2011-02-28

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  9. Nonperturbative type IIB model building in the F-theory framework

    International Nuclear Information System (INIS)

    Jurke, Benjamin Helmut Friedrich

    2011-01-01

    -realistic unified model building. An important aspect is the proper handling of the gauge flux on the 7-branes. Via the spectral cover description - which at first requires further refinements - chiral matter can be generated and the unified gauge group can be broken to the Standard Model. Ultimately, in this thesis an explicit unified model based on the gauge group SU(5) is constructed within the F-theory framework, such that an acceptable phenomenology and the observed three chiral matter generations are obtained. (orig.)

  10. Cohomological gauge theory, quiver matrix models and Donaldson-Thomas theory

    NARCIS (Netherlands)

    Cirafici, M.; Sinkovics, A.; Szabo, R.J.

    2009-01-01

    We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques.

  11. Applications of generalizability theory and their relations to classical test theory and structural equation modeling.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-03-01

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
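
    One listed application, estimating dependability after changing a measurement procedure, reduces in the simplest one-facet crossed design to a small formula; the sketch below is a generic illustration with made-up variance components.

        def g_coefficient(var_person, var_error, n_conditions):
            """Generalizability coefficient for a one-facet design:
            error variance shrinks as the number of items/raters grows."""
            return var_person / (var_person + var_error / n_conditions)

        # Made-up components: person variance .50, residual .40, averaged over 4 items.
        print(g_coefficient(0.50, 0.40, 4))   # 0.50 / 0.60 ~ 0.83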

  12. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13-dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton, this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  13. Effective potential in Lorentz-breaking field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)

    2017-12-15

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  14. Effective potential in Lorentz-breaking field theory models

    International Nuclear Information System (INIS)

    Baeta Scarpelli, A.P.; Brito, L.C.T.; Felipe, J.C.C.; Nascimento, J.R.; Petrov, A.Yu.

    2017-01-01

    We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)

  15. An integrative neural model of social perception, action observation, and theory of mind.

    Science.gov (United States)

    Yang, Daniel Y-J; Rosenblau, Gabriela; Keifer, Cara; Pelphrey, Kevin A

    2015-04-01

    In the field of social neuroscience, major branches of research have been instrumental in describing independent components of typical and aberrant social information processing, but the field as a whole lacks a comprehensive model that integrates its different branches. We review existing research related to the neural basis of three key neural systems underlying social information processing: social perception, action observation, and theory of mind. We propose an integrative model that unites these three processes and highlights the posterior superior temporal sulcus (pSTS), which plays a central role in all three systems. Furthermore, we integrate these neural systems with the dual system account of implicit and explicit social information processing. Large-scale meta-analyses based on Neurosynth confirmed that the pSTS is at the intersection of the three neural systems. Resting-state functional connectivity analysis with 1000 subjects confirmed that the pSTS is connected to all other regions in these systems. The findings presented in this review are specifically relevant for psychiatric research, especially for disorders characterized by social deficits, such as autism spectrum disorder. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Noncommutative gauge theory and symmetry breaking in matrix models

    International Nuclear Information System (INIS)

    Grosse, Harald; Steinacker, Harold; Lizzi, Fedele

    2010-01-01

    We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3)_c × SU(2)_L × U(1)_Q [resp. SU(3)_c × U(1)_Q], which couples appropriately to all fields in the standard model. An additional U(1)_B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.

  17. Off-critical statistical models: factorized scattering theories and bootstrap program

    International Nuclear Information System (INIS)

    Mussardo, G.

    1992-01-01

    We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions, with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M_{2,2n+3}, the non-unitary model M_{3,5} and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be explored by the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called truncated conformal space approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form factor approach.
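
    In standard notation, the constraints reviewed above take a compact form. For a diagonal factorized S-matrix, unitarity and crossing read (schematically, with θ the rapidity difference) $$S_{ab}(\theta)\,S_{ab}(-\theta) = 1, \qquad S_{ab}(\theta) = S_{\bar a b}(i\pi - \theta),$$ and if particle c appears as an a-b bound state at θ = i u^c_{ab}, the bootstrap principle fixes its scattering on any particle d through $$S_{cd}(\theta) = S_{ad}\big(\theta + i\bar u^{\,b}_{ac}\big)\,S_{bd}\big(\theta - i\bar u^{\,a}_{bc}\big), \qquad \bar u \equiv \pi - u.$$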

  18. The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.

    Science.gov (United States)

    Loving, Cathleen

    The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…

  19. Theories and measures of elder abuse.

    Science.gov (United States)

    Abolfathi Momtaz, Yadollah; Hamid, Tengku Aizan; Ibrahim, Rahimah

    2013-09-01

    Elder abuse is a pervasive phenomenon around the world with devastating effects on the victims. Although it is not a new phenomenon, interest in examining elder abuse is relatively new. This paper aims to provide an overview of the aetiological theories and measures of elder abuse. The paper briefly reviews theories to explain causes of elder abuse and then discusses the most commonly used measures of elder abuse. Based on the reviewed theories, it can be concluded that elder abuse is a multifactorial problem that may affect elderly people from different backgrounds and involve a wide variety of potential perpetrators, including caregivers, adult children, and partners. The review of existing measurement instruments notes that many different screening and assessment instruments have been developed to identify elders who are at risk for or are victims of abuse. However, there is a real need for more measurements of elder abuse, as the current instruments are limited in scope. © 2013 The Authors. Psychogeriatrics © 2013 Japanese Psychogeriatric Society.

  20. Development of a Symptom-Based Patient-Reported Outcome Instrument for Functional Dyspepsia: A Preliminary Conceptual Model and an Evaluation of the Adequacy of Existing Instruments.

    Science.gov (United States)

    Taylor, Fiona; Reasner, David S; Carson, Robyn T; Deal, Linda S; Foley, Catherine; Iovin, Ramon; Lundy, J Jason; Pompilus, Farrah; Shields, Alan L; Silberg, Debra G

    2016-10-01

    The aim was to document, from the perspective of the empirical literature, the primary symptoms of functional dyspepsia (FD), evaluate the extent to which existing questionnaires target those symptoms, and, finally, identify any missing evidence that would impact the questionnaires' use in regulated clinical trials to assess treatment efficacy claims intended for product labeling. A literature review was conducted to identify the primary symptoms of FD and existing symptom-based FD patient-reported outcome (PRO) instruments. Following a database search, abstracts were screened and articles were retrieved for review. The primary symptoms of FD were organized into a conceptual model and the PRO instruments were evaluated for conceptual coverage as well as compared against evidentiary requirements presented in the FDA's PRO Guidance for Industry. Fifty-six articles and 16 instruments assessing FD symptoms were reviewed. Concepts listed in the Rome III criteria for FD (n = 7), those assessed by existing FD instruments (n = 34), and symptoms reported by patients in published qualitative research (n = 6) were summarized in the FD conceptual model. Except for vomiting, all of the identified symptoms from the published qualitative research reports were also specified in the Rome III criteria. Only three of the 16 instruments, the Dyspepsia Symptom Severity Index (DSSI), Nepean Dyspepsia Index (NDI), and Short-Form Nepean Dyspepsia Index (SF-NDI), measure all seven FD symptoms defined by the Rome III criteria. Among these three, each utilizes a 2-week recall period and 5-point Likert-type scale, and had evidence of patient involvement in development. Despite their coverage, when these instruments were evaluated in light of regulatory expectations, several issues jeopardized their potential qualification for substantiation of a labeling claim. No existing PRO instruments that measured all seven symptoms adhered to the regulatory principles necessary to support product

  1. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  2. The Self-Perception Theory vs. a Dynamic Learning Model

    OpenAIRE

    Swank, Otto H.

    2006-01-01

    Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learning model.

  3. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.

    2011-01-01

    In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory based on the full velocity difference model (FVDM). The linear stability condition of the new model is obtained by using linear stability theory. Unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of the negative velocity that occurs at small sensitivity coefficient λ in the FVDM by adjusting the coefficient of the optimal velocity difference, so that collisions disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The optimal velocity difference term avoids the disadvantage of negative velocity.
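
    A hedged sketch of the full-velocity-difference style update that the OVDM builds on is given below; the optimal velocity function and all parameter values are common illustrative choices, not the Letter's calibration, and the OVDM's extra optimal-velocity-difference weighting is omitted.

```python
import numpy as np

def optimal_velocity(headway, v_max=30.0, h_c=25.0):
    # Bando-type optimal velocity function, a standard choice in this literature.
    return 0.5 * v_max * (np.tanh(headway - h_c) + np.tanh(h_c))

def fvd_accel(v, headway, dv, kappa=0.4, lam=0.5):
    # Relax toward V(headway), plus a term in the velocity difference dv
    # (leader velocity minus own velocity); lam is the sensitivity lambda.
    return kappa * (optimal_velocity(headway) - v) + lam * dv

print(fvd_accel(v=20.0, headway=22.0, dv=-1.5))  # strong braking, as expected
```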

  4. Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.

    Science.gov (United States)

    Bender, Miriam

    2018-01-01

    Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomenon. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that they do the work of defining the boundary conditions-the domain(s)-of a theory. Further analysis has shown the ways models can be constructed and function independent of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing-and health/care-knowledge and practice. © 2017 John Wiley & Sons Ltd.

  5. Exploring mouthfeel in model wines: Sensory-to-instrumental approaches.

    Science.gov (United States)

    Laguna, Laura; Sarkar, Anwesha; Bryant, Michael G; Beadling, Andrew R; Bartolomé, Begoña; Victoria Moreno-Arribas, M

    2017-12-01

    Wine creates a group of oral-tactile stimulations not related to taste or aroma, such as astringency or fullness, better known as mouthfeel. During wine consumption, mouthfeel is affected by ethanol content, phenolic compounds and their interactions with the oral components. Mouthfeel arises through changes in the salivary film when wine is consumed. In order to understand the role of each wine component, eight different model wines with/without ethanol (8%), glycerol (10 g/L) and commercial tannins (1 g/L) were described by a trained panel. Descriptive analysis techniques were used to train the panel and measure the intensity of the mouthfeel attributes. Alongside, the suitability of different instrumental techniques (rheology, particle size, tribology and microstructure, using Transmission Electron Microscopy (TEM)) to measure wine mouthfeel sensation was investigated. Panelists discriminated samples based on their tactile-related components (ethanol, glycerol and tannins) at the levels found naturally in wine. Higher scores were found for all sensory attributes in the samples containing ethanol. Sensory astringency was associated mainly with the addition of tannins to the model wine, and glycerol did not seem to play a discriminating role at the levels found in red wines. Visual viscosity was correlated with instrumental viscosity (R=0.815, p=0.014). The hydrodynamic diameter of saliva increased in the presence of tannins (almost 2.5- to 3-fold), whereas the presence of ethanol or glycerol decreased it. These results were related to the sensory astringency and earthiness, as well as to the formation of nano-complexes as observed by TEM. Rheologically, the most viscous samples were those containing glycerol or tannins. Tribology results showed that, in the boundary lubrication regime, differences in traction coefficient were due to the presence of glycerol. However, no differences in traction coefficients were observed in presence

  6. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  7. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptable dynamic system, with complexity science as the method for researching the operation law of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper projects the future direction of complexity science in capital market research.

  8. Collective learning modeling based on the kinetic theory of active particles

    Science.gov (United States)

    Burini, D.; De Lillo, S.; Gibelli, L.

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
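
    Schematically, the structure referred to above is a gain-loss balance for distribution functions f_i(t,u) over a microscopic activity variable u, with interactions weighted by game-theoretic transition densities (our notation, offered only as a sketch): $$\partial_t f_i(t,u) = \sum_j \iint \eta\,\mathcal{A}_{ij}(u_* \to u \mid u_*,u^*)\,f_i(t,u_*)\,f_j(t,u^*)\,du_*\,du^* - f_i(t,u)\sum_j \int \eta\,f_j(t,u^*)\,du^*,$$ where η is an encounter rate and A_ij is the probability density that an i-entity with activity u_* shifts to activity u after interacting with a j-entity with activity u^*.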

  9. Behavioral and social sciences theories and models: are they used in unintentional injury prevention research?

    Science.gov (United States)

    Trifiletti, L B; Gielen, A C; Sleet, D A; Hopkins, K

    2005-06-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury prevention research. The authors conducted a systematic review to evaluate the published literature from 1980 to 2001 on behavioral and social science theory applications to unintentional injury prevention and control. Electronic database searches in PubMed and PsycINFO identified articles that combined behavioral and social sciences theories and models and injury causes. The authors identified some articles that examined behavioral and social science theories and models and unintentional injury topics, but found that several important theories have never been applied to unintentional injury prevention. Among the articles identified, the PRECEDE-PROCEED Model was cited most frequently, followed by the Theory of Reasoned Action/Theory of Planned Behavior and the Health Belief Model. When behavioral and social sciences theories and models were applied to unintentional injury topics, they were most frequently used to guide program design, implementation, or the development of evaluation measures; few examples of theory testing were found. Results suggest that the use of behavioral and social sciences theories and models in unintentional injury prevention research is only marginally represented in the mainstream, peer-reviewed literature. Both the field of injury prevention and the behavioral and social sciences could benefit from greater collaborative research to enhance behavioral approaches to injury control.

  10. Non-integrable quantum field theories as perturbations of certain integrable models

    International Nuclear Information System (INIS)

    Delfino, G.; Simonetti, P.

    1996-03-01

    We approach the study of non-integrable models of two-dimensional quantum field theory as perturbations of the integrable ones. By exploiting the knowledge of the exact S-matrix and form factors of the integrable field theories, we obtain the first order corrections to the mass ratios, the vacuum energy density and the S-matrix of the non-integrable theories. As interesting applications of the formalism, we study the scaling region of the Ising model in an external magnetic field at T ∼ T_c and the scaling region around the minimal model M_{2,7}. For these models, a remarkable agreement is observed between the theoretical predictions and the data extracted by a numerical diagonalization of their Hamiltonian. (author). 41 refs, 9 figs, 1 tab
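
    Schematically, in form factor perturbation theory the first order corrections referred to above read, for a perturbation λ∫Ψ d²x of the integrable action, $$\delta\mathcal{E}_{\mathrm{vac}} \simeq \lambda\,\langle 0|\Psi|0\rangle, \qquad \delta m_a^2 \simeq 2\lambda\,F^{\Psi}_{a\bar a}(i\pi),$$ where F^Ψ denotes a two-particle form factor of the perturbing operator; the notation here is a standard shorthand rather than a quotation of the paper's formulas.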

  11. Using and Developing Measurement Instruments in Science Education: A Rasch Modeling Approach. Science & Engineering Education Sources

    Science.gov (United States)

    Liu, Xiufeng

    2010-01-01

    This book meets a demand in the science education community for a comprehensive and introductory measurement book in science education. It describes measurement instruments reported in refereed science education research journals, and introduces the Rasch modeling approach to developing measurement instruments in common science assessment domains,…
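
    As a hedged sketch of the model at the heart of this approach: in the dichotomous Rasch model the probability of a correct response depends only on the difference between person ability θ and item difficulty b, both on a logit scale. The parameter values below are invented for illustration.

```python
import numpy as np

# Dichotomous Rasch model: P(correct) = 1 / (1 + exp(-(theta - b))).
def rasch_prob(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

thetas = np.array([-1.0, 0.0, 1.5])  # person abilities (logits), illustrative
b_item = 0.5                         # item difficulty (logits), illustrative
print(rasch_prob(thetas, b_item))    # success probability rises with ability
```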

  12. Classical nucleation theory in the phase-field crystal model.

    Science.gov (United States)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes places, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation takes place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
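
    For reference, the classical-nucleation-theory quantities such simulations are compared against are, schematically, a nucleation rate $$J = J_0\,\exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right), \qquad \Delta G^{*} = \frac{16\pi\,\gamma^{3}}{3\,(\Delta g)^{2}},$$ for homogeneous nucleation of a spherical nucleus, where γ is the interfacial free energy, Δg is the bulk free-energy difference per unit volume between the phases, and J_0 is a kinetic prefactor.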

  13. Integrated conception of hardware/software mixed systems used in nuclear instrumentation

    International Nuclear Information System (INIS)

    Dias, Ailton F.; Sorel, Yves; Akil, Mohamed

    2002-01-01

    Hardware/software codesign addresses the design of systems composed of a hardware portion, with specific components, and a software portion, with a microprocessor-based architecture. This paper describes the Algorithm Architecture Adequation (AAA) design methodology, originally oriented to programmable multicomponent architectures, its extension to reconfigurable circuits, and its application to the design and development of nuclear instrumentation systems composed of programmable and configurable circuits. The AAA methodology uses a unified model, based on graph theory, to describe the algorithm, the architecture and the implementation. The great advantage of the AAA methodology is the use of the same model from specification to implementation of hardware/software systems, reducing complexity and design time. (author)

  14. Soliton excitations in a class of nonlinear field theory models

    International Nuclear Information System (INIS)

    Makhan'kov, V.G.; Fedyanin, V.K.

    1985-01-01

    Results are described for the investigation of nonlinear field theory models with a Lagrangian. The theory includes models both with a zero stable vacuum (ε = 1) and with a condensate (ε = -1), i.e., with broken symmetry. Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied. PLS form factors are calculated. A statistical mechanics of solitons is constructed and their dynamic structure factors are calculated

  15. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can satisfy all the phenomenological constraints while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.

  16. Realization of computer-controlled CAMAC model through the technology of virtual instrument

    International Nuclear Information System (INIS)

    Le Yi; Li Cheng; Liao Juanjuan; Zhou Xin

    1997-01-01

    The author introduces the virtual instrument system and the basic features of its typical software development platform, and shows the system's superiority and suitability for physics experiments through the example of the CAMAC module ADC2249A, which is often used in nuclear physics experiments

  17. Magnetic flux tube models in superstring theory

    CERN Document Server

    Russo, Jorge G

    1996-01-01

    Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the 'Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0 = √(2α'). In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 and q > R/(2α') there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR = 2n, n = integer). At the special points qR = 2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...

  18. Instrumentation

    International Nuclear Information System (INIS)

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment, especially in the field of nuclear safety research, where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of, e.g., pressures, temperatures, pressure differences and single phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serves in many cases as a database for thermal-hydraulic system codes. However, some special instrumentation, such as online concentration measurement for boric acid in the water phase or for non-condensables in a steam atmosphere, as well as flow visualization techniques, were further developed and successfully applied during recent years. Concerning the modeling needs for advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in the local instrumentation technology for two-phase flow by the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermohydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated, as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  19. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    This is a new way of using the measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x-measurements ... A new general formula is given for how to correct the least squares regression coefficient when a different number of replicated x-measurements is used for prediction than for calibration. It is shown that the correction should be applied when the number of replicates in prediction is less than...

  1. Models of Regge behaviour in an asymptotically free theory

    International Nuclear Information System (INIS)

    Polkinghorne, J.C.

    1976-01-01

    Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that φ³ theory in six dimensions has an essential singularity at l = -1. The extension to gauge theories is discussed. (Auth.)

  2. A model of theory-practice relations in mathematics teacher education

    DEFF Research Database (Denmark)

    Østergaard, Kaj

    2016-01-01

    The paper presents and discusses an ATD-based (Chevallard, 2012) model of theory-practice relations in mathematics teacher education. The notions of didactic transposition and praxeology are combined and concretized in order to form a comprehensive model for analysing the theory-practice problematique. It is illustrated how the model can be used both as a descriptive tool, to analyse interactions between and interviews with student teachers and teachers, and as a normative tool, to design and redesign learning environments in teacher education, in this case a lesson study context.

  3. Towards a Semantic E-Learning Theory by Using a Modelling Approach

    Science.gov (United States)

    Yli-Luoma, Pertti V. J.; Naeve, Ambjorn

    2006-01-01

    In the present study, a semantic perspective on e-learning theory is advanced and a modelling approach is used. This modelling approach towards the new learning theory is based on the four SECI phases of knowledge conversion: Socialisation, Externalisation, Combination and Internalisation, introduced by Nonaka in 1994, and involving two levels of…

  4. Image reconstruction design of industrial CT instrument for teaching

    International Nuclear Information System (INIS)

    Zou Yongning; Cai Yufang

    2009-01-01

    The industrial CT instrument for teaching is applied to teaching and study in physics and radiology programs, and image reconstruction is an important part of the instrument's software. The paper expounds the physical theory of CT and the first-generation CT reconstruction algorithm, describes the scanning process of the industrial CT instrument for teaching, analyzes the image artifacts that result from displacement of the rotation center, implements a method for correcting the center displacement, and designs and completes the image reconstruction software. Application shows that the reconstructed images are clear and of high quality. (authors)
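
    As a hedged sketch of the reconstruction step described above (not the instrument's actual software), the following parallel-beam filtered back-projection routine ramp-filters each projection and back-projects it, with a center_shift parameter standing in for the rotation-center correction; all names and defaults are illustrative.

```python
import numpy as np

def ramp_filter(projection):
    # Apply a simple ramp filter |f| in the Fourier domain.
    freqs = np.fft.fftfreq(projection.size)
    return np.real(np.fft.ifft(np.fft.fft(projection) * np.abs(freqs)))

def fbp(sinogram, angles_deg, center_shift=0.0):
    # sinogram: one row per view; center_shift: measured displacement of the
    # rotation center (in detector pixels), compensated during back-projection.
    n_det = sinogram.shape[1]
    x = np.arange(n_det) - n_det / 2.0
    xx, yy = np.meshgrid(x, x)
    det_axis = np.arange(n_det)
    recon = np.zeros((n_det, n_det))
    for proj, ang in zip(sinogram, np.deg2rad(angles_deg)):
        filtered = ramp_filter(proj)
        # Detector coordinate sampled by each image pixel for this view.
        t = xx * np.cos(ang) + yy * np.sin(ang) + n_det / 2.0 - center_shift
        recon += np.interp(t.ravel(), det_axis, filtered,
                           left=0.0, right=0.0).reshape(n_det, n_det)
    return recon * np.pi / len(angles_deg)
```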

  5. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar field ... by the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum, where the fiber Hamiltonian at fixed total momentum ξ is given in terms of the Segal field operator. The fiber Hamiltonians

  6. Educational Theory and Classroom Behavior.

    Science.gov (United States)

    Swanson, Ronald G.; Smith, William S.

    1979-01-01

    Described are two instruments used in a workshop designed to help teachers clarify their own beliefs about education and to shape their classroom behavior accordingly. The Student-Content Inventory concerns styles of student-teacher interaction and the Educational Theory Inventory correlates the respondent's beliefs to major educational theories.…

  7. Using circuit theory to model connectivity in ecology, evolution, and conservation.

    Science.gov (United States)

    McRae, Brad H; Dickson, Brett G; Keitt, Timothy H; Shah, Viral B

    2008-10-01

    Connectivity among populations and habitats is important for a wide range of ecological processes. Understanding, preserving, and restoring connectivity in complex landscapes requires connectivity models and metrics that are reliable, efficient, and process based. We introduce a new class of ecological connectivity models based in electrical circuit theory. Although they have been applied in other disciplines, circuit-theoretic connectivity models are new to ecology. They offer distinct advantages over common analytic connectivity models, including a theoretical basis in random walk theory and an ability to evaluate contributions of multiple dispersal pathways. Resistance, current, and voltage calculated across graphs or raster grids can be related to ecological processes (such as individual movement and gene flow) that occur across large population networks or landscapes. Efficient algorithms can quickly solve networks with millions of nodes, or landscapes with millions of raster cells. Here we review basic circuit theory, discuss relationships between circuit and random walk theories, and describe applications in ecology, evolution, and conservation. We provide examples of how circuit models can be used to predict movement patterns and fates of random walkers in complex landscapes and to identify important habitat patches and movement corridors for conservation planning.
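
    A hedged sketch of the core computation: treating patches as nodes and habitat permeability as conductance, the effective resistance between two nodes follows from the graph Laplacian's pseudoinverse and, unlike a least-cost path, decreases as additional pathways are added. The toy graph below is invented for illustration.

```python
import numpy as np

def effective_resistance(conductance, i, j):
    # conductance: symmetric matrix of pairwise conductances (0 = no edge).
    L = np.diag(conductance.sum(axis=1)) - conductance  # graph Laplacian
    L_pinv = np.linalg.pinv(L)
    return L_pinv[i, i] + L_pinv[j, j] - 2.0 * L_pinv[i, j]

# Three patches in a chain with unit conductances: two unit resistors in
# series give an effective resistance of 2 between the end patches.
G = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
print(effective_resistance(G, 0, 2))  # -> 2.0
```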

  8. On the structural logic of curriculum system for the optical instrument major

    Science.gov (United States)

    Yan, Yufeng; Yan, Juncen; Li, Yang; Shi, Lixia

    2017-08-01

    The theory of optical instruments is interdisciplinary between Optical Engineering and Instrument Science and Technology. Undergraduates should study optics, precision machinery and electronics. Courses such as Theory of Machines and Engineering Optics, as well as courses on the accuracy analysis of instruments, are offered in the college, and there are many interrelations among these courses. This paper focuses on the structural logic of these courses. The order of the courses is examined, and the aims of all the courses are made completely clear so that the same topics are not taught twice in different courses. As a result, the undergraduates grasp the main line of the knowledge, and the professors teach efficiently.

  9. Keynes's theories of money and banking in the Treatise and The General Theory

    OpenAIRE

    John Smithin

    2013-01-01

    This paper identifies what seem to have been the five main issues in contention in monetary theory, both historically and in the current era, and discusses the view that J.M. Keynes took on each of them in the Treatise on Money and The General Theory. The key issues in monetary theory are the ontology of money, endogenous versus exogenous money, interest-rate determination, the choice of the monetary policy instrument, and the neutrality versus non-neutrality of money.

  10. A Membrane Model from Implicit Elasticity Theory

    Science.gov (United States)

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
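
    For context, Fung's one-dimensional law mentioned above is commonly written as an exponential stiffening relation (schematic form, not the paper's notation): $$\frac{dT}{d\lambda} = \alpha\,(T + \beta) \;\Longrightarrow\; T(\lambda) = (T^{*} + \beta)\,e^{\alpha(\lambda - \lambda^{*})} - \beta,$$ where T is stress, λ is stretch, and (T*, λ*) is a reference state; the membrane model described above generalizes this structure through isotropic response functions.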

  11. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
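
    As a small, hedged illustration of the foundational quantities invoked above, the snippet below computes the Shannon entropy of a response distribution; a sharply tuned (peaked) distribution carries lower entropy than a flat one. The distributions are invented for illustration.

```python
import numpy as np

def entropy_bits(p):
    # Shannon entropy H(p) = -sum p log2 p, ignoring zero-probability outcomes.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # peaked tuning -> ~1.36 bits
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # flat -> 2 bits (the maximum)
```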

  12. Collective learning modeling based on the kinetic theory of active particles.

    Science.gov (United States)

    Burini, D; De Lillo, S; Gibelli, L

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Introduction to instrumentation and measurements

    CERN Document Server

    Northrop, Robert B

    2014-01-01

    Weighing in on the growth of innovative technologies, the adoption of new standards, and the lack of educational development as it relates to current and emerging applications, the third edition of Introduction to Instrumentation and Measurements uses the authors' 40 years of teaching experience to expound on the theory, science, and art of modern instrumentation and measurements (I&M). What's New in This Edition: This edition includes material on modern integrated circuit (IC) and photonic sensors, micro-electro-mechanical (MEM) and nano-electro-mechanical (NEM) sensors, chemical and radiation sensors, signal conditioning, noise, data interfaces, and basic digital signal processing (DSP), and upgrades every chapter with the latest advancements. It contains new material on the designs of micro-electro-mechanical (MEMS) sensors, adds two new chapters on wireless instrumentation and microsensors, and incorporates extensive biomedical examples and problems. Containing 13 chapters, this third edition: Describ...

  14. A Practise-based Theory of SEIDET Smart Community Centre Model

    CSIR Research Space (South Africa)

    Phahlamohlaka, J

    2015-11-01

    Full Text Available ... as it is designed using international studies and theories. This paper presents the design of the smart community centre model. The design is described using Practice Theory concepts, towards an empirical study that will be conducted using the General...

  15. Introduction to focused ion beams instrumentation, theory, techniques and practice

    CERN Document Server

    Giannuzzi, Lucille A

    2005-01-01

    The focused ion beam (FIB) instrument has experienced an intensive period of maturation since its inception. Numerous new techniques and applications have been brought to fruition, and over the past few years, the FIB has gained acceptance as more than just an expensive sample preparation tool. It has taken its place among the suite of other instruments commonly available in analytical and forensic laboratories, universities, geological, medical and biological research institutions, and manufacturing plants. Although the utility of the FIB is not limited to the preparation of specimens for subsequent analysis by other analytical techniques, it has revolutionized the area of TEM specimen preparation. The FIB has also been used to prepare samples for numerous other analytical techniques, and offers a wide range of other capabilities. While the mainstream of FIB usage remains within the semiconductor industry, FIB usage has expanded to applications in metallurgy, ceramics, composites, polymers, geology, art, bio...

  16. A system-theory-based model for monthly river runoff forecasting: model calibration and optimization

    Directory of Open Access Journals (Sweden)

    Wu Jianhua

    2014-03-01

    Full Text Available River runoff is not only a crucial part of the global water cycle, but it is also an important source for hydropower and an essential element of water balance. This study presents a system-theory-based model for river runoff forecasting taking the Hailiutu River as a case study. The forecasting model, designed for the Hailiutu watershed, was calibrated and verified by long-term precipitation observation data and groundwater exploitation data from the study area. Additionally, frequency analysis, taken as an optimization technique, was applied to improve prediction accuracy. Following model optimization, the overall relative prediction errors are below 10%. The system-theory-based prediction model is applicable to river runoff forecasting, and following optimization by frequency analysis, the prediction error is acceptable.

  17. Situational theory of leadership.

    Science.gov (United States)

    Waller, D J; Smith, S R; Warnock, J T

    1989-11-01

    The situational theory of leadership and the LEAD instruments for determining leadership style are explained, and the application of the situational leadership theory to the process of planning for and implementing organizational change is described. Early studies of leadership style identified two basic leadership styles: the task-oriented autocratic style and the relationship-oriented democratic style. Subsequent research found that most leaders exhibited one of four combinations of task and relationship behaviors. The situational leadership theory holds that the difference between the effectiveness and ineffectiveness of the four leadership styles is the appropriateness of the leader's behavior to the particular situation in which it is used. The task maturity of the individual or group being led must also be accounted for; follower readiness is defined in terms of the capacity to set high but attainable goals, willingness or ability to accept responsibility, and possession of the necessary education or experience for a specific task. A person's leadership style, range, and adaptability can be determined from the LEAD-Self and LEAD-Other questionnaires. By applying the principles of the situational leadership theory and adapting their managerial styles to specific tasks and levels of follower maturity, the authors were successful in implementing 24-hour pharmacokinetic dosing services provided by staff pharmacists with little previous experience in clinical services. The situational leadership model enables a leader to identify a task, set goals, determine the task maturity of the individual or group, select an appropriate leadership style, and modify the style as change occurs. Pharmacy managers can use this model when implementing clinical pharmacy services.

  18. Perturbation theory instead of large scale shell model calculations

    International Nuclear Information System (INIS)

    Feldmeier, H.; Mankos, P.

    1977-01-01

    Results of large scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory, which provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation theory treatment in an SU(3) basis including 2ℏω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.)

  19. Effective-field-theory model for the fractional quantum Hall effect

    International Nuclear Information System (INIS)

    Zhang, S.C.; Hansson, T.H.; Kivelson, S.

    1989-01-01

    Starting directly from the microscopic Hamiltonian, we derive a field-theory model for the fractional quantum Hall effect. By considering an approximate coarse-grained version of the same model, we construct a Landau-Ginzburg theory similar to that of Girvin. The partition function of the model exhibits cusps as a function of density, and the Hall conductance is quantized at filling factors ν = (2k-1)⁻¹ with k an arbitrary integer. At these fractions the ground state is incompressible, and the quasiparticles and quasiholes have fractional charge and obey fractional statistics. Finally, we show that the collective density fluctuations are massive
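
    Schematically, Chern-Simons Landau-Ginzburg theories of this type couple a bosonic order parameter φ to a statistical gauge field a_μ in addition to the external field A_μ (the notation below is the now-standard shorthand, not a quotation of the paper): $$\mathcal{L} = i\phi^{\dagger}(\partial_0 - ia_0 - iA_0)\phi - \frac{1}{2m}\left|(\partial_i - ia_i - iA_i)\phi\right|^2 - V(|\phi|^2) + \frac{1}{4\theta}\,\epsilon^{\mu\nu\lambda} a_\mu \partial_\nu a_\lambda,$$ with the statistics parameter θ an odd multiple of π selecting the odd-denominator fillings ν = (2k-1)⁻¹.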

  20. Models for Theory-Based M.A. and Ph.D. Programs.

    Science.gov (United States)

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  1. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas of pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  2. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  3. Theory to practice: the humanbecoming leading-following model.

    Science.gov (United States)

    Ursel, Karen L

    2015-01-01

    Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory to practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.

  4. Agent-based models for higher-order theory of mind

    NARCIS (Netherlands)

    de Weerd, Harmen; Verbrugge, Rineke; Verheij, Bart; Kamiński, Bogumił; Koloch, Grzegorz

    2014-01-01

    Agent-based models are a powerful tool for explaining the emergence of social phenomena in a society. In such models, individual agents typically have little cognitive ability. In this paper, we model agents with the cognitive ability to make use of theory of mind. People use this ability to reason

  5. Modeling 13.3nm Fe XXIII Flare Emissions Using the GOES-R EXIS Instrument

    Science.gov (United States)

    Rook, H.; Thiemann, E.

    2017-12-01

    The solar EUV spectrum is dominated by atomic transitions in ionized atoms in the solar atmosphere. As solar flares evolve, plasma temperatures and densities change, influencing the abundances of various ions and changing the intensities of different EUV wavelengths observed from the Sun. Quantifying solar flare spectral irradiance is important for constraining models of Earth's atmosphere, improving communications quality, and controlling satellite navigation. However, high time cadence measurements of flare irradiance across the entire EUV spectrum were not available prior to the launch of SDO. The EVE MEGS-A instrument aboard SDO collected 0.1 nm EUV spectral data from 2010 until 2014, when the instrument failed. No current or future instrument is capable of similar high resolution and time cadence EUV observation. This necessitates a full EUV spectrum model to study EUV phenomena at Earth. It has been recently demonstrated that one hot flare EUV line, such as the 13.3 nm Fe XXIII line, can be used to model cooler flare EUV line emissions, filling the role of MEGS-A. Since unblended measurements of Fe XXIII are typically unavailable, a proxy for the Fe XXIII line must be found. In this study, we construct two models of this line, first using the GOES 0.1-0.8 nm soft x-ray (SXR) channel as the Fe XXIII proxy, and second using a physics-based model dependent on GOES emission measure and temperature data. We determine that the more sophisticated physics-based model shows better agreement with Fe XXIII measurements, although the simple proxy model also performs well. We also conclude that the high correlation between Fe XXIII emissions and the GOES 0.1-0.8 nm band is because both emissions tend to peak near the GOES emission measure peak, despite large differences in their contribution functions.
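
    As a hedged sketch of the physics-based approach described above: a line's irradiance is modeled as the emission measure times the line's contribution function evaluated at the plasma temperature. The G(T) table below is an invented placeholder, not CHIANTI output or the authors' fit.

```python
import numpy as np

log_t_grid = np.array([6.8, 7.0, 7.2, 7.4])  # log10 temperature [K]
g_grid = np.array([0.2, 1.0, 0.7, 0.3])      # relative G(T), illustrative

def fe23_irradiance(em, log_t):
    # em: emission measure from GOES-like data; log_t: log10 temperature.
    return em * np.interp(log_t, log_t_grid, g_grid)

print(fe23_irradiance(em=1.0e49, log_t=7.1))  # arbitrary units
```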

  6. Development of a Conceptual Model and Survey Instrument to Measure Conscientious Objection to Abortion Provision.

    Directory of Open Access Journals (Sweden)

    Laura Florence Harris

    Full Text Available Conscientious objection to abortion, clinicians' refusal to perform legal abortions because of their religious or moral beliefs, has been the subject of increasing debate among bioethicists, policymakers, and public health advocates in recent years. Conscientious objection policies are intended to balance reproductive rights and clinicians' beliefs. However, in practice, clinician objection can act as a barrier to abortion access, impinging on reproductive rights and increasing unsafe abortion and related morbidity and mortality. There is little information about conscientious objection from a medical or public health perspective. A quantitative instrument is needed to assess the prevalence of conscientious objection and to provide insight on its practice. This paper describes the development of a survey instrument to measure conscientious objection to abortion provision. A literature review and in-depth formative interviews with stakeholders in Colombia were used to develop a conceptual model of conscientious objection. This model led to the development of a survey, which was piloted, and then administered, in Ghana. The model posits three domains of conscientious objection that form the basis for the survey instrument: (1) beliefs about abortion and conscientious objection; (2) actions related to conscientious objection and abortion; and (3) self-identification as a conscientious objector. The instrument is intended to be used to assess prevalence among clinicians trained to provide abortions, and to gain insight on how conscientious objection is practiced in a variety of settings. Its results can inform more effective and appropriate strategies to regulate conscientious objection.

  7. A critical assessment of theories/models used in health communication for HIV/AIDS.

    Science.gov (United States)

    Airhihenbuwa, C O; Obregon, R

    2000-01-01

    Most theories and models used to develop human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) communication are based on social psychology that emphasizes individualism. Researchers including communication and health scholars are now questioning the presumed global relevance of these models and thus the need to develop innovative theories and models that take into account regional contexts. In this paper, we discuss the commonly used theories and models in HIV/AIDS communication. Furthermore, we argue that the flaws in the application of the commonly used "classical" models in health communication are due to contextual differences in the locations where these models are applied. That is to say that these theories and models are being applied in contexts for which they were not designed. For example, the differences in health behaviors are often the function of culture. Therefore, culture should be viewed for its strength and not always as a barrier. The metaphorical coupling of "culture" and "barrier" needs to be exposed, deconstructed, and reconstructed so that new, positive, cultural linkages can be forged. The HIV/AIDS pandemic has served as a flashpoint to either highlight the importance or deny the relevance of theories and models while at the same time addressing the importance of culture in the development and implementation of communication programs.

  8. Scaling theory of depinning in the Sneppen model

    International Nuclear Information System (INIS)

    Maslov, S.; Paczuski, M.

    1994-01-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥(d) and ν_⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor.
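
    In standard depinning notation, the two exponents named above control the divergence of the correlation lengths; a minimal LaTeX rendering of the scaling relations implied by the abstract (assuming f_c denotes the critical pinning threshold) is:

    ```latex
    \xi_\parallel \sim (f_c - f)^{-\nu_\parallel(d)}, \qquad
    \xi_\perp \sim (f_c - f)^{-\nu_\perp(d)}
    ```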

  9. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  10. A practitioner's guide to persuasion: an overview of 15 selected persuasion theories, models and frameworks.

    Science.gov (United States)

    Cameron, Kenzie A

    2009-03-01

    To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.

  11. Prospects for advanced RF theory and modeling

    International Nuclear Information System (INIS)

    Batchelor, D. B.

    1999-01-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics

  12. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    International Nuclear Information System (INIS)

    Saraswati, Teguh Endah; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH₃). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory. (paper)

  13. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    Science.gov (United States)

    Endah Saraswati, Teguh; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory.

  14. Phase Structure of Fuzzy Field Theories and Multitrace Matrix Models

    International Nuclear Information System (INIS)

    Tekel, J.

    2015-01-01

    We review the interplay of fuzzy field theories and matrix models, with an emphasis on the phase structure of fuzzy scalar field theories. We give a self-contained introduction to these topics and give the details concerning the saddle point approach for the usual single trace and multitrace matrix models. We then review the attempts to explain the phase structure of the fuzzy field theory using a corresponding random matrix ensemble, showing the strength and weaknesses of this approach. We conclude with a list of challenges one needs to overcome and the most interesting open problems one can try to solve. (author)

  15. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  16. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games

    Science.gov (United States)

    Alber, Julia M.; Watson, Anna M.; Barnett, Tracey E.; Mercado, Rebeccah

    2015-01-01

    Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development. PMID:26167842

  17. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.

    Science.gov (United States)

    Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M

    2015-07-01

    Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
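
    For readers unfamiliar with the inter-rater statistic quoted above, Cohen's kappa compares observed coder agreement with the agreement expected by chance. A toy computation (made-up labels, not the study's data) using scikit-learn:

    ```python
    # Cohen's kappa for two coders' binary judgments on ten games (toy data).
    from sklearn.metrics import cohen_kappa_score

    coder1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
    coder2 = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1]
    print(cohen_kappa_score(coder1, coder2))  # 0.8 for this toy example
    ```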

  18. Johnson-Laird's mental models theory and its principles: an application with cell mental models of high school students

    OpenAIRE

    Mª Luz Rodríguez Palmero; Javier Marrero Acosta; Marco Antonio Moreira

    2001-01-01

    Following a discussion of Johnson-Laird's mental models theory, we report a study regarding high school students' mental representations of the cell, understood as mental models. Research findings suggest the appropriateness of such a theory as a framework to interpret students' representations.

  19. Causal Agency Theory: Reconceptualizing a Functional Model of Self-Determination

    Science.gov (United States)

    Shogren, Karrie A.; Wehmeyer, Michael L.; Palmer, Susan B.; Forber-Pratt, Anjali J.; Little, Todd J.; Lopez, Shane

    2015-01-01

    This paper introduces Causal Agency Theory, an extension of the functional model of self-determination. Causal Agency Theory addresses the need for interventions and assessments pertaining to self-determination for all students and incorporates the significant advances in understanding of disability and in the field of positive psychology since the…

  20. A general-model-space diagrammatic perturbation theory

    International Nuclear Information System (INIS)

    Hose, G.; Kaldor, U.

    1980-01-01

    A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He₂ excited Σ_g^+ states) is demonstrated. (Auth.)

  1. Plane symmetric cosmological micro model in modified theory of Einstein’s general relativity

    Directory of Open Access Journals (Sweden)

    Panigrahi U.K.

    2003-01-01

    Full Text Available In this paper, we have investigated an anisotropic homogeneous plane symmetric cosmological micro-model in the presence of a massless scalar field in the modified theory of Einstein's general relativity. Some interesting physical and geometrical aspects of the model, together with the singularity in the model, are discussed. Further, it is shown that this theory is valid and leads to Einstein's theory as the coupling parameter λ → 0 at the micro (i.e. quantum) level in general.

  2. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. An Advanced Traffic Information System can provide dynamic information to travellers. Due to a lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated considering the Nash Equilibrium by solving the route choice game. Simulations and experimental analysis show that the proposed model can describe commuters' routine route choice decisions exactly and that the provided route is reliable.
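
    The core mechanic, treating choices as strategies and checking for profitable unilateral deviations to find a Nash equilibrium, can be sketched in a few lines. The two-commuter setup and travel-time numbers below are illustrative assumptions, not the paper's network or payoff definitions.

    ```python
    # Toy congestion game: two commuters pick route "A" or "B"; travel time on a
    # route grows with its load. A profile is a Nash equilibrium if no commuter
    # can cut her own travel time by unilaterally switching routes.
    import itertools

    def travel_time(route, load):
        base = {"A": 20, "B": 30}   # free-flow minutes (hypothetical)
        slope = {"A": 15, "B": 5}   # congestion penalty per extra user (hypothetical)
        return base[route] + slope[route] * (load - 1)

    def is_nash(profile):
        for route in profile:
            other = "A" if route == "B" else "B"
            current = travel_time(route, profile.count(route))
            deviated = travel_time(other, profile.count(other) + 1)
            if deviated < current:
                return False
        return True

    for profile in itertools.product("AB", repeat=2):
        if is_nash(profile):
            print("Nash equilibrium:", profile)  # -> ('A', 'B') and ('B', 'A')
    ```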

  3. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  4. Quantum integrable models of field theory

    International Nuclear Information System (INIS)

    Faddeev, L.D.

    1979-01-01

    Fundamental features of the classical method of the inverse problem have been formulated in a form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a finite density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are examined briefly. It is noted that all the achievements of one-dimensional mathematical physics as applied to exactly solvable quantum models may be put, to an extent, within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives of applying the inverse problem method are shown.

  5. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.

  6. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions.

    Science.gov (United States)

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models.

  7. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions

    Science.gov (United States)

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Purpose: Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Methods: Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. Results: All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. Conclusion: The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models. PMID:29355212

  8. A conceptual framework for organismal biology: linking theories, models, and data.

    Science.gov (United States)

    Zamer, William E; Scheiner, Samuel M

    2014-11-01

    Implicit or subconscious theory is especially common in the biological sciences. Yet, theory plays a variety of roles in scientific inquiry. First and foremost, it determines what does and does not count as a valid or interesting question or line of inquiry. Second, theory determines the background assumptions within which inquiries are pursued. Third, theory provides linkages among disciplines. For these reasons, it is important and useful to develop explicit theories for biology. A general theory of organisms is developed, which includes 10 fundamental principles that apply to all organisms, and 6 that apply to multicellular organisms only. The value of a general theory comes from its utility to help guide the development of more specific theories and models. That process is demonstrated by examining two domains: ecoimmunology and development. For the former, a constitutive theory of ecoimmunology is presented, and used to develop a specific model that explains energetic trade-offs that may result from an immunological response of a host to a pathogen. For the latter, some of the issues involved in trying to devise a constitutive theory that covers all of development are explored, and a more narrow theory of phenotypic novelty is presented. By its very nature, little of a theory of organisms will be new. Rather, the theory presented here is a formal expression of nearly two centuries of conceptual advances and practice in research. Any theory is dynamic and subject to debate and change. Such debate will occur as part of the present, initial formulation, as the ideas presented here are refined. The very process of debating the form of the theory acts to clarify thinking. The overarching goal is to stimulate debate about the role of theory in the study of organisms, and thereby advance our understanding of them. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology 2014. This work is written by US Government employees

  9. Designing means and specifications for model FT-619 kidney function instrument

    International Nuclear Information System (INIS)

    Yu Yongding

    1988-04-01

    In this paper, it is pointed out that the model FT-619 Kidney Function Instrument is a new, cost-effective nuclear medicine instrument that takes the leading position in China. The performance of the model FT-619, especially its lead-collimated scintillation detector, has reached the same level as advanced equipment on the world market. The article also describes in detail how the design of the lead collimator and shielding, as well as the detection efficiency, achieved the optimum level, and compares the instrument with foreign products.

  10. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  11. Group theory for unified model building

    International Nuclear Information System (INIS)

    Slansky, R.

    1981-01-01

    The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w × U(1)_w × SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E6, SO10, SU6, F4, SO9, SO5, SO8, SO7, SU4, E7, E8, SU8, SO14, SO18, SO22, and, for completeness, SU3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
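
    As an example of the kind of branching rule such tables contain, the SO(10) spinor decomposes under its SU(5) × U(1) subgroup as follows (in one common charge normalization; conventions differ between references):

    ```latex
    16 = 10_{-1} \oplus \bar{5}_{3} \oplus 1_{-5}
    ```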

  12. δ expansion for local gauge theories. I. A one-dimensional model

    International Nuclear Information System (INIS)

    Bender, C.M.; Cooper, F.; Milton, K.A.; Moshe, M.; Pinsky, S.S.; Simmons, L.M. Jr.

    1992-01-01

    The principles of the δ perturbation theory were first proposed in the context of self-interacting scalar quantum field theory. There it was shown how to expand a (φ²)^{1+δ} theory as a series in powers of δ and how to recover nonperturbative information about a φ⁴ field theory from the δ expansion at δ=1. The purpose of this series of papers is to extend the notions of δ perturbation theory from boson theories to theories having a local gauge symmetry. In the case of quantum electrodynamics one introduces the parameter δ by generalizing the minimal coupling terms to ψ̄(∂−ieA)^δ ψ and expanding in powers of δ. This interaction preserves local gauge invariance for all δ. While there are enormous benefits in using the δ expansion (obtaining nonperturbative results), gauge theories present new technical difficulties not encountered in self-interacting boson theories because the expression (∂−ieA)^δ contains a derivative operator. In the first paper of this series a one-dimensional model whose interaction term has the form ψ̄[d/dt−igφ(t)]^δ ψ is considered. The virtue of this model is that it provides a laboratory in which to study fractional powers of derivative operators without the added complexity of γ matrices. In the next paper of this series we consider two-dimensional electrodynamics and show how to calculate the anomaly in the δ expansion.

  13. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    Science.gov (United States)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent to a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.
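
    The paper's structural cumulative survival estimator is beyond a short sketch, but the instrumental-variables idea it builds on can be illustrated with the classic Wald (ratio) estimator for a continuous outcome. Everything below, the simulated confounder, effect sizes, and variable names, is a hypothetical illustration, not the authors' method.

    ```python
    # Wald IV estimator with a binary instrument Z (e.g., randomly inherited
    # genotype). U is an unobserved confounder that biases the naive regression.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    Z = rng.integers(0, 2, n)                    # instrument
    U = rng.normal(size=n)                       # unmeasured confounder
    X = 0.5 * Z + U + rng.normal(size=n)         # exposure; true effect on Y is 1.0
    Y = 1.0 * X + 2.0 * U + rng.normal(size=n)   # outcome

    naive = np.cov(X, Y)[0, 1] / np.var(X)       # confounded estimate (~2.0)
    wald = (Y[Z == 1].mean() - Y[Z == 0].mean()) / (
           X[Z == 1].mean() - X[Z == 0].mean())  # IV estimate (~1.0)
    print(f"naive: {naive:.2f}, Wald IV: {wald:.2f}")
    ```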

  14. Matrix models from localization of five-dimensional supersymmetric noncommutative U(1) gauge theory

    International Nuclear Information System (INIS)

    Lee, Bum-Hoon; Ro, Daeho; Yang, Hyun Seok

    2017-01-01

    We study localization of five-dimensional supersymmetric U(1) gauge theory on S³ × ℝ²_θ, where ℝ²_θ is a noncommutative (NC) plane. The theory can be isomorphically mapped to three-dimensional supersymmetric U(N→∞) gauge theory on S³ using the matrix representation on a separable Hilbert space on which NC fields linearly act. Therefore the NC space ℝ²_θ allows for a flexible path to derive matrix models via localization from a higher-dimensional supersymmetric NC U(1) gauge theory. The result shows a rich duality between NC U(1) gauge theories and large N matrix models in various dimensions.

  15. Motivation for Instrument Education: A Study from the Perspective of Expectancy-Value and Flow Theories

    Science.gov (United States)

    Burak, Sabahat

    2014-01-01

    Problem Statement: In the process of instrument education, students being unwilling (lacking motivation) to play an instrument or to practise is a problem that educators frequently face. Recognizing the factors motivating the students will yield useful results for instrument educators in terms of developing correct teaching methods and approaches.…

  16. Modelling of XCO2 Surfaces Based on Flight Tests of TanSat Instruments

    Directory of Open Access Journals (Sweden)

    Li Li Zhang

    2016-11-01

    Full Text Available The TanSat carbon satellite is to be launched at the end of 2016. In order to verify the performance of its instruments, a flight test of TanSat instruments was conducted in Jilin Province in September 2015. The flight test area covered a total of about 11,000 km², and the underlying surface cover included several lakes, forest land, grassland, wetland, farmland, a thermal power plant and numerous cities and villages. We modeled the column-averaged dry-air mole fraction of atmospheric carbon dioxide (XCO2) surface based on flight test data, which measured the near- and short-wave infrared (NIR) reflected solar radiation in the absorption bands at around 760 and 1610 nm. However, it is difficult to directly analyze the spatial distribution of XCO2 in the flight area using the limited flight test data, and the approximate XCO2 surface obtained by regression modeling is not very accurate either. We therefore used the high accuracy surface modeling (HASM) platform to fill the gaps where there is no information on XCO2 in the flight test area; HASM takes the approximate surface of XCO2 as its driving field and the XCO2 observations retrieved from the flight test as its optimum control constraints. High accuracy surfaces of XCO2 were constructed with HASM based on the flight's observations. The results showed that the mean XCO2 in the flight test area is about 400 ppm and that XCO2 over urban areas is much higher than in other places. Compared with OCO-2's XCO2, the mean difference is 0.7 ppm and the standard deviation is 0.95 ppm. Therefore, the modelling of the XCO2 surface based on the flight test of the TanSat instruments fell within an expected and acceptable range.
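
    HASM itself solves a constrained surface-fitting problem; as a loose, simplified stand-in for the gap-filling idea (a trend "driving field" corrected toward the observations), one can fit a planar trend and adjust it by inverse-distance weighting of the residuals. All data and parameters below are synthetic placeholders; this is not the HASM algorithm.

    ```python
    # Simplified gap-filling: planar trend as driving field, plus inverse-distance
    # weighting of the residuals at the flight-sample locations. Synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    obs_xy = rng.uniform(0, 100, size=(30, 2))                 # sample locations (km)
    obs = 400 + 0.02 * obs_xy[:, 0] + rng.normal(0, 0.5, 30)   # XCO2 (ppm, synthetic)

    A = np.c_[obs_xy, np.ones(len(obs_xy))]
    coef = np.linalg.lstsq(A, obs, rcond=None)[0]              # trend XCO2 ~ a*x + b*y + c

    def fill(xy, power=2.0):
        trend_at = lambda p: np.c_[p, np.ones(len(p))] @ coef
        resid = obs - trend_at(obs_xy)                         # misfit at observations
        d = np.linalg.norm(xy[:, None, :] - obs_xy[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-6) ** power                 # inverse-distance weights
        return trend_at(xy) + (w * resid).sum(axis=1) / w.sum(axis=1)

    grid = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0]])
    print(fill(grid))  # XCO2 estimates (ppm) at unsampled points
    ```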

  17. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen out index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, which can make the samples' features preferably reflect and represent the totals to some extent. The forewarning level is judged by the maximum probability rule, and management strategies are then proposed based on local conditions, with the aim of downgrading heavy warnings. This study takes Taihu Basin as an example. After applying and verifying the forewarning model against actual and simulated data for water pollution risk from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory with a flexible method, reasonable in result with a simple structure, and it has strong logic superiority and regional adaptability, providing a new way of warning of water pollution risk.
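
    The Bayes step itself is compact: combine a prior over forewarning levels with the likelihood of the observed index, then apply the maximum-probability rule. The levels, priors, and likelihoods below are illustrative assumptions, not the Taihu Basin values.

    ```python
    # Posterior over forewarning levels by Bayes' rule (toy numbers).
    levels = ["light", "moderate", "severe"]
    prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}
    likelihood = {"light": 0.05, "moderate": 0.25, "severe": 0.70}  # P(index | level)

    evidence = sum(prior[l] * likelihood[l] for l in levels)
    posterior = {l: prior[l] * likelihood[l] / evidence for l in levels}

    warning = max(posterior, key=posterior.get)   # maximum-probability rule
    print(posterior, "->", warning)               # -> severe
    ```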

  18. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  19. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  20. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    Science.gov (United States)

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. © The Author(s) 2015.

  1. Conformal field theories, Coulomb gas picture and integrable models

    International Nuclear Information System (INIS)

    Zuber, J.B.

    1988-01-01

    The aim of the study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, and modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear, and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: all the various restrictions have not been identified, nor the modular invariants completely classified

  2. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex relationship among participants. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent. Modeling trust needs to consider interaction history, recommendation, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects, service reliability, feedback effectiveness, and recommendation credibility, to get a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective. The free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
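
    In its simplest reading, the trust degree is an aggregation over the three named aspects. The weighted-sum sketch below is an assumption for illustration; the paper's actual formulas and its game-theoretic punishment mechanism are more involved.

    ```python
    # Toy aggregation of the three trust aspects into a single trust degree.
    def trust_degree(service_reliability, feedback_effectiveness,
                     recommendation_credibility, weights=(0.5, 0.3, 0.2)):
        aspects = (service_reliability, feedback_effectiveness,
                   recommendation_credibility)
        return sum(w * a for w, a in zip(weights, aspects))

    print(trust_degree(0.9, 0.7, 0.6))  # ~0.78
    ```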

  3. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Science.gov (United States)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives the inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams to the observation level. This paper describes the modelling approach we have conducted for TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing and pointing block parameters calculated from the observation geometry. This approach is considered an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software assists us to effectively handle the high density of planned orbits with an increasing volume of scientific data and to successfully meet opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs together with the flight dynamics products to generate the Pointing Requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.

  4. Suprathermal ions in the solar wind from the Voyager spacecraft: Instrument modeling and background analysis

    International Nuclear Information System (INIS)

    Randol, B M; Christian, E R

    2015-01-01

    Using publicly available data from the Voyager Low Energy Charged Particle (LECP) instruments, we investigate the form of the solar wind ion suprathermal tail in the outer heliosphere inside the termination shock. This tail has a commonly observed form in the inner heliosphere, that is, a power law with a particular spectral index. The Voyager spacecraft have taken data beyond 100 AU, farther than any other spacecraft. However, during extended periods of time, the data appears to be mostly background. We have developed a technique to self-consistently estimate the background seen by LECP due to cosmic rays using data from the Voyager cosmic ray instruments and a simple, semi-analytical model of the LECP instruments

  5. Musical Instrument Classification Based on Nonlinear Recurrence Analysis and Supervised Learning

    Directory of Open Access Journals (Sweden)

    R.Rui

    2013-04-01

    Full Text Available In this paper, the phase space reconstruction of time series produced by different instruments is discussed based on nonlinear dynamic theory. The dense ratio, a novel quantitative recurrence parameter, is proposed to describe the differences among wind instruments, stringed instruments and keyboard instruments in phase space by analyzing the recursive properties of each instrument. Furthermore, a novel supervised learning algorithm for automatic classification of individual musical instrument signals is presented, derived from the idea of the supervised non-negative matrix factorization (NMF) algorithm. In our approach, the orthogonal basis matrix can be obtained without updating the matrix iteratively, which NMF is unable to do. The experimental results indicate that the accuracy of the proposed method is improved by 3% compared with conventional features in individual instrument classification.
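
    A generic way to realize the classify-by-basis idea (a stand-in, not the authors' orthogonal-basis construction) is to learn one nonnegative basis per instrument family and assign a new feature vector to the family whose basis reconstructs it best:

    ```python
    # Per-class NMF bases; classify by smallest reconstruction error.
    # Random nonnegative features stand in for real spectral features.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(1)
    classes = ["wind", "string", "keyboard"]
    train = {c: np.abs(rng.normal(size=(40, 12))) for c in classes}

    models = {c: NMF(n_components=4, max_iter=500).fit(X) for c, X in train.items()}

    def classify(x):
        errors = {}
        for c, m in models.items():
            h = m.transform(x.reshape(1, -1))            # activations on this basis
            errors[c] = np.linalg.norm(x - h @ m.components_)
        return min(errors, key=errors.get)

    print(classify(np.abs(rng.normal(size=12))))
    ```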

  6. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    Science.gov (United States)

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  7. Modeling and Theories of Pathophysiology and Physiology of the Basal Ganglia–Thalamic–Cortical System: Critical Analysis

    Science.gov (United States)

    Montgomery Jr., Erwin B.

    2016-01-01

    Theories impact the movement disorders clinic, not only affecting the development of new therapies but also determining how current therapies are used. Models are theories that are procedural rather than declarative. Theories and models are important because, as argued by Kant, one cannot know the thing-in-itself (das Ding an sich) and only a model is knowable. Further, biological variability forces higher-level abstraction relevant for all variants. It is that abstraction that is the raison d'être of theories and models. Theories "connect the dots" to move from correlation to causation. The necessity of theory makes theories helpful or counterproductive. Theories and models of the pathophysiology and physiology of the basal ganglia–thalamic–cortical system do not spontaneously arise but have a history and consequently are legacies. Over the last 40 years, numerous theories and models of the basal ganglia have been proposed only to be forgotten or dismissed, rarely critiqued. It is not harsh to say that current popular theories positing increased neuronal activities in the Globus Pallidus Interna (GPi), excessive beta oscillations and increased synchronization not only fail to provide an adequate explication but are inconsistent with many observations. It is likely that their shared intellectual and epistemic inheritance plays a factor in their shared failures. These issues are critically examined. How one is to derive theories and models, and how one can hope these will be better, is explored as well. PMID:27708569

  8. Exploratory and Creative Properties of Physical-Modeling-based Musical Instruments

    DEFF Research Database (Denmark)

    Gelineck, Steven

    Digital musical instruments are developed to enable musicians to find new ways of expressing themselves. The development and evaluation of these instruments can be approached from many different perspectives depending on which capabilities one wants the musicians to have. This thesis attempts to approach development and evaluation of these instruments with the notion that instruments today are able to facilitate the creative process that is so crucial for creating music. The fundamental question pursued throughout the thesis is how creative work processes of composers of electronic music can be supported and even challenged by the instruments they use. What is it that makes one musical instrument more creatively inspiring than another, and how do we evaluate how well it succeeds? In order to present answers to these questions, the thesis focusses on the sound synthesis technique of physical…

  9. Closing the gap between values and behavior: A means-end theory of lifestyle

    DEFF Research Database (Denmark)

    Brunsø, Karen; Scholderer, Joachim; Grunert, Klaus G.

    Means-end chain theory and lifestyle are reconstructed within a dual-process framework, incorporating bottom-up and top-down information-processing routes. The bottom-up route is defined as a hierarchical categorization process, and the top-down route as goal-directed action. Lifestyle, then … condition for both information-processing routes to reach their ends, predicting a strict mediation model. The model is tested on survey data gathered in France in 1998, using the list of values as a measure of abstract goal states, the food-related lifestyle instrument as a measure of intervening knowledge structures, and a newly constructed behavior list as a measure of behavior. Data were analyzed by means of structural equation modeling. Compared against five alternative model structures, the strict mediation model fitted the data best, thus confirming the predictions derived from the reconstructed theory.

  10. Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons.

    Science.gov (United States)

    Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang

    2017-07-06

    Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
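
    For orientation, the Gaussian MCS angle in this regime is often summarized by Highland's parameterization (used with a 14.1 MeV constant in Gottschalk's work; the beam numbers below are illustrative for a ~160 MeV radiotherapy proton, not values from the paper):

    ```python
    # Highland's characteristic multiple-scattering angle theta_0 (radians).
    import math

    def highland_theta0(p_MeV_c, beta, z, x_over_X0):
        """Scattering angle for charge z after traversing x/X0 radiation lengths."""
        return (14.1 / (p_MeV_c * beta)) * abs(z) * math.sqrt(x_over_X0) * (
            1 + (1 / 9) * math.log10(x_over_X0))

    # ~160 MeV proton (p ~ 571 MeV/c, beta ~ 0.52) through 1 cm of water (X0 ~ 36.1 cm)
    print(highland_theta0(571.0, 0.52, 1, 1.0 / 36.1))  # ~6.5e-3 rad
    ```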

  11. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  12. NASA Instrument Cost Model for Explorer-Like Mission Instruments (NICM-E)

    Science.gov (United States)

    Habib-Agahi, Hamid; Fox, George; Mrozinski, Joe; Ball, Gary

    2013-01-01

    NICM-E is a cost estimating relationship that supplements the traditional NICM System Level CERs for instruments flown on NASA Explorer-like missions, which have the following three characteristics: 1) they fly on Class C missions, 2) major development is led and performed by universities or research foundations, and 3) they have a significant level of inheritance.

  13. A qualitative evaluation of policy instruments used to improve energy performance of existing private dwellings in the Netherlands

    International Nuclear Information System (INIS)

    Murphy, Lorraine; Meijer, Frits; Visscher, Henk

    2012-01-01

    Climate change policies in the Netherlands recognise the importance of existing dwellings. Efforts to gain these energy savings are led at national level by policy instruments such as the Energy Performance Certificate, covenants, economic and information tools. These instruments reflect a policy style described as consensus based and incentivising. However, this approach has been subject to criticism with suggestions that alternatives are required. As a first step towards conceptualising alternatives previous evaluations and stakeholder interviews are used to assess instruments. Elements from the theory based evaluation method combined with concepts from policy instrument and energy policy literature form an evaluation framework. Results demonstrate weak impact of some key instruments. Underlying theories associated with instruments are often lost during implementation or remain unsubstantiated. Policy instrument and energy policy concepts are evident but are far from pervasive. Results show that current instruments are poorly equipped to forge a long-term energy saving strategy for existing dwellings. It is further demonstrated that complexity with existing dwellings is not only limited to frequently cited barriers but to the intricacies of designing and operating a well-orchestrated instrument mix. - Highlights: ► Instruments are evaluated using the theory based method and normative concepts. ► Lack of monitoring and evaluation data affects impact assessment. ► Impact and normative concepts are reflected in part in individual instruments. ► A coherent strategy that demonstrates impact and reflects concepts is absent. ► Results form a first step from which to conceptualise alternatives.

  14. Two problems from the theory of semiotic control models. I. Representations of semiotic models

    Energy Technology Data Exchange (ETDEWEB)

    Osipov, G S

    1981-11-01

    Two problems from the theory of semiotic control models are stated: the representation of models and their semantic analysis. Algebraic representation of semiotic models, covering of representations, and their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.

  15. Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks

    International Nuclear Information System (INIS)

    Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.

    2005-01-01

    Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)

  16. Functional techniques in quantum field theory and two-dimensional models

    International Nuclear Information System (INIS)

    Souza, C. Farina de.

    1985-03-01

    Functional methods applied to Quantum Field Theory are studied. It is shown how to construct the Generating Functional using three of the most important methods in the literature, due to Feynman, Symanzik and Schwinger. The Axial Anomaly is discussed in the usual way, and a non-perturbative method due to Fujikawa to obtain this anomaly in the path integral formalism is presented. The ''Roskies-Shaposnik-Fujikawa method'', which makes use of Fujikawa's original idea to solve two-dimensional models, is introduced in the Schwinger model, which, in turn, is applied to obtain the exact solution of the axial model. It is discussed briefly how different regularization procedures can affect the theory in question. (author)

  17. Mean field theory of nuclei and shell model. Present status and future outlook

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    2003-01-01

    Many of the recent topics of nuclear structure concern the problems of unstable nuclei. It has been revealed experimentally that nuclear halos and neutron skins, as well as cluster or molecule-like structures, can be present in unstable nuclei, and that magic numbers well established in stable nuclei occasionally disappear while new ones appear. The shell model based on the mean field approximation has been successfully applied to stable nuclei to explain nuclear structure quantitatively as a finite many-body system, and it is considered the standard model at present. Whether unstable nuclei can be understood on the same model basis is a question that touches the fundamental principles of nuclear structure theories. In this lecture, the fundamental concepts and the framework of the theory of nuclear structure based on the mean field theory and the shell model are presented, to make the problems clear and to suggest directions for future research. First, fundamental properties of nuclei are described under the subtitles: saturation and magic numbers, nuclear force and effective interactions, nuclear matter, and LS splitting. Then the mean field theory is presented under the subtitles: the potential model, the mean field theory, Hartree-Fock approximation for nuclear matter, density dependent force, semiclassical mean field theory, mean field theory and symmetry, Skyrme interaction and density functional, density matrix expansion, finite range interactions, effective masses, and motion of center of mass. The subsequent section is devoted to the shell model, with the subtitles: beyond the mean field approximation, core polarization, effective interaction of the shell model, one-particle wave function, nuclear deformation and shell model, and shell model of cross shell. Finally, the structure of unstable nuclei is discussed, with the subtitles: general remarks on the study of unstable nuclear structure, asymptotic behavior of wave

  18. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords: Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  19. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    The aim is to advance knowledge and understanding of turbulence theory. Specific problems to be addressed include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  20. Growing up and Role Modeling: A Theory in Iranian Nursing Students' Education

    OpenAIRE

    Nouri, Jamileh Mokhtari; Ebadi, Abbas; Alhani, Fatemeh; Rejeh, Nahid

    2014-01-01

    One of the key strategies in students' learning is being affected by models. Understanding the role-modeling process in education will help to make greater use of this training strategy. The aim of this grounded theory study was to explore Iranian nursing students' and instructors' experiences of the role-modeling process. Data were analyzed by the Glaserian Grounded Theory methodology through semi-structured interviews with 7 faculty members, 2 nursing students; the three focus group discussions ...

  1. Using the Rasch measurement model to design a report writing assessment instrument.

    Science.gov (United States)

    Carlson, Wayne R

    2013-01-01

    This paper describes how the Rasch measurement model was used to develop an assessment instrument designed to measure student ability to write law enforcement incident and investigative reports. The ability to write reports is a requirement of all law enforcement recruits in the state of Michigan and is a part of the state's mandatory basic training curriculum, which is promulgated by the Michigan Commission on Law Enforcement Standards (MCOLES). Recently, MCOLES conducted research to modernize its training and testing in the area of report writing. A structured validation process was used, which included: a) an examination of the job tasks of a patrol officer, b) input from content experts, c) a review of the professional research, and d) the creation of an instrument to measure student competency. The Rasch model addressed several measurement principles that were central to construct validity, which were particularly useful for assessing student performances. Based on the results of the report writing validation project, the state established a legitimate connectivity between the report writing standard and the essential job functions of a patrol officer in Michigan. The project also produced an authentic instrument for measuring minimum levels of report writing competency, which generated results that are valid for inferences of student ability. Ultimately, the state of Michigan must ensure the safety of its citizens by licensing only those patrol officers who possess a minimum level of core competency. Maintaining the validity and reliability of both the training and testing processes can ensure that the system for producing such candidates functions as intended.
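
    Although the record does not spell out the model's equation, the dichotomous Rasch model it refers to is standard and may be useful for orientation: the probability that examinee n succeeds on item i depends only on the difference between person ability θ_n and item difficulty b_i,

        P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}

    Because ability and difficulty enter only through their difference, persons and items can be placed on a common interval scale, which is what makes the model attractive for the construct-validity arguments described above.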

  2. Development of a theory-based (PEN-3 and Health Belief Model), culturally relevant intervention on cervical cancer prevention among Latina immigrants using intervention mapping.

    Science.gov (United States)

    Scarinci, Isabel C; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea

    2012-01-01

    The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption and implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the sociocultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to the development of matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants.

  3. Consistent constraints on the Standard Model Effective Field Theory

    International Nuclear Information System (INIS)

    Berthier, Laure; Trott, Michael

    2016-01-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut-off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include, as an illustrative example.

  4. 2 + 1 quantum gravity as a toy model for the 3 + 1 theory

    International Nuclear Information System (INIS)

    Ashtekar, A.; Husain, V.; Smolin, L.; Samuel, J.; Utah Univ., Salt Lake City, UT

    1989-01-01

    2 + 1 Einstein gravity is used as a toy model for testing a program for non-perturbative canonical quantisation of the 3 + 1 theory. The program can be successfully implemented in the model and leads to a surprisingly rich quantum theory. (author)

  5. Two experimental tests of relational models of procedural justice: non-instrumental voice and authority group membership.

    Science.gov (United States)

    Platow, Michael J; Eggins, Rachael A; Chattopadhyay, Rachana; Brewer, Greg; Hardwick, Lisa; Milsom, Laurin; Brocklebank, Jacinta; Lalor, Thérèse; Martin, Rowena; Quee, Michelle; Vassallo, Sara; Welsh, Jenny

    2013-06-01

    In both a laboratory experiment (in Australia) using university as the basis of group membership, and a scenario experiment (in India) using religion as the basis of group membership, we observe more favourable respect and fairness ratings in response to an in-group authority than an out-group authority who administers non-instrumental voice. Moreover, we observe in our second experiment that reported likelihood of protest (herein called "social-change voice") was relatively high following non-instrumental voice from an out-group authority, but relatively low following non-instrumental voice from an in-group authority. Our findings are consistent with relational models of procedural justice, and extend the work by examining likely use of alternative forms of voice as well as highlighting the relative importance of instrumentality. ©2012 The British Psychological Society.

  6. Deformed type 0A matrix model and super-Liouville theory for fermionic black holes

    International Nuclear Information System (INIS)

    Ahn, Changrim; Kim, Chanju; Park, Jaemo; Suyama, Takao; Yamamoto, Masayoshi

    2006-01-01

    We consider a ĉ = 1 model in the fermionic black hole background. For this purpose we consider a model which contains both the N = 1 and the N = 2 super-Liouville interactions. We propose that this model is dual to a recently proposed type 0A matrix quantum mechanics model with vortex deformations. We support our conjecture by showing that non-perturbative corrections to the free energy computed by both the matrix model and the super-Liouville theories agree exactly when the N = 2 interaction is treated as a small perturbation. We also show that a two-point function on the sphere calculated from the deformed type 0A matrix model is consistent with that of the N = 2 super-Liouville theory when the N = 1 interaction becomes small. This duality between the matrix model and super-Liouville theories leads to a conjecture for arbitrary n-point correlation functions of the N = 1 super-Liouville theory on the sphere.

  7. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
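
    As a minimal sketch of the sampling task the report addresses (generating independent samples of a stochastic process that can feed deterministic simulation codes), the code below draws realizations of a stationary Gaussian process via the spectral representation. The squared-exponential covariance and all parameter values are placeholder assumptions, not the report's algorithms.

        import numpy as np

        def gaussian_process_samples(n_samples=5, n_points=1024, dt=0.01,
                                     sigma=1.0, corr_len=0.1, n_modes=256, seed=0):
            # Spectral-representation sampler for a stationary, zero-mean
            # Gaussian process with squared-exponential covariance
            # C(tau) = sigma^2 * exp(-(tau/corr_len)^2). Placeholder model.
            rng = np.random.default_rng(seed)
            t = np.arange(n_points) * dt
            w_max = 8.0 / corr_len                 # truncation frequency
            dw = w_max / n_modes
            w = (np.arange(n_modes) + 0.5) * dw
            # One-sided power spectral density of the covariance above.
            psd = sigma**2 * corr_len / np.sqrt(np.pi) * np.exp(-(w * corr_len)**2 / 4.0)
            amp = np.sqrt(2.0 * psd * dw)          # amplitude of each cosine mode
            phases = rng.uniform(0.0, 2.0 * np.pi, (n_samples, n_modes))
            # x_k(t) = sum_j amp_j * cos(w_j * t + phi_kj)
            arg = w[None, :, None] * t[None, None, :] + phases[:, :, None]
            return t, np.einsum('j,kjt->kt', amp, np.cos(arg))

        t, x = gaussian_process_samples()
        print(x.shape, x.std())   # sample std should be close to sigma

    Each row of x is an independent realization that could serve as an input time history or boundary condition for a downstream deterministic code.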

  8. Prospect Theory in the Heterogeneous Agent Model

    Czech Academy of Sciences Publication Activity Database

    Polach, J.; Kukačka, Jiří

    (2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords: Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf

  9. Model building with a dynamical volume element in gravity, particle theory and theories of extended object

    International Nuclear Information System (INIS)

    Guendelman, E.

    2004-01-01

    Full Text: The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: 1. 4-D theories describing gravity and matter fields, 2. parametrization invariant theories of extended objects, and 3. higher dimensional theories including gravity and matter fields. In case 1, a large number of new effects appear: (i) spontaneous breaking of scale invariance associated with integration over degrees of freedom related to the measure, (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter, (iii) the cosmic coincidence between dark energy and dark matter is natural, (iv) quintessence scenarios with automatic decoupling of the quintessence scalar from ordinary matter, but not from dark matter, are obtained. In case 2, for theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension, (ii) string models of non-abelian confinement, and (iii) the possibility of new Weyl-invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. In case 3, in brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to construct naturally models where only the extra dimensions get curved and the 4-D observable space-time remains flat.

  10. Anomaly-free gauges in superstring theory and double supersymmetric sigma-model

    International Nuclear Information System (INIS)

    Demichev, A.P.; Iofa, M.Z.

    1991-01-01

    Superharmonic gauge, which is a nontrivial analog of the harmonic gauge in bosonic string theory, is constructed for the fermionic superstrings. In contrast to the conformal gauge, the harmonic gauge in bosonic string theory and the superharmonic gauge in superstring theory are shown to be free from the previously discovered BRST anomaly (in the critical dimension) in higher orders of string perturbation theory and thus provide the setup for consistent quantization of (super)string theory. The superharmonic gauge appears to be closely connected with the supersymmetric σ-model whose target space is also a supermanifold. 28 refs

  11. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Full Text Available Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, one which could better deal with the different aspects of theory replacement. I show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning collective choice of a paradigm.

  12. Exact string theory model of closed timelike curves and cosmological singularities

    International Nuclear Information System (INIS)

    Johnson, Clifford V.; Svendsen, Harald G.

    2004-01-01

    We study an exact model of string theory propagating in a space-time containing regions with closed timelike curves (CTCs) separated from a finite cosmological region bounded by a big bang and a big crunch. The model is a nontrivial embedding of the Taub-NUT geometry into heterotic string theory with a full conformal field theory (CFT) definition, discovered over a decade ago as a heterotic coset model. Having a CFT definition makes this an excellent laboratory for the study of the stringy fate of CTCs, the Taub cosmology, and the Milne/Misner-type chronology horizon which separates them. In an effort to uncover the role of stringy corrections to such geometries, we calculate the complete set of α′ corrections to the geometry. We observe that the key features of Taub-NUT persist in the exact theory, together with the emergence of a region of space with Euclidean signature bounded by timelike curvature singularities. Although such remarks are premature, their persistence in the exact geometry suggests that string theory is able to make physical sense of the Milne/Misner singularities and the CTCs, despite their pathological character in general relativity. This may also support the possibility that CTCs may be viable in some physical situations, and may be a natural ingredient in pre-big bang cosmological scenarios.

  13. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, computer, and communication systems. • A chapter on ...

  14. From mechanism to virtue: Evaluating Nudge theory

    NARCIS (Netherlands)

    Kosters, M.; van der Heijden, J.

    2015-01-01

    Ever since Thaler and Sunstein published their influential book Nudge, the book and the theory it presents have received great praise and opposition. Nudge theory, and more particularly, nudging may be considered an additional strategy providing some novel instruments to the already rich governance

  15. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline through the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education regarding current evaluation models and theories. We included all study designs. We found 810 articles related to our topic and finally included 63 with full text. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. Common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  16. Background field method in gauge theories and in nonlinear sigma models

    International Nuclear Information System (INIS)

    van de Ven, A.E.M.

    1986-01-01

    This dissertation constitutes a study of the ultraviolet behavior of gauge theories and two-dimensional nonlinear sigma-models by means of the background field method. After a general introduction in chapter 1, chapter 2 presents algorithms which generate the divergent terms in the effective action at one-loop for arbitrary quantum field theories in flat spacetime of dimension d ≤ 11. It is demonstrated that global N = 1 supersymmetric Yang-Mills theory in six dimensions is one-loop UV-finite. Chapter 3 presents an algorithm which produces the divergent terms in the effective action at two-loops for renormalizable quantum field theories in a curved four-dimensional background spacetime. Chapter 4 presents a study of the two-loop UV-behavior of two-dimensional bosonic and supersymmetric non-linear sigma-models which include a Wess-Zumino-Witten term. It is found that, to this order, supersymmetric models on quasi-Ricci flat spaces are UV-finite and the β-functions for the bosonic model depend only on torsionful curvatures. Chapter 5 summarizes a superspace calculation of the four-loop β-function for two-dimensional N = 1 and N = 2 supersymmetric non-linear sigma-models. It is found that besides the one-loop contribution, which vanishes on Ricci-flat spaces, the β-function receives four-loop contributions which do not vanish in the Ricci-flat case. Implications for superstrings are discussed. Chapters 6 and 7 treat the details of these calculations

  17. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model, and Self-Determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...

  18. Perturbation theory around the Wess-Zumino-Witten model

    International Nuclear Information System (INIS)

    Hasseln, H. v.

    1991-05-01

    We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW-model is shown to have the same β-function (at least to order g^2) as the fermionic theory with a four-fermion interaction. (orig.)

  19. Theories and control models and motor learning: clinical applications in neuro-rehabilitation.

    Science.gov (United States)

    Cano-de-la-Cuerda, R; Molero-Sánchez, A; Carratalá-Tejada, M; Alguacil-Diego, I M; Molina-Rueda, F; Miangolarra-Page, J C; Torricelli, D

    2015-01-01

    In recent decades there has been special interest in theories that could explain the regulation of motor control, and in their applications. These theories are often based on models of brain function, philosophically reflecting different criteria on how movement is controlled by the brain, each emphasising different neural components of movement. The concept of motor learning, understood as the set of internal processes associated with practice and experience that produce relatively permanent changes in the ability to perform motor activities through a specific skill, is also relevant in the context of neuroscience. Thus, both motor control and learning are seen as key fields of study for health professionals in the field of neuro-rehabilitation. The major theories of motor control are described, which include motor programming theory, systems theory, the theory of dynamic action, and the theory of parallel distributed processing, as well as the factors that influence motor learning and its applications in neuro-rehabilitation. At present there is no consensus on which theory or model best explains the regulation of motor control. Theories of motor learning should be the basis for motor rehabilitation. New research should apply the knowledge generated in the fields of control and motor learning to neuro-rehabilitation. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier España. All rights reserved.

  20. I can do that: the impact of implicit theories on leadership role model effectiveness.

    Science.gov (United States)

    Hoyt, Crystal L; Burnette, Jeni L; Innella, Audrey N

    2012-02-01

    This research investigates the role of implicit theories in influencing the effectiveness of successful role models in the leadership domain. Across two studies, the authors test the prediction that incremental theorists ("leaders are made") compared to entity theorists ("leaders are born") will respond more positively to being presented with a role model before undertaking a leadership task. In Study 1, measuring people's naturally occurring implicit theories of leadership, the authors showed that after being primed with a role model, incremental theorists reported greater leadership confidence and less anxious-depressed affect than entity theorists following the leadership task. In Study 2, the authors demonstrated the causal role of implicit theories by manipulating participants' theory of leadership ability. They replicated the findings from Study 1 and demonstrated that identification with the role model mediated the relationship between implicit theories and both confidence and affect. In addition, incremental theorists outperformed entity theorists on the leadership task.

  1. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
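
    The abstract does not reproduce the report's equations, but the standard root-sum-square combination of independent elemental errors gives the flavour of such a methodology. The sketch below is a generic illustration with hypothetical error terms; it is not the report's procedure.

        import math

        def rss_uncertainty(bias_terms, precision_terms, coverage=2.0):
            # Root-sum-square combination of independent elemental errors.
            # bias_terms / precision_terms: 1-sigma elemental uncertainties
            # in the measured units. Returns an expanded uncertainty at the
            # given coverage factor (k = 2 is roughly 95%). Illustrative
            # only; assumes all terms are independent.
            b = math.sqrt(sum(e**2 for e in bias_terms))
            s = math.sqrt(sum(e**2 for e in precision_terms))
            return coverage * math.hypot(b, s)

        # Example: a measurement with calibration, installation and readout
        # bias terms plus one repeatability term (hypothetical values).
        u = rss_uncertainty(bias_terms=[0.5, 0.3, 0.2], precision_terms=[0.4])
        print(f"expanded uncertainty: +/- {u:.2f} units")

    A pretest estimate would use design values for each elemental term, while a post-test estimate would replace them with values inferred from the measured data.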

  2. Theory for the three-dimensional Mercedes-Benz model of water

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-11-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.

  3. Theory for the three-dimensional Mercedes-Benz model of water.

    Science.gov (United States)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A

    2009-11-21

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
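
    For reference, the MB-style interaction described in both records combines a Lennard-Jones sphere-sphere term with a Gaussian-modulated hydrogen-bond reward. Written schematically (the precise 3D tetrahedral angular factors are in the paper itself, not reproduced in the records):

        U(\mathbf{X}_i, \mathbf{X}_j) = U_{\mathrm{LJ}}(r_{ij})
            + \epsilon_{\mathrm{HB}}\, G(r_{ij} - r_{\mathrm{HB}})
              \sum_{k,l} G(\hat{\mathbf{i}}_k \cdot \hat{\mathbf{u}}_{ij} - 1)\,
                         G(\hat{\mathbf{j}}_l \cdot \hat{\mathbf{u}}_{ij} + 1),
        \qquad G(x) = e^{-x^2 / 2\sigma^2}

    Here r_{ij} is the pair separation, û_{ij} the unit vector joining the two molecules, and î_k, ĵ_l the unit vectors of the bonding arms, so a hydrogen bond is rewarded when an arm of molecule i points at molecule j while an arm of molecule j points back.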

  4. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
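
    To make the game-theoretic ingredient concrete, the sketch below encodes a two-driver lane-change interaction as a bimatrix game and enumerates pure-strategy Nash equilibria by best-response checking. The strategies and payoff values are invented for illustration and are not taken from the paper.

        import itertools

        # Hypothetical payoffs (row driver, column driver): 'Y' = yield, 'K' = keep lane.
        payoffs = {
            ('Y', 'Y'): (2, 2),   # both yield: safe but slow
            ('Y', 'K'): (1, 3),   # row yields, column gains speed
            ('K', 'Y'): (3, 1),
            ('K', 'K'): (0, 0),   # neither yields: conflict / hard braking
        }
        strategies = ('Y', 'K')

        def is_nash(row, col):
            # Neither driver can gain by unilaterally switching strategy.
            r, c = payoffs[(row, col)]
            best_row = all(payoffs[(s, col)][0] <= r for s in strategies)
            best_col = all(payoffs[(row, s)][1] <= c for s in strategies)
            return best_row and best_col

        equilibria = [p for p in itertools.product(strategies, repeat=2) if is_nash(*p)]
        print(equilibria)   # [('Y', 'K'), ('K', 'Y')]: a chicken-like interaction

    In a traffic simulation, each lane-change opportunity would be resolved by such a game, with payoffs derived from speed gains and collision risk.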

  5. Modelling machine ensembles with discrete event dynamical system theory

    Science.gov (United States)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
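
    The abstract's description of a local model (states, an event alphabet, an initial state, a partial transition function, and event durations) maps directly onto a small data structure. The sketch below renders that tuple generically; the welding submachine, its states, and its timings are hypothetical, not taken from the original report.

        from dataclasses import dataclass

        @dataclass
        class LocalModel:
            # The tuple from the abstract: states, event alphabet, initial
            # state, partial transition function, and per-event durations.
            states: set
            events: set
            initial: str
            delta: dict      # (state, event) -> next state; absent keys = disabled
            duration: dict   # event -> time required

            def enabled(self, state):
                # Events this submachine can execute from the given state.
                return [e for e in self.events if (state, e) in self.delta]

            def step(self, state, event):
                # Fire one event; returns (next state, elapsed time).
                return self.delta[(state, event)], self.duration[event]

        # Hypothetical welding submachine for an in-space construction ensemble.
        welder = LocalModel(
            states={'idle', 'positioning', 'welding'},
            events={'move', 'weld', 'done'},
            initial='idle',
            delta={('idle', 'move'): 'positioning',
                   ('positioning', 'weld'): 'welding',
                   ('welding', 'done'): 'idle'},
            duration={'move': 5.0, 'weld': 12.0, 'done': 0.0},
        )
        print(welder.enabled('idle'))        # -> ['move']
        print(welder.step('idle', 'move'))   # -> ('positioning', 5.0)

    A global model would compose several such local models, restricting their parallel operation with shared-resource constraints.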

  6. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  7. Economic instruments for environmental policy making in Ontario

    International Nuclear Information System (INIS)

    Barg, S.; Duraiappah, A.; Van Exan, S.

    2000-01-01

    The conditions and approaches required for a successful implementation of economic instruments in Ontario are reviewed. The advantages and disadvantages of economic instruments are discussed, as are some design issues. Some best practices and practical experiences from Canada, the United States, and Europe are examined through the use of nine specific case studies. Each one highlights a different environmental challenge, such as energy efficiency, air pollution, water pollution, or waste management, along with the solutions that were implemented. The situations described were not all successful, but there is much to be learned from unsuccessful episodes. Lessons learned from the review of the case studies are presented, and points to ponder when using economic instruments in Ontario are highlighted. The command and control policy instrument must be kept in context when considering economic instruments. The reasons that underlie economic theory's preference for economic instruments are discussed. The different types of economic instruments are described, and considerations related to the design and comparison of economic instruments are briefly discussed. The authors conclude with several points to ponder: there are a number of options available, details must not be neglected, consultation with the interested parties is important, there is a need for frequent reassessment, and using a number of instruments is helpful. 55 refs., tabs., figs

  8. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  9. Halo modelling in chameleon theories

    International Nuclear Information System (INIS)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations
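
    One ingredient named in both records, the Navarro-Frenk-White halo density profile, is standard and worth writing out for reference:

        \rho_{\mathrm{NFW}}(r) = \frac{\rho_s}{(r/r_s)\,(1 + r/r_s)^2}

    Its two parameters, the characteristic density ρ_s and scale radius r_s, are usually traded for halo mass and concentration c = r_vir/r_s; the concentration is the quantity that the chameleon mass-concentration scaling relation described above rescales relative to the ΛCDM calibration.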

  10. Risk Route Choice Analysis and the Equilibrium Model under Anticipated Regret Theory

    Directory of Open Access Journals (Sweden)

    Pengcheng Yuan

    2014-02-01

    Full Text Available The assumption about travellers' route choice behaviour has a major influence on traffic flow equilibrium analysis. Previous studies of travellers' route choice were mainly based on expected utility maximization theory. However, with gradually increasing knowledge about the uncertainty of the transportation system, researchers have realized that expected utility maximization theory is quite constraining, because it requires travellers to be 'absolutely rational'; in fact, travellers are not truly 'absolutely rational'. Anticipated regret theory proposes an alternative framework to the traditional risk-taking in route choice behaviour which might be more scientific and reasonable. We have applied anticipated regret theory to the analysis of the risky route choosing process, and constructed an anticipated regret utility function. Through a simple case with two parallel routes, the route choice results influenced by the degree of risk aversion, the degree of regret and the degree of environmental risk have been analyzed. Moreover, a user equilibrium model based on anticipated regret theory has been established. The equivalence and uniqueness of the model are proved; an efficacious algorithm is also proposed to solve the model. Both the model and the algorithm are demonstrated on a real network. In an experiment, the model results were compared with real data. It was found that the model results can be similar to the real data if a proper regret degree parameter is selected. This illustrates that the model can better explain risky route choice behaviour. Moreover, it was also found that the traveller's regret degree increases when the environment becomes more and more risky.
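
    The abstract does not reproduce the constructed utility function. As a point of reference, a generic anticipated-regret utility for a choice between route x and a forgone route y (in the spirit of Loomes-Sugden regret theory, not necessarily the paper's exact form) is:

        V(x \mid y) = u(x) + \beta\, f\!\left(u(x) - u(y)\right),
        \qquad f(0) = 0,\ f' > 0

    where u is a riskless utility, β ≥ 0 weights anticipated regret (when u(x) < u(y)) against rejoicing (when u(x) > u(y)). A route-choice equilibrium then replaces expected utility with the expectation of V over the uncertain travel times of both routes.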

  11. Theory of positive disintegration as a model of adolescent development.

    Science.gov (United States)

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost half a century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described as a process of transition from lower to higher levels of mental life, stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail and it is proposed that they represent behaviour of the early, middle and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.

  12. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  13. Mixmaster cosmological model in theories of gravity with a quadratic Lagrangian

    International Nuclear Information System (INIS)

    Barrow, J.D.; Sirousse-Zia, H.

    1989-01-01

    We use the method of matched asymptotic expansions to examine the behavior of the vacuum Bianchi type-IX mixmaster universe in a gravity theory derived from a purely quadratic gravitational Lagrangian. The chaotic behavior characteristic of the general-relativistic mixmaster model disappears and the asymptotic behavior is of the monotonic, nonchaotic form found in the exactly soluble Bianchi type-I models of the quadratic theory. The asymptotic behavior far from the singularity is also found to be of monotonic nonchaotic type

  14. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition has been substantially enlarged and contains new results and additional sections in the different chapters as well as one new chapter.

  15. 3rd Symposium on Space Optical Instruments and Applications

    CERN Document Server

    Zhang, Guangjun

    2017-01-01

    This volume contains selected and expanded contributions presented at the 3rd Symposium on Space Optical Instruments and Applications in Beijing, China, June 28-29, 2016. This conference series is organised by the Sino-Holland Space Optical Instruments Laboratory, a cooperation platform between China and the Netherlands. The symposium focused on key technological problems of optical instruments and their applications in a space context. It covered the latest developments, experiments and results regarding theory, instrumentation and applications in space optics. The book is split across five topical sections. The first section covers space optical remote sensing system design, the second advanced optical system design, the third remote sensor calibration and measurement. Remote sensing data processing and information extraction is then presented, followed by a final section on remote sensing data applications.

  16. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    Science.gov (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…

  17. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  18. A Method for Modeling the Virtual Instrument Automatic Test System Based on the Petri Net

    Institute of Scientific and Technical Information of China (English)

    MA Min; CHEN Guang-ju

    2005-01-01

    Virtual instruments play an important role in automatic test systems. This paper introduces the composition of a virtual instrument automatic test system, taking as an example a VXIbus-based test software platform developed by the CAT lab of UESTC. A method to model this system with Petri nets is then proposed. Through this method, test task scheduling can be analyzed to prevent deadlock or resource conflicts. Finally, the paper analyzes the feasibility of the method.
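
    To make the modelling idea concrete, the sketch below implements the basic Petri-net firing rule: a transition is enabled when all of its input places hold enough tokens, and that is the property one checks when looking for resource conflicts or deadlock (no transition enabled). The places and transitions shown are invented for illustration, not those of the VXIbus platform.

        # Minimal place/transition net: a marking maps places to token counts.
        # pre/post map each transition to the tokens it consumes/produces.
        pre  = {'start_test': {'instr_free': 1, 'task_queued': 1},
                'finish':     {'instr_busy': 1}}
        post = {'start_test': {'instr_busy': 1},
                'finish':     {'instr_free': 1, 'result': 1}}

        def enabled(marking, t):
            # A transition is enabled if every input place has enough tokens.
            return all(marking.get(p, 0) >= n for p, n in pre[t].items())

        def fire(marking, t):
            # Consume input tokens, produce output tokens; returns new marking.
            m = dict(marking)
            for p, n in pre[t].items():
                m[p] -= n
            for p, n in post[t].items():
                m[p] = m.get(p, 0) + n
            return m

        m = {'instr_free': 1, 'task_queued': 2}
        while any(enabled(m, t) for t in pre):
            t = next(t for t in pre if enabled(m, t))
            m = fire(m, t)
            print(t, m)
        # If the loop exits with tasks still queued and nothing enabled, the
        # final marking exposes a deadlock or resource conflict in the schedule.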

  19. A spatial Mankiw-Romer-Weil model: Theory and evidence

    OpenAIRE

    Fischer, Manfred M.

    2009-01-01

    This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...

  20. Comparison of potential models through heavy quark effective theory

    International Nuclear Information System (INIS)

    Amundson, J.F.

    1995-01-01

    I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions

  1. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
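
    Schematically, and using notation invented here rather than taken from the paper, the refinement amounts to replacing the classical pipe-theory constraint with a conductivity-based one:

        \text{classical: } \frac{A_{\text{sap}}}{A_{\text{leaf}}} = \text{const}
        \quad\longrightarrow\quad
        \text{refined: } \frac{K_{\text{pipes}}}{A_{\text{leaf}}} = \text{const},
        \qquad K_{\text{pipes}} = \sum_i k(r_i)

    with k(r_i) the conductivity of an individual pipe of radius r_i (k ∝ r_i^4 for laminar Hagen-Poiseuille flow), so simulated conductivity can change with age, stand density and climate through either carbon allocation or pipe radius, as the abstract describes.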

  2. The spin-s quantum Heisenberg ferromagnetic models in the physical magnon theory

    International Nuclear Information System (INIS)

    Liu, B.-G.; Pu, F.-C.

    2001-01-01

    The spin-s quantum Heisenberg ferromagnetic model is investigated in the physical magnon theory. The effect of the extra unphysical magnon states on every site is completely removed in the magnon Hamiltonian and during the approximation procedure, so that the condition ⟨(a_i†)^n (a_i)^n⟩ = 0 (n ≥ 2s+1) is rigorously satisfied. The physical multi-magnon occupancy ⟨(a_i†)^n (a_i)^n⟩ (1 ≤ n ≤ 2s) is proportional to T^{3n/2} at low temperature and equals 1/(2s+1) at the Curie temperature. A magnetization that is not only unified but also well-behaved from zero temperature to the Curie temperature is obtained in the framework of the magnon theory for the spin-s quantum Heisenberg ferromagnetic model. The ill-behaved magnetizations at high temperature in earlier magnon theories are completely corrected. The relation of magnon (spin wave) theory to spin-operator decoupling theory is clearly understood.

  3. Quantum analysis of Jackiw and Teitelboim's model for (1+1)D gravity and topological gauge theory

    International Nuclear Information System (INIS)

    Terao, Haruhiko

    1993-01-01

    We study the BRST quantization of the (1+1)-dimensional gravity model proposed by Jackiw and Teitelboim and also the topological gauge model which is equivalent to the gravity model at least classically. The gravity model quantized in the light-cone gauge is found to be a free theory with a nilpotent BRST charge. We show also that there exist twisted N=2 superconformal algebras in the Jackiw-Teitelboim model as well as in the topological gauge model. We discuss the quantum equivalence between the gravity theory and the topological gauge theory. It is shown that these theories are indeed equivalent to each other in the light-cone gauge. (orig.)

  4. Eroding market stability by proliferation of financial instruments

    Science.gov (United States)

    Caccioli, F.; Marsili, M.; Vivo, P.

    2009-10-01

    We contrast Arbitrage Pricing Theory (APT), the theoretical basis for the development of financial instruments, with a dynamical picture of an interacting market, in a simple setting. The proliferation of financial instruments apparently provides more means for risk diversification, making the market more efficient and complete. In the simple market of interacting traders discussed here, however, the proliferation of financial instruments erodes systemic stability and drives the market to a critical state characterized by large susceptibility, strong fluctuations and enhanced correlations among risks. This suggests that the hypothesis of APT may not be compatible with a stable market dynamics. In this perspective, market stability acquires the properties of a common good, which suggests that appropriate measures should be introduced in derivative markets to preserve stability.

  5. Complexity in quantum field theory and physics beyond the standard model

    International Nuclear Information System (INIS)

    Goldfain, Ervin

    2006-01-01

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε^(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model.

  6. Complexity in quantum field theory and physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Goldfain, Ervin [OptiSolve Consulting, 4422 Cleveland Road, Syracuse, NY 13215 (United States)

    2006-05-15

    Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state; (ii) c-QFT represents a generalization of topological field theory; and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε^(∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model.

  7. Interacting bosons model and relation with BCS theory

    International Nuclear Information System (INIS)

    Diniz, R.

    1990-01-01

    The Nambu mechanism for BCS theory is extended with the inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of this model are also discussed. (author)

  8. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest

  9. The Formation of Instruments of Management of Industrial Enterprises According to the Theoretical and Functional Approaches

    Directory of Open Access Journals (Sweden)

    Raiko Diana V.

    2018-03-01

    Full Text Available The article aims to substantiate, on the basis of an analysis of theories of the firm, the basic theoretical provisions for forming instruments of industrial enterprise management. The article determines that in these theories the subject of research is the enterprise, the object is the process of managing its potential according to the forms of business organization and the technology of partnership relations, and the goal is high financial results, stabilization of activity, and social responsibility. The publication analyzes enterprise theories to determine the essence of the enterprise as a socio-economic system in the following directions: technical preparation of production, economic theory and law, systems theory, and marketing-management. As a result of the research, a general set of functions has been identified: the socio-economic functions of the enterprise, grouped as information-legal, production, marketing-management, and social responsibility. When building management instruments, the authors suggest taking into consideration the direct and inverse relationships of the enterprise at all levels of management (micro, meso and macro). On this basis, the authors have developed provisions on the formation of instruments of management of industrial enterprises according to two approaches: theoretical and functional.

  10. A thermostatted kinetic theory model for event-driven pedestrian dynamics

    Science.gov (United States)

    Bianca, Carlo; Mogno, Caterina

    2018-06-01

    This paper is devoted to modeling pedestrian dynamics by means of the thermostatted kinetic theory. Specifically, the microscopic interactions among pedestrians and an external force field are modeled in order to simulate the evacuation of pedestrians from a metro station. The fundamentals of stochastic game theory and the thermostatted kinetic theory are coupled to derive a specific mathematical model which depicts the time evolution of the distribution of pedestrians at the different exits of a metro station. Perturbation theory is employed to establish the stability analysis of the nonequilibrium stationary states in the case of a metro station consisting of two exits. A general sensitivity analysis on the initial conditions, the magnitude of the external force field and the number of exits is presented by means of numerical simulations, which, in particular, show how the asymptotic distribution and the convergence time are affected by the presence of an external force field. The results show how, in evacuation conditions, the interaction dynamics among pedestrians can be negligible with respect to the external force. The important role of the thermostat term in allowing the nonequilibrium stationary state to be reached is emphasized. Research perspectives are outlined at the end of the paper, in particular concerning the derivation of frameworks that take into account the definition of local external actions and the introduction of space and velocity dynamics.
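
    The flavour of such a thermostatted, game-theoretic evolution can be shown with a deliberately small sketch: the share of pedestrians heading to each of two exits evolves under a payoff made of intrinsic exit attractiveness (standing in for the stochastic-game interactions) plus an external force term, while subtracting the mean payoff acts as a thermostat-like normalisation that keeps the distribution bounded and drives it to a stationary state. Rates, payoffs and the update rule are all illustrative assumptions, not the authors' equations.

      import numpy as np

      def evolve(f0, attract, force, steps=20000, dt=1e-3):
          # Toy two-exit dynamics: df_i/dt = f_i * (payoff_i - mean payoff).
          # attract[i]: intrinsic attractiveness of exit i (game payoff analogue);
          # force[i]:   external force field pushing pedestrians toward exit i.
          f = np.array(f0, dtype=float)
          for _ in range(steps):
              fitness = attract + force          # combined payoff per exit
              mean_fit = f @ fitness             # thermostat-like average
              # Subtracting mean_fit keeps sum(f) = 1 (bounded distribution).
              f += dt * f * (fitness - mean_fit)
          return f

      # Without external force the split follows intrinsic attractiveness;
      # a strong evacuation force makes the interactions nearly negligible.
      print(evolve([0.5, 0.5], attract=np.array([1.0, 1.2]), force=np.zeros(2)))
      print(evolve([0.5, 0.5], attract=np.array([1.0, 1.2]), force=np.array([5.0, 0.0])))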

  11. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...

  12. Theory and experiments in model-based space system anomaly management

    Science.gov (United States)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first-principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, built specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite, launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  13. Theory-based interventions in physical activity: a systematic review of literature in Iran.

    Science.gov (United States)

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2014-11-30

    Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Using models and theories of physical activity, we decided to systematically study the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to educational methods, almost all studies used a combination of methods, the most widely used being group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of the educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, we suggest needs assessment when using models, consultation on epidemiology and methodology, addressing maintenance of physical activity, use of other theories and models such as social marketing and social-cognitive theory, and use of other educational methods such as empirical and complementary approaches.

  14. Dual-process models of health-related behaviour and cognition: a review of theory.

    Science.gov (United States)

    Houlihan, S

    2018-03-01

    The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. Review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory), and in this way theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories. Namely, the reviewer considered and reconsidered individual theories and theoretical components in generating the narrative synthesis' main findings. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped on the basis of theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part on their treatment of an individual as an irrational actor, as social actor, as actor in a physical environment or as a self-regulated actor. Synthesising identified theories into a general dual-process model of health-related behaviour indicated that such behaviour is the result of both propositional and unconscious reasoning driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli) as well as

  15. [Styles of conflict management among nurses. Instrument validation].

    Science.gov (United States)

    Francisco, M T; Clos, A C; dos Santos, I; Larrubia, E de O

    1997-01-01

    The present study examines the modalities of conflict management adopted by nurses in professional praxis. According to Management Grid Theory (BLAKE & MOUTON, 1978), a conflict can be resolved at different levels of quality, or even left unresolved, depending on the manager's behavioural model. The study aims to identify the managing styles nurses adopt in conflict management and to analyze their interactions. A questionnaire composed of 25 items, mostly popular adages expressing the five basic models of the Management Grid, was tested. The research was conducted at the Rio de Janeiro State University Pedro Ernesto University Hospital from June 1996 to August 1997, using a descriptive method and a checklist-type functional analysis technique. Factor analysis of the items showed eight interdependent factors corresponding to the following styles adopted by nurses: confrontation, negotiation, facing, conciliation, manipulation, acceptance, submission and withdrawal. The authors recommend revalidation of the data collection instrument.

  16. Instrument evaluation no. 11. ESI nuclear model 271 C contamination monitor

    CERN Document Server

    Burgess, P H

    1978-01-01

    The various radiations encountered in radiological protection cover a wide range of energies, and radiation measurements have to be carried out under an equally broad spectrum of environmental conditions. This report is one of a series intended to give information on the performance characteristics of radiological protection instruments, to assist in the selection of appropriate instruments for a given purpose, to help interpret the results obtained with such instruments, and, in particular, to indicate the likely sources and magnitude of errors that might be associated with measurements in the field. The radiation, electrical and environmental characteristics of radiation protection instruments are considered, together with those aspects of the construction which make an instrument convenient for routine use. To provide consistent criteria for instrument performance, the range of tests performed on any particular class of instrument, the test methods and the criteria of acceptable performance are based broadly on the a...

  17. Two-dimensional sigma models: modelling non-perturbative effects of gauge theories

    International Nuclear Information System (INIS)

    Novikov, V.A.; Shifman, M.A.; Vainshtein, A.I.; Zakharov, V.I.

    1984-01-01

    The review is devoted to a discussion of non-perturbative effects in gauge theories and two-dimensional sigma models. The main emphasis is put on the supersymmetric O(3) sigma model. The instanton-based method for calculating the exact Gell-Mann-Low function and the bifermionic condensate is considered in detail. All aspects of the method under simplifying conditions are discussed. The basic points are: the instanton measure from purely classical analysis; a non-renormalization theorem in self-dual external fields; and the existence of vacuum condensates and their compatibility with supersymmetry.

  18. The Scanning Theremin Microscope: A Model Scanning Probe Instrument for Hands-On Activities

    Science.gov (United States)

    Quardokus, Rebecca C.; Wasio, Natalie A.; Kandel, S. Alex

    2014-01-01

    A model scanning probe microscope, designed using similar principles of operation to research instruments, is described. Proximity sensing is done using a capacitance probe, and a mechanical linkage is used to scan this probe across surfaces. The signal is transduced as an audio tone using a heterodyne detection circuit analogous to that used in…

  19. The biopsychosocial model and its potential for a new theory of homeopathy.

    Science.gov (United States)

    Schmidt, Josef M

    2012-04-01

    Since the nineteenth century the theory of conventional medicine has been developed in close alignment with the mechanistic paradigm of the natural sciences. Only in the twentieth century were occasional attempts made to (re)introduce the 'subject' into medical theory, such as by Thure von Uexküll (1908-2004), who elaborated the so-called biopsychosocial model of the human being, trying to understand the patient as a unity of the organic, mental, and social dimensions of life. Although widely neglected by conventional medicine, it is one of the most coherent, significant, and up-to-date models of medicine at present. Torn between strict adherence to Hahnemann's original conceptualization and the alienation caused by contemporary scientific criticism, homeopathy today still lacks a generally accepted, consistent, and definitive theory which would explain in scientific terms its strengths, peculiarities, and principles without relapsing into biomedical reductionism. The biopsychosocial model of the human being holds great potential for a new theory of homeopathy, as may be demonstrated with some typical examples. Copyright © 2012. Published by Elsevier Ltd.

  20. Fluid analog model for boundary effects in field theory

    International Nuclear Information System (INIS)

    Ford, L. H.; Svaiter, N. F.

    2009-01-01

    Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.

  1. Non-Higgsable clusters for 4D F-theory models

    International Nuclear Information System (INIS)

    Morrison, David R.; Taylor, Washington

    2015-01-01

    We analyze non-Higgsable clusters of gauge groups and matter that can arise at the level of geometry in 4D F-theory models. Non-Higgsable clusters seem to be generic features of F-theory compactifications, and give rise naturally to structures that include the nonabelian part of the standard model gauge group and certain specific types of potential dark matter candidates. In particular, there are nine distinct single nonabelian gauge group factors, and only five distinct products of two nonabelian gauge group factors with matter, including SU(3)×SU(2), that can be realized through 4D non-Higgsable clusters. There are also more complicated configurations involving more than two gauge factors; in particular, the collection of gauge group factors with jointly charged matter can exhibit branchings, loops, and long linear chains.

  2. Thermal Testing and Model Correlation for Advanced Topographic Laser Altimeter Instrument (ATLAS)

    Science.gov (United States)

    Patel, Deepak

    2016-01-01

    The Advanced Topographic Laser Altimeter System (ATLAS), part of the Ice, Cloud and Land Elevation Satellite 2 (ICESat-2), is an upcoming Earth Science mission focusing on the effects of climate change. The flight instrument passed all environmental testing at GSFC (Goddard Space Flight Center) and is now ready to be shipped to the spacecraft vendor for integration and testing. This topic covers the analysis leading up to the test setup for ATLAS thermal testing, as well as model correlation to flight predictions. The test setup analysis section covers areas where ATLAS could not meet flight-like conditions and the associated limitations. The model correlation section walks through the changes that had to be made to the thermal model in order to match the test results. The correlated model will then be integrated with the spacecraft model for on-orbit predictions.

  3. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    Abstract. In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...

  4. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the ...

  5. S matrix theory of the massive Thirring model

    International Nuclear Information System (INIS)

    Berg, B.

    1980-01-01

    The S matrix theory of the massive Thirring model, describing the exact quantum scattering of solitons and their bound states, is reviewed. Treated are: factorization equations and their solution, bound states, generalized Jost functions and Levinson's theorem, scattering of bound states, and 'virtual' and anomalous thresholds. (orig.)

  6. Introduction of the transtheoretical model and organisational development theory in weight management: A narrative review.

    Science.gov (United States)

    Wu, Ya-Ke; Chu, Nain-Feng

    2015-01-01

    Overweight and obesity are serious public health and medical problems among children and adults worldwide. Behavioural change demonstrably contributes to weight management programs, and behavioural change-based weight loss programs require a theoretical framework. We review the transtheoretical model and organisational development theory in weight management. The transtheoretical model is an individual-level behaviour theory frequently used for weight management programs. The organisational development theory is a more complicated behaviour theory that applies to behavioural change at the system level. Both theories have their respective strengths and weaknesses. In this manuscript, we introduce the transtheoretical model and the organisational development theory in the context of weight loss programs among populations that are overweight or obese. Ultimately, we wish to present a new framework/strategy of weight management by integrating these two theories. Copyright © 2015 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  7. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
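
    As a concrete instance of the framework the tutorial describes, the sketch below evaluates category probabilities under a graded response model, a standard IRT model for the ordered item scores that make up composite assessments; the discrimination and threshold values are invented for illustration.

      import numpy as np

      def grm_category_probs(theta, a, b):
          # Graded response model: P(X >= k) = logistic(a * (theta - b[k-1])).
          # theta: latent severity/ability; a: discrimination;
          # b: ordered thresholds, one per boundary between categories.
          cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))  # P(X>=1..K-1)
          upper = np.concatenate(([1.0], cum))    # P(X >= 0) = 1
          lower = np.concatenate((cum, [0.0]))    # P(X >= K) = 0
          return upper - lower                    # P(X = k) for k = 0..K-1

      # A 4-category item with invented parameters, scored for two subjects:
      for theta in (-1.0, 1.0):
          print(theta, grm_category_probs(theta, a=1.5, b=[-0.5, 0.4, 1.3]).round(3))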

  8. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  9. Instruments to assess integrated care

    DEFF Research Database (Denmark)

    Lyngsø, Anne Marie; Godtfredsen, Nina Skavlan; Høst, Dorte

    2014-01-01

    INTRODUCTION: Although several measurement instruments have been developed to measure the level of integrated health care delivery, no standardised, validated instrument exists covering all aspects of integrated care. The purpose of this review is to identify the instruments concerning how to measure the level of integration across health-care sectors and to assess and evaluate the organisational elements within the instruments identified. METHODS: An extensive, systematic literature review in PubMed, CINAHL, PsycINFO, Cochrane Library, and Web of Science for the years 1980-2011. Selected ... It is uncertain whether development of a single 'all-inclusive' model for assessing integrated care is desirable. We emphasise the continuing need for validated instruments embedded in theoretical contexts.

  10. Systematic review of measurement properties of self-reported instruments for evaluating self-care in adults.

    Science.gov (United States)

    Matarese, Maria; Lommi, Marzia; De Marinis, Maria Grazia

    2017-06-01

    The aims of this study were as follows: to identify instruments developed to assess self-care in healthy adults; to determine the theory on which they were based; their validity and reliability properties and to synthesize the evidence on their measurement properties. Many instruments have been developed to assess self-care in many different populations and conditions. Clinicians and researchers should select the most appropriate self-care instrument based on the knowledge of their measurement properties. Systematic review of measurement instruments according to the protocol recommended by the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) panel. PubMed, Embase, PsycINFO, Scopus and CINAHL databases were searched from inception to December 2015. Studies testing measurement properties of self-report instruments assessing self-care in healthy adults, published in the English language and in peer review journals were selected. Two reviewers independently appraised the methodological quality of the studies with the COSMIN checklist and the quality of results using specific quality criteria. Twenty-six articles were included in the review testing the measurement properties of nine instruments. Seven instruments were based on Orem's Self-care theory. Not all the measurement properties were evaluated for the identified instruments. No self-care instrument showed strong evidence supporting the evaluated measurement properties. Despite the development of several instruments to assess self-care in the adult population, no instrument can be fully recommended to clinical nurses and researchers. Further studies of high methodological quality are needed to confirm the measurement properties of these instruments. © 2016 John Wiley & Sons Ltd.

  11. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    Science.gov (United States)

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  12. A General Framework for Portfolio Theory. Part I: theory and various models

    OpenAIRE

    Maier-Paape, Stanislaus; Zhu, Qiji Jim

    2017-01-01

    Utility and risk are two often competing measures of investment success. We show that the efficient trade-off between these two measures for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model [W. F. Sharpe, Mutual fund performance, 1966], are spe...

  13. Computer Support of Groups: Theory-Based Models for GDSS Research

    OpenAIRE

    V. Srinivasan Rao; Sirkka L. Jarvenpaa

    1991-01-01

    Empirical research in the area of computer support of groups is characterized by inconsistent results across studies. This paper attempts to reconcile the inconsistencies by linking the ad hoc reasoning in the studies to existing theories of communication, minority influence and human information processing. Contingency models are then presented based on the theories discussed. The paper concludes by discussing the linkages between the current work and other recently published integrations of...

  14. Validation of Sustainable Development Practices Scale Using the Bayesian Approach to Item Response Theory

    Directory of Open Access Journals (Sweden)

    Martin Hernani Merino

    2014-12-01

    Full Text Available There has been growing recognition of the importance of creating performance measurement tools for the economic, social and environmental management of micro and small enterprises (MSEs). In this context, this study aims to validate an instrument to assess perceptions of sustainable development practices in MSEs by means of a Graded Response Model (GRM) with a Bayesian approach to Item Response Theory (IRT). The results, based on a sample of 506 university students in Peru, suggest that a valid measurement instrument was achieved. At the end of the paper, methodological and managerial contributions are presented.

  15. Combining climate and energy policies: synergies or antagonism? Modeling interactions with energy efficiency instruments

    International Nuclear Information System (INIS)

    Lecuyer, Oskar; Bibas, Ruben

    2012-01-01

    In addition to the already present Climate and Energy package, the European Union (EU) plans to include a binding target to reduce energy consumption. We analyze the rationales the EU invokes to justify such an overlap and develop a minimal common framework to study interactions arising from the combination of instruments that reduce emissions, promote renewable energy (RE) production and reduce energy demand through energy efficiency (EE) investments. We find that although all instruments tend to reduce GHG emissions, and although a price on carbon also tends to give the right incentives for RE and EE, the combination of more than one instrument leads to significant antagonisms regarding major objectives of the policy package. The model shows, within a single framework, and quantifies the antagonistic effects of the joint promotion of RE and EE. We also show and quantify the effects of this joint promotion on the ETS permit price, the wholesale market price and energy production levels. (authors)

  16. Instrumental and ethical aspects of experimental research with animal models

    Directory of Open Access Journals (Sweden)

    Mirian Watanabe

    2014-02-01

    Full Text Available Experimental animal models offer insights into physiology, the pathogenesis of disease and the action of drugs that are directly related to quality nursing care. This integrative review describes the current state of the instrumental and ethical aspects of experimental research with animal models, including the main recommendations of ethics committees focused on animal welfare, and raises questions about the impact of their findings on nursing care. Data show that, in Brazil, progress in the ethics of using animals for scientific purposes was consolidated with Law No. 11.794/2008, which establishes ethical procedures covering health, genetic and experimental parameters. The ethical handling of animals for scientific and educational purposes, and the consistent, high-quality data thus obtained, bring unquestionable contributions to nursing, offering a basis for relating pathophysiological mechanisms to the clinical condition of the patient.

  17. Category Theory as a Formal Mathematical Foundation for Model-Based Systems Engineering

    KAUST Repository

    Mabrok, Mohamed; Ryan, Michael J.

    2017-01-01

    In this paper, we introduce Category Theory as a formal foundation for model-based systems engineering. A generalised view of the system based on category theory is presented, where any system can be considered as a category. The objects

  18. Cost prediction model for various payloads and instruments for the Space Shuttle Orbiter

    Science.gov (United States)

    Hoffman, F. E.

    1984-01-01

    This study had two objectives: (1) to develop a cost prediction model for various payload classes of instruments and experiments for the Space Shuttle Orbiter; and (2) to show the implications of various payload classes for the costs of reliability analysis, quality assurance, environmental design requirements, documentation, parts selection, and other reliability-enhancing activities.

  19. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.
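
    Because the emitted light is proportional to the SO2 concentration in the cell, converting the raw detector signal to a mixing ratio is a linear calibration step, sketched below. The slope and zero offset are hypothetical stand-ins, not values from the Model 43i-TLE or the BNL LabView code.

      def so2_ppb(signal_counts, slope_counts_per_ppb=41.7, zero_counts=120.0):
          # Invert the linear fluorescence response: signal = zero + slope * [SO2].
          # Calibration constants here are invented; real values come from the
          # analyzer's span and zero calibrations.
          return (signal_counts - zero_counts) / slope_counts_per_ppb

      print(so2_ppb(1370.0))  # -> about 30 ppb with the assumed calibration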

  20. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    International Nuclear Information System (INIS)

    Wells, James

    2015-01-01

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop an even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics, in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at the LHC, and discern viable theories of physics beyond the Standard Model that unify physics at a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective Lagrangian, new physics theories will be developed that explain the anomaly and put it into a more

  1. A new k-epsilon model consistent with Monin-Obukhov similarity theory

    DEFF Research Database (Denmark)

    van der Laan, Paul; Kelly, Mark C.; Sørensen, Niels N.

    2017-01-01

    A new k-ε model is introduced that is consistent with Monin–Obukhov similarity theory (MOST). The proposed k-ε model is compared with another k-ε model that was developed in an attempt to maintain inlet profiles compatible with MOST. It is shown that the previous k-ε model is not consistent with ...
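
    For context, MOST consistency means the turbulence model must sustain the Monin–Obukhov surface-layer profiles. The sketch below evaluates the standard MOST mean wind profile with Businger-Dyer stability corrections, the kind of inlet profile such k-ε models are checked against; the friction velocity, roughness length and Obukhov length are example values.

      import math

      KAPPA = 0.4  # von Karman constant

      def psi_m(zeta):
          # Businger-Dyer stability correction for momentum.
          if zeta >= 0.0:                      # stable conditions
              return -5.0 * zeta
          x = (1.0 - 16.0 * zeta) ** 0.25      # unstable conditions
          return (2.0 * math.log((1.0 + x) / 2.0)
                  + math.log((1.0 + x * x) / 2.0)
                  - 2.0 * math.atan(x) + math.pi / 2.0)

      def u_most(z, u_star=0.5, z0=0.05, L=200.0):
          # MOST mean wind profile: u = (u*/kappa) * (ln(z/z0) - psi_m(z/L)).
          return (u_star / KAPPA) * (math.log(z / z0) - psi_m(z / L))

      for z in (10.0, 40.0, 80.0):
          print(z, round(u_most(z), 2))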

  2. Compositional models and conditional independence in evidence theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Vejnarová, Jiřina

    2011-01-01

    Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords : Evidence theory * Conditional independence * multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf

  3. Theory-based Bayesian models of inductive learning and reasoning.

    Science.gov (United States)

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.

  4. Practical considerations in developing an instrument-maintenance plan

    International Nuclear Information System (INIS)

    Guth, M.A.S.

    1989-01-01

    The author develops a general set of considerations to explain how a consistent, well-organized, prioritized, and adequately time-allocated program plan for routine maintenance can be constructed. The analysis is supplemented with experience from the high flux isotope reactor (HFIR) at the US Oak Ridge National Laboratory (ORNL). After the preventive maintenance (PM) problem was defined, the instruments on the schedule were selected based on the manufacturer's design specifications, quality-assurance requirements, prior classifications, experience with the incidence of breakdowns or calibration, and dependencies among instruments. The effects of repair error in PM should also be studied. The HFIR requires three full-time technicians to perform both PM and unscheduled maintenance. A review is presented of concepts from queuing theory used to determine anticipated breakdown patterns. In practice, the pneumatic instruments have a much longer lifetime than the electric/electronic instruments on the various reactors at ORNL. Some special considerations and risk aversion in choosing a maintenance schedule are also discussed.
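
    One concrete form of the queuing-theory reasoning alluded to: if instrument failures arrive as a Poisson stream and each repair takes an exponentially distributed time, the technician pool behaves as an M/M/c queue and the Erlang C formula gives the probability that a failed instrument must wait for a free technician. The failure and repair rates below are invented for illustration, not HFIR data.

      from math import factorial

      def erlang_c(c, offered_load):
          # P(wait) in an M/M/c queue; offered_load = lambda/mu (erlangs), < c.
          rho = offered_load / c
          tail = (offered_load ** c) / (factorial(c) * (1.0 - rho))
          base = sum(offered_load ** k / factorial(k) for k in range(c))
          return tail / (base + tail)

      # Hypothetical numbers: 0.9 breakdowns/day, 2.5-day mean repair -> 2.25 erlangs.
      load = 0.9 * 2.5
      print(erlang_c(3, load))  # chance a breakdown finds all 3 technicians busy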

  5. Nurses' intention to leave: critically analyse the theory of reasoned action and organizational commitment model.

    Science.gov (United States)

    Liou, Shwu-Ru

    2009-01-01

    To systematically analyse the Organizational Commitment model and the Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and the Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to the nursing shortage. However, the appropriateness of applying these two models in nursing had not been analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly, whereas they are operationally defined in the Organizational Commitment model. The predictability of the Theory of Reasoned Action is questionable, whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can serve as concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.

  6. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
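
    A minimal sketch of the sampling-based strategy described, using an invented two-input model: each epistemic input is specified by interval-valued focal elements with basic probability assignments, every focal element is propagated through the model by sampling its box, and belief and plausibility bounds for an output condition are accumulated from the masses of the focal elements that surely, respectively possibly, satisfy it.

      import random

      def model(x, y):
          return x * x + y          # stand-in for an expensive simulation

      # Focal elements: (interval for x, interval for y, mass); masses sum to 1.
      focal = [((0.0, 1.0), (0.0, 0.5), 0.5),
               ((0.5, 2.0), (0.0, 1.0), 0.3),
               ((1.0, 3.0), (0.5, 1.0), 0.2)]

      def bel_pl(threshold, n=2000, seed=1):
          # Belief and plausibility of the event {model output <= threshold}.
          rng, bel, pl = random.Random(seed), 0.0, 0.0
          for (xl, xu), (yl, yu), m in focal:
              outs = [model(rng.uniform(xl, xu), rng.uniform(yl, yu))
                      for _ in range(n)]              # sample the focal box
              if max(outs) <= threshold: bel += m     # surely inside -> belief
              if min(outs) <= threshold: pl += m      # possibly inside -> plausibility
          return bel, pl

      print(bel_pl(2.0))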

  7. Categories of relations as models of quantum theory

    Directory of Open Access Journals (Sweden)

    Chris Heunen

    2015-11-01

    Full Text Available Categories of relations over a regular category form a family of models of quantum theory. Using regular logic, many properties of relations over sets lift to these models, including the correspondence between Frobenius structures and internal groupoids. Over compact Hausdorff spaces, this lifting gives continuous symmetric encryption. Over a regular Mal'cev category, this correspondence gives a characterization of categories of completely positive maps, enabling the formulation of quantum features. These models are closer to Hilbert spaces than relations over sets in several respects: Heisenberg uncertainty, impossibility of broadcasting, and behavedness of rank one morphisms.

  8. Modeling of MEMS piezoelectric energy harvesters using electromagnetic and power system theories

    International Nuclear Information System (INIS)

    Ahmad, Mahmoud Al; Alshareef, H N; Elshurafa, Amro M; Salama, Khaled N

    2011-01-01

    This work proposes a novel methodology for estimating the power output of piezoelectric generators. An analytical model that estimates for the first time the loss ratio and output power of piezoelectric generators based on the direct mechanical-to-electrical analogy, electromagnetic theory, and power system theory is developed. The mechanical-to-electrical analogy and power system theory allow the derivation of an equivalent input impedance expression for the network, whereas electromagnetic transmission line theory allows deduction of the equivalent electromechanical loss of the piezoelectric generator. By knowing the mechanical input power and the loss of the network, calculation of the output power of the piezoelectric device becomes a straightforward procedure. Experimental results based on published data are also presented to validate the analytical solution. In order to fully benefit from the well-established electromagnetic transmission line and electric circuit theories, further analyses on the resonant frequency, bandwidth, and sensitivity are presented. Compared to the conventional modeling methods currently being adopted in the literature, the proposed method provides significant additional information that is crucial for enhanced device operation and quick performance optimization
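
    The power-system side of such a methodology can be illustrated with a generic equivalent-circuit calculation (a sketch, not the authors' derivation): once the harvester is reduced to an open-circuit source voltage behind an equivalent internal impedance, the average power delivered to a resistive load follows directly, and sweeping the load exposes the matched-load optimum near |Zs|. The component values are illustrative.

      # Generic source-behind-impedance power transfer; values are illustrative.
      Voc = 2.0                      # open-circuit voltage amplitude (V)
      Zs = complex(1000.0, -3000.0)  # equivalent internal impedance (ohm), capacitive

      def avg_power(R_load):
          # Average power into a resistive load from a sinusoidal source.
          I = Voc / (Zs + R_load)          # phasor current amplitude
          return 0.5 * abs(I) ** 2 * R_load

      best = max(((R, avg_power(R)) for R in range(100, 20001, 100)),
                 key=lambda p: p[1])
      print(best)  # peak sits near |Zs| = 3162 ohm for a purely resistive load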

  9. A simple solvable model of quantum field theory of open strings

    International Nuclear Information System (INIS)

    Kazakov, V.A.; AN SSSR, Moscow

    1990-01-01

    A model of quantum field theory of open strings without any embedding (D=0) is solved. The world sheets of interacting strings are represented by dynamical planar graphs with dynamical holes of arbitrary sizes. The phenomenon of spontaneous tearing of the world sheet is noticed, which gives a singularity at zero coupling constant of string interaction. This phenomenon can be considered as a nonperturbative effect, similar to renormalons in planar field theories and is closely related to the α' → 0 limit of string field theories. (orig.)

  10. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding the underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of the theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.
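
    As a concrete example of one listed component, a radial diffusion model evolves the radiation belt electron phase space density f(L, t) according to ∂f/∂t = L² ∂/∂L (D_LL L⁻² ∂f/∂L). The explicit finite-difference sketch below is a generic illustration with an assumed power-law D_LL and fixed boundaries; the coefficients are not those of the ERG project models.

      import numpy as np

      nL, dL, dt = 61, 0.1, 1.0e-4          # grid over L = 1..7, time step (days)
      L = 1.0 + dL * np.arange(nL)
      D = 1.0e-8 * L ** 10                  # assumed D_LL (1/day); power law is illustrative
      f = np.exp(-((L - 4.0) / 0.5) ** 2)   # initial phase space density bump

      def step(f):
          # One explicit step of df/dt = L^2 d/dL (D L^-2 df/dL), fixed boundaries.
          Dh = 0.5 * (D[1:] + D[:-1]) / (0.5 * (L[1:] + L[:-1])) ** 2  # D/L^2 at half points
          flux = Dh * (f[1:] - f[:-1]) / dL
          fn = f.copy()
          fn[1:-1] += dt * L[1:-1] ** 2 * (flux[1:] - flux[:-1]) / dL
          return fn

      for _ in range(5000):                 # diffuse for half a day
          f = step(f)
      print(f.max(), L[f.argmax()])         # the bump spreads and shifts in L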

  11. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
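
    The kind of nonlinear feedback behaviour described, regular convergence for some parameter values and sustained fluctuation for others, is already visible in the simplest chaotic system, the logistic map, sketched below with standard textbook parameter values (not values taken from the dissertation).

      def logistic_trajectory(r, x0=0.3, n=60):
          # Iterate x_{n+1} = r * x_n * (1 - x_n), a minimal nonlinear feedback loop.
          xs = [x0]
          for _ in range(n):
              xs.append(r * xs[-1] * (1.0 - xs[-1]))
          return xs

      print(logistic_trajectory(2.8)[-3:])  # converges: values settle near 1 - 1/r
      print(logistic_trajectory(3.9)[-3:])  # chaotic: values keep fluctuating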

  12. Pragmatic nihilism: how a Theory of Nothing can help health psychology progress.

    Science.gov (United States)

    Peters, Gjalt-Jorn Ygram; Crutzen, Rik

    2017-06-01

    Health psychology has developed a plethora of theories to explain and change a wide variety of behaviours. Several attempts have been made to build integrative theories, some even striving for a Theory of Everything. We argue against these efforts, suggesting instead that a 'pragmatic nihilism' perspective may be more fruitful for understanding and changing health behaviours. The first tenet of pragmatic nihilism is that psychological variables are usefully considered as metaphors rather than as referring to entities that exist in the mind. As a consequence, the second tenet emphasizes theories' definitions and guidelines for the operationalisation of those variables. The third tenet of pragmatic nihilism is that each operationalisation represents an intersection of a variety of dimensions, such as behavioural specificity and duration, and most importantly, psychological aggregation level. Any operationalisation thus represents a number of choices regarding these dimensions. Pragmatic nihilism has two implications. First, it provides a foundation that enables integrating theories in a more flexible and accurate manner than is made possible by integrative theories. Second, it emphasizes the importance of operationalisations, underlining the importance of investing in the careful development of measurement instruments, thorough reporting of measurement instruments' specifics and performance, and full disclosure of the instruments themselves.

  13. Game theory at work: OR models and algorithms to solve multi-actor heterogeneous decision problems

    NARCIS (Netherlands)

    Sáiz Pérez, M.E.

    2007-01-01

    Key words: Game theory, operations research, optimisation methods, algorithms. The objective of this thesis is to explore the potential of combining Game Theory (GT) models with Operations Research (OR) modelling. This includes development of algorithms to solve these complex OR models for

  14. Towards an understanding of the large-U Hubbard model and a theory for high-temperature superconductors

    International Nuclear Information System (INIS)

    Hsu, T.C.T.

    1989-01-01

    This thesis describes work on a large-U Hubbard model theory for high-temperature superconductors. After an introduction to recent developments in the field, the author reviews experimental results. At the same time he introduces the holon-spinon model and comments on its successes and shortcomings. Using this heuristic model he then describes a holon pairing theory of superconductivity and lists some experimental evidence for this interlayer coupling theory. The latter part of the thesis is devoted to projected fermion mean field theories. They are introduced by applying this theory and some recently developed computational techniques to anisotropic antiferromagnets. This scheme is shown to give quantitatively good results for the two-dimensional square-lattice Heisenberg AFM. The results have definite implications for a spinon theory of quantum antiferromagnets. Finally, he studies flux phases and other variational prescriptions for obtaining low-lying states of the Hubbard model.
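    For reference, the Hubbard model underlying the thesis is conventionally written as follows (the standard textbook form, not quoted from the thesis itself):

    ```latex
    H \;=\; -t \sum_{\langle i,j \rangle, \sigma}
            \left( c_{i\sigma}^{\dagger} c_{j\sigma} + \mathrm{h.c.} \right)
          \;+\; U \sum_{i} n_{i\uparrow} n_{i\downarrow}
    ```

    In the large-U limit at half filling, second-order perturbation theory in t/U reduces this Hamiltonian to the antiferromagnetic Heisenberg model with exchange coupling J = 4t^2/U, which is why results on the square-lattice Heisenberg AFM bear directly on the Hubbard problem.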

  15. A model of the demand for Islamic banks debt-based financing instrument

    Science.gov (United States)

    Jusoh, Mansor; Khalid, Norlin

    2013-04-01

    This paper presents a theoretical analysis of the demand for debt-based financing instruments of the Islamic banks. Debt-based financing, such as through bai' bithaman ajil and al-murabahah, is by far the most prominent form of Islamic bank financing, and yet it has been largely ignored in the Islamic economics literature; most studies have instead focused on the equity-based financing of al-mudharabah and al-musyarakah. Islamic banks offer debt-based financing through various instruments derived under the principle of exchange ('uqud al-mu'awadhat) or, more specifically, the contract of deferred sale. Under such an arrangement, Islamic debt is created when goods are purchased and the payments are deferred. Thus, unlike the debt of a conventional bank, which is a form of financial loan contract to facilitate demand for liquid assets, this Islamic debt is created in response to the demand to purchase goods by deferred payment. In this paper we set up an analytical framework based on an infinitely lived representative agent model (ILRA model) to analyze the demand for goods purchased by deferred payment. The resulting demand is then used to derive the demand for Islamic debt. We also investigate, theoretically, factors that may have an impact on the demand for Islamic debt.
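    A generic version of the infinitely lived representative agent setup the paper describes looks as follows; the notation is an illustrative sketch of the standard ILRA form, not the authors' exact specification:

    ```latex
    \max_{\{c_t,\, b_t\}_{t=0}^{\infty}} \; \sum_{t=0}^{\infty} \beta^{t}\, u(c_t)
    \qquad \text{s.t.} \qquad
    c_t + (1+\mu)\, b_{t-1} \;\le\; y_t + b_t
    ```

    Here $c_t$ is consumption, $y_t$ income, $\beta \in (0,1)$ the discount factor, $b_t$ new purchases of goods by deferred payment (the Islamic debt incurred in period $t$), and $\mu$ the markup charged under the deferred-sale contract. The first-order conditions of such a problem tie the demand for Islamic debt to the intertemporal demand for goods rather than to a demand for liquid assets, which is the distinction the abstract draws.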

  16. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify the underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice, with the aim of informing and improving the capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined. Included papers focus on capacity building, learning plans, or professional development plans in combination with a tool, resource, process, procedure, step, model, framework or guideline; are described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare; and explicitly or implicitly mention a theory, model and/or framework that grounds the capacity building approach developed. Quality assessments were performed on all included articles. Data analysis included synthesizing, analyzing and presenting descriptive summaries, and categorizing theoretical foundations according to which theory, model and/or framework was used and whether it was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks that support capacity building interventions relevant to public health organizations. It provides public health practitioners…

  17. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  18. A Proposed Model of Jazz Theory Knowledge Acquisition

    Science.gov (United States)

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model proposing causal effects of motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables yielded coefficients ranging from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  19. Perception and Modeling of Affective Qualities of Musical Instrument Sounds across Pitch Registers.

    Science.gov (United States)

    McAdams, Stephen; Douglas, Chelsea; Vempala, Naresh N

    2017-01-01

    Composers often pick specific instruments to convey a given emotional tone in their music, partly due to their expressive possibilities, but also due to their timbres in specific registers and at given dynamic markings. Of interest to both music psychology and, from a computational point of view, music informatics is the relation between the acoustic properties that give rise to the timbre at a given pitch and the perceived emotional quality of the tone. Musician and nonmusician listeners were presented with 137 tones at pitch class D#, produced at a fixed dynamic marking (forte), spanning each instrument's entire pitch range and various playing techniques, for standard orchestral instruments drawn from the brass, woodwind, string, and pitched percussion families. They rated each tone on six analogical-categorical scales in terms of emotional valence (positive/negative and pleasant/unpleasant), energy arousal (awake/tired), tension arousal (excited/calm), preference (like/dislike), and familiarity. Linear mixed models revealed interactive effects of musical training, instrument family, and pitch register, with nonlinear relations between pitch register and several dependent variables. Twenty-three audio descriptors from the Timbre Toolbox were computed for each sound and analyzed in two ways: linear partial least squares regression (PLSR) and nonlinear artificial neural network modeling. The two analyses converged on the importance of various spectral, temporal, and spectrotemporal audio descriptors in explaining the emotion ratings, although some differences also emerged. Different combinations of audio descriptors make major contributions to the three emotion dimensions, suggesting that they are carried by distinct acoustic properties. Valence is more positive with lower spectral slopes, a greater emergence of strong partials, and an amplitude envelope with a sharper attack and earlier decay. Higher tension arousal is carried by brighter sounds…
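    The PLSR step described above can be sketched with scikit-learn; the data below are simulated placeholders standing in for the study's 137 tones and 23 Timbre Toolbox descriptors, and the number of components is an assumption.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(137, 23))  # audio descriptors (placeholder values)
    # Simulated valence ratings driven by a few descriptors plus noise.
    valence = -0.6 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(scale=0.5, size=137)

    X_std = StandardScaler().fit_transform(X)
    pls = PLSRegression(n_components=2)  # component count is an assumption
    pls.fit(X_std, valence)

    print("R^2:", pls.score(X_std, valence))
    # Descriptor loadings on the first component indicate which acoustic
    # properties carry the emotion dimension in this model.
    print(pls.x_loadings_[:, 0])
    ```

    PLSR is a natural choice here because the descriptors are many and highly collinear; it projects them onto a few latent components before regressing the ratings.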

  20. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    Science.gov (United States)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposes a method, using prospect theory from behavioral finance, to set a psychological bias for profit and loss and to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique performed well, demonstrating the effectiveness of the trading model under the optimized dealing strategy.
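    The psychological bias for profit and loss is conventionally encoded with the Kahneman-Tversky value function; the standard form is shown below (the paper's exact parametrisation is not given in the abstract):

    ```latex
    v(x) \;=\;
    \begin{cases}
    x^{\alpha}, & x \ge 0, \\
    -\lambda\,(-x)^{\beta}, & x < 0,
    \end{cases}
    ```

    Tversky and Kahneman (1992) estimated $\alpha \approx \beta \approx 0.88$ and $\lambda \approx 2.25$; the loss-aversion coefficient $\lambda > 1$ makes losses weigh more heavily than equal-sized gains, which is the asymmetry such a trading model would exploit when choosing strike prices.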